How can I transform large groups of similar, poorly built HTML pages into quality CSS-based pages?
What is the best way to programmatically transform large batches of very similar web pages into a newer CSS-based layout?
I am converting all the content of an old website to a new CSS-based layout. Many of the pages are very similar, and I want to automate the process.
What I am currently thinking of doing is to read the pages in using HtmlAgilityPack, and write a method for each group of similar pages that generates the output text.
What do you think is the best way to do this? The pages mostly differ in details such as which .jpg file is used for an image, or how many heading-image-text groups appear on a particular page.
EDIT: I cannot use any file type other than .html, as that is all I am authorized to use. Any suggestions?
EDIT2: Ideally, I would also like to make this generic enough that I could reuse it for many different groups of HTML files just by swapping out a few moving parts.
The above link is a sample of what I am dealing with. The parts that would differ between pages would be:
- the meta description tag
- various headers, especially the main header
- almost every image on the page will be new
- the text for each video will be unique, but they will be grouped together in similar chunks
- the video files and video sizes will be unique
Everything else is the same, and the format of the pages is also the same.
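The plan above mentions HtmlAgilityPack, which is a .NET library, but the same "pull out the variable parts" idea can be sketched in Python with only the standard library's `html.parser`. This is a rough illustration, not the actual implementation: the sample markup, tag choices, and class name are assumptions, not taken from the real site.

```python
from html.parser import HTMLParser

class VariablePartExtractor(HTMLParser):
    """Collects the parts that differ between pages:
    the meta description, the headers, and the image sources."""

    def __init__(self):
        super().__init__()
        self.meta_description = None
        self.headers = []
        self.images = []
        self._in_header = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")
        elif tag == "img":
            self.images.append(attrs.get("src"))
        elif tag in ("h1", "h2", "h3"):
            self._in_header = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._in_header = False

    def handle_data(self, data):
        # Only text inside a header tag is treated as a header.
        if self._in_header and data.strip():
            self.headers.append(data.strip())

# Hypothetical sample page standing in for one of the old pages.
sample = """<html><head>
<meta name="description" content="Old page about widgets">
</head><body>
<h1>Widgets</h1>
<img src="widget1.jpg"><p>Some text</p>
</body></html>"""

extractor = VariablePartExtractor()
extractor.feed(sample)
print(extractor.meta_description)  # Old page about widgets
print(extractor.headers)           # ['Widgets']
print(extractor.images)            # ['widget1.jpg']
```

In HtmlAgilityPack the equivalent would be XPath queries like `//meta[@name='description']` against an `HtmlDocument`, one extraction routine per group of similar pages.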
EDIT3: Another approach that might be helpful is to write code that writes the pages for me. I just need to cut the variable parts out of the originals, put them into a data file, and have the generator read that file and write the new versions.
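That data-file-driven idea can be sketched as a template plus one record per page. This is a minimal Python sketch using the standard library's `string.Template`; the placeholder names (`title`, `description`, `sections`) and the JSON layout are assumptions for illustration, not the asker's actual schema.

```python
import json
from string import Template

# Hypothetical skeleton of the new CSS-based layout. Everything that
# is identical across pages lives here; only the $placeholders vary.
PAGE_TEMPLATE = Template("""<html>
<head>
<meta name="description" content="$description">
<link rel="stylesheet" href="site.css">
</head>
<body>
<h1>$title</h1>
$sections
</body>
</html>""")

# One repeatable heading-image-text chunk, since pages differ in
# how many of these groups they contain.
SECTION_TEMPLATE = Template(
    '<div class="section"><h2>$heading</h2>'
    '<img src="$image" alt="$heading"><p>$text</p></div>')

def render_page(record):
    """Build one new page from one record in the data file."""
    sections = "\n".join(
        SECTION_TEMPLATE.substitute(s) for s in record["sections"])
    return PAGE_TEMPLATE.substitute(
        title=record["title"],
        description=record["description"],
        sections=sections)

# One record per page, as it might look in a JSON data file.
data = json.loads("""[
  {"title": "Widgets",
   "description": "All about widgets",
   "sections": [
     {"heading": "Big widget", "image": "big.jpg", "text": "A big one."}
   ]}
]""")

for record in data:
    html = render_page(record)
    print(html)
```

Swapping out the two templates and the data file is all it would take to reuse this for a different group of pages, which matches the "few moving parts" goal in EDIT2.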