Creating A Dynamic Picture Gallery

 

I've thought about adding some more picture galleries to my website that I can crawl for my slideshows. I put all of the image files I want into a folder, and then at the command line I enter DIR /b > pics.html to create a list of the image files. Next I run a program that reads in the file names and turns each one into a line like myPics[n]="<img src="url of image file">". Here's something I put together in Java:

import java.io.*;

class addTags
{
   public static void main(String[] args)
   {
      try
      {
         String newLine = System.getProperty("line.separator");

         // args[0] is the list of image file names (one per line, from DIR /b)
         BufferedReader br = new BufferedReader(new FileReader(args[0]));

         // args[1] is the file the myPics[] lines get written to
         BufferedWriter out = new BufferedWriter(new FileWriter(args[1]));

         String fileName;
         int cntr = 0;
         while ((fileName = br.readLine()) != null)
         {
            String index = Integer.toString(cntr++);
            out.write("myPics[");
            out.write(index);
            out.write("]=\"<img alt=\"");
            out.write(index);
            out.write("\" src=\"community-info.org/CitronGallery/");
            out.write(fileName);
            out.write("\">");
            out.write(newLine);
         }
         out.close();
         br.close();
      }
      catch (Exception e)
      {
         e.printStackTrace();   // don't swallow I/O errors silently
      }
   }
}
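
Assuming the class is saved as addTags.java, running it looks something like this (gallery.js is just a placeholder name for the output file):

javac addTags.java
java addTags pics.html gallery.js

With a made-up file name like citron1.jpg on the first line of pics.html, the loop writes out:

myPics[0]="<img alt="0" src="community-info.org/CitronGallery/citron1.jpg">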

My web crawler reads the <img> tags, and that's what I display on my web pages. That's been working out pretty well for me, but why not save some time by writing a reusable script that produces the desired html page for whatever folder I put it in? That was my thought today, and here's what I wrote (in Perl):

#!/usr/bin/perl
use warnings;
use strict;

print "Content-type: text/html\n\n";
print "<html>";
print "<head>";
print "<title>Web Page of Images</title>";
print "</head>";
print "<body>"; 
my @files = <*>;   # every file in the directory the script runs in
my $imgURL = "http://community-info.org/CitronGallery/";

foreach my $file (@files)
{
   if ($file =~ /\.jpg$/i)   # only the .jpg files, any letter case
   { 
      print '<img src="'.$imgURL.$file.'"> <br />';
   }
}
print '</body>';
print '</html>';
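
For a folder holding, say, citron1.jpg and citron2.jpg (made-up names), the page the script sends back is essentially this, all on one line since nothing prints a newline after the header:

<html><head><title>Web Page of Images</title></head><body><img src="http://community-info.org/CitronGallery/citron1.jpg"> <br /><img src="http://community-info.org/CitronGallery/citron2.jpg"> <br /></body></html>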

 

The only line of code I have to change for a different folder is my $imgURL = "whatever virtual address I've assigned to that folder on my web server". I'm not going back to change the html pages I've already created, but if I ever add image files to one of those folders, I'll just replace its html file with the Perl script above (and have the crawler request the script instead of the html file). Going forward, I'll use the Perl script for any new image folders.
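
My crawler itself hasn't changed, but just to sketch the idea (this isn't the crawler's actual code, and the script name gallery.pl and the URL below are placeholders), fetching the page the Perl script generates and pulling the src out of each <img> tag could go something like this:

#!/usr/bin/perl
use warnings;
use strict;
use LWP::Simple;                # assumes the libwww-perl module is installed

# Placeholder URL -- wherever the gallery script ends up being served from.
my $galleryURL = "http://community-info.org/CitronGallery/gallery.pl";

my $page = get($galleryURL);
die "Couldn't fetch $galleryURL\n" unless defined $page;

# Grab the src attribute out of every <img> tag on the returned page.
while ($page =~ /<img\s+src="([^"]+)"/gi)
{
   print "$1\n";
}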

 

 
