This is a home site, designed and created by Rose Nichols, Mehreen Hanif, Martin Francis and Michael Collins, for Data, Schemas and Applications, Coursework 1. There are five main components: Maps, Flickr, Weather, News and Twitter. The Maps section is divided into two components: one with a traditional map view, and the other with a satellite view.
Hours: 28
Report: The first thing the group had to decide on was the locations of the twin towns we were going to feature. This was no small task, as information needed to be available for every element of the site; in particular, we checked whether a weather station and a news source were available for each region. The main problem with towns twinned in the UK is that most are twinned with places in France, and the language difference became a barrier when looking for news sources. We settled on two sets of twin towns, each comprising a town in the UK and a town in the US.
The module of the site that I took the lead in developing was the weather section; I also assisted with the Flickr section. To begin, I revisited the weather station example I had completed in the DSA tutorial and tested the program with the URLs for the group's twin towns. I already had the URLs for the clientraw.txt files from the earlier time spent gathering resources, so I was able to quickly establish that the files would work. The problem I then faced was that the program, in its original form, extracted a parameter from the page's URL and appended '/clientraw.txt' to it to form the URL of the weather station's data file. This meant the page could only display weather information for one town at a time. I tried removing the code that extracted the information from the URL and replacing it with code that did essentially the same thing but read the necessary URL from variables. I knew what I wanted to do in theory, but I couldn't get it to work in practice. I then created a PHP config file (dsa_config.php), following the example Prakash gave us, and put the entire URL for each clientraw file into it. This separation made it easier to work out which code needed to go where.
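The dsa_config.php idea described above can be sketched as follows. The station hosts are placeholders, not the group's real weather-station addresses:

```php
<?php
// Hypothetical sketch of dsa_config.php: one entry per town, each holding
// the full clientraw.txt URL. The hosts below are placeholders.
$weather_stations = array(
    'Winchester, UK' => 'http://uk-station.example.com/clientraw.txt',
    'Winchester, VA' => 'http://va-station.example.com/clientraw.txt',
);
```

Keeping the full URLs in one place means the utility code no longer needs to build them from a page parameter.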
In the utility file (dsa_utility.php) I added my existing function and edited it so that it pulled the weather URL out of the config file before acting on it. The data from the clientraw files is exploded on the space character and fed into an array, from which I can pull out just the pieces of information I need, using constants to give a name to each position in the data. The degree_to_compass_point function converts wind direction from degrees into a more readily understandable form, namely compass points. The get_weather_data function is called from the main page of the site. The site has two sets of config files, Martin having used XML and the other group members PHP; I am aware it would have been an advantage if we had all used XML, as this would have been more versatile.
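A sketch of the two utility functions described above. Field positions in clientraw.txt vary between station set-ups, so the index constants here are assumptions, not the group's actual values:

```php
<?php
// Assumed field positions in the space-separated clientraw.txt line.
define('WIND_DIRECTION', 3);
define('OUTSIDE_TEMP', 4);

// Convert a bearing in degrees to one of the 16 compass points.
function degree_to_compass_point($degrees) {
    $points = array('N', 'NNE', 'NE', 'ENE', 'E', 'ESE', 'SE', 'SSE',
                    'S', 'SSW', 'SW', 'WSW', 'W', 'WNW', 'NW', 'NNW');
    // 360 degrees / 16 points = 22.5 degrees per point
    return $points[(int) round($degrees / 22.5) % 16];
}

// Fetch a station's clientraw.txt, explode it on the space character,
// and pull out just the fields we need by their named positions.
function get_weather_data($url) {
    $data = explode(' ', trim(file_get_contents($url)));
    return array(
        'wind_direction' => degree_to_compass_point((float) $data[WIND_DIRECTION]),
        'temperature'    => (float) $data[OUTSIDE_TEMP],
    );
}
```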
Language written: PHP
Location
Purpose: Contains data for weather stations, Flickr images and Twitter feeds.
Problems: The PHP file works perfectly adequately; however, putting the data contained in this file into an XML file would have been a better solution.
Language written: PHP
Location
Purpose: Lists the weather data obtained from clientraw.txt files.
Problems: I couldn't get the degree symbol to display; even when I used the character entity, the page printed the entity text literally instead of converting it to the degree symbol.
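One possible cause of this, an assumption rather than something confirmed from the original code, is that the string containing the entity was passed through htmlspecialchars(), which escapes the ampersand so the browser shows the literal text:

```php
<?php
// Assumed cause: escaping the whole string turns '&deg;' into '&amp;deg;',
// so the browser shows the literal text rather than the degree symbol.
$temp = 21.5;
echo htmlspecialchars($temp . '&deg;C') . "\n";  // page shows: 21.5&deg;C
echo htmlspecialchars($temp) . '&deg;C' . "\n";  // page shows: 21.5°C
```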
Hours: 10 (below expected hours)
Report: I started by helping Rose search for twin towns. This was hard because many UK towns are twinned with towns in France, and we wanted English-speaking pairs so that we would not have to translate French news into English. We therefore searched for towns in the UK that were twinned with towns in the USA and that also had weather stations and news channels. Several candidate pairs turned out to have no weather station, so we had to keep searching until we found towns that had weather stations, news channels, Twitter feeds and Flickr images. We settled on Winchester, UK; Winchester, VA; Hackney, London; and Long Island, New York.
Once the towns were finalised, I started gathering Flickr images for them. If I searched for just "Long Island, New York", the results were a mix of everything, for example lakes and houses, whereas I particularly wanted landmarks or the town centre. I made an account with Flickr so I could get hold of an API key, but in the end we didn't use the API key; we used Flickr's built-in public search feed instead. I looked at the example Prakash uploaded to Blackboard, used its configuration file to see how he had set out the Flickr URLs, and edited them for the towns we are using. In the configuration file there is an array for each town, holding the Flickr URL that fetches the images. In the dsa1.php file there is a section for Flickr whose code displays eight images for each town. In the utility file, the Flickr functions fetch images from the Flickr Atom feed and return them as an array of image links: the code matches any image with a .jpg extension, builds the image link and saves it into the array. One problem we came across was that Flickr returned a great many images, so we used a for loop to limit the number displayed.
Image size was another big problem, because each town's images had to fit in a div only 300px wide. When Flickr first displayed the images they were very large and broke the layout, so we specified a size for each image and made them small enough to fit the div. The layout of the website didn't allow us to display the images any bigger than 80px; otherwise we could have made a separate area to display them using Lightbox. The images aren't clickable because Flickr returns an Atom feed rather than an RSS feed, and we couldn't navigate the Atom file because we are more familiar with RSS. In future we will try to use Atom feeds.
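The Flickr steps above can be sketched as follows. The public-feed URL and tag are assumptions based on Flickr's public photo feed, not the group's exact configuration:

```php
<?php
// Pull every .jpg link out of the feed text and keep at most $max of them.
function extract_jpg_links($feed, $max = 8) {
    preg_match_all('~https?://[^"\'<>\s]+\.jpg~', $feed, $matches);
    return array_slice(array_unique($matches[0]), 0, $max);
}

// Fetch the Atom feed and hand it to the extractor; returns an empty
// array if the feed cannot be fetched.
function get_flickr_images($feed_url, $max = 8) {
    $feed = @file_get_contents($feed_url);
    return $feed === false ? array() : extract_jpg_links($feed, $max);
}

// Usage: at most eight thumbnails, sized at 80px to fit the 300px div.
$feed_url = 'https://www.flickr.com/services/feeds/photos_public.gne?tags=winchester';
foreach (get_flickr_images($feed_url) as $img) {
    echo '<img src="' . htmlspecialchars($img) . '" width="80" alt="Flickr photo" />';
}
```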
Language written: PHP
Location
Purpose: Searches through publicly available photos on Flickr and picks off relevant ones based on keyword tags. Only 8 images are displayed for each town.
Problems: I had trouble getting the PHP to pick off the link to the photos on the actual Flickr site.
Hours: 30
Report: I set up the HTML page structure as a conventional three-part page with a head section, a main section div and a footer div. I immediately made the head and footer sections into PHP includes, freeing up space on the page and making future development easier, which left me to concentrate on just the main content elements. The main div is split into smaller divs so that each member of the group can populate their allocated div, making the code easier to read.
I did the map and Twitter components. First I followed the PHP and KML tutorial in the student wiki, which pointed to the Google Code KML tutorial. I wrote a simple KML file in Notepad++ that included a relevant image (held on my server, so linked with an absolute URL) for each of the four locations. To get the longitude and latitude coordinates I looked them up in Google Earth, and to convert them to decimal form I used http://www.fcc.gov/mb/audio/bickel/DDDMMSS-decimal.html. I then entered the KML file's URL into the search box in Google Maps, which read the KML correctly and returned the four locations as expected. I extracted just the link from the embed code that Google provides, because the rest of the code does not validate; I knew this from other projects, having originally learnt from W3Schools how to get embed code to validate. I then used the link as an attribute of an object tag and included this in my page. This worked, but it was not the right way to do it. The proper way is to hold the unique data in an XML config file and keep the server-side script generic, so that it can consume any config file with the same schema but different data; conversely, the server-side script could change but still pull data from the same XML config file. This is a layered-architecture approach with a separation of concerns.
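A minimal hand-written KML file of the kind described above might look like this; the coordinates and image URL are illustrative, not the group's real values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Winchester, UK</name>
      <!-- image held on the author's server, linked absolutely -->
      <description><![CDATA[<img src="http://example.com/winchester.jpg" />]]></description>
      <Point>
        <!-- KML coordinate order is longitude,latitude[,altitude] -->
        <coordinates>-1.308,51.063,0</coordinates>
      </Point>
    </Placemark>
    <!-- one Placemark per town, four in total -->
  </Document>
</kml>
```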
So what I did next was use a PHP script to gather the relevant data from the XML config file and create the KML file. I adapted the XML file provided by Chris Wallace to write my XML config file, and the PHP script supplied by Chris to generate the KML from it. I posted the generated KML file on the server and entered its absolute link into the KML validator at http://www.kmlvalidator.com/validate.htm. For the page titles, as taught in DSA lecture 5, I used simpleXML tree navigation to drill down into the XML config file and grab the title for the h1 of the page, and a for loop to extract the names of each of the four stations. Lastly I added the Twitter component, basing the code on Prakash's Yahoo example.
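The config-to-KML step can be sketched like this. The XML element names (title, station, name, lon, lat) are assumptions about the config schema, not the actual file's layout:

```php
<?php
// Build a KML document from a simpleXML config tree. Element names are
// assumed; the real config file may use a different schema.
function config_to_kml(SimpleXMLElement $config) {
    $kml = '<?xml version="1.0" encoding="UTF-8"?>'
         . '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>';
    foreach ($config->station as $station) {
        $kml .= '<Placemark>'
              . '<name>' . htmlspecialchars((string) $station->name) . '</name>'
              . '<Point><coordinates>'
              . $station->lon . ',' . $station->lat . ',0'  // KML wants lon,lat
              . '</coordinates></Point>'
              . '</Placemark>';
    }
    return $kml . '</Document></kml>';
}

// In the real page this would be simplexml_load_file() on the config file;
// an inline string keeps the sketch self-contained.
$config = simplexml_load_string(
    '<config><title>Twin Towns</title>'
    . '<station><name>Winchester, UK</name><lon>-1.308</lon><lat>51.063</lat></station>'
    . '</config>');
echo '<h1>' . htmlspecialchars((string) $config->title) . '</h1>';  // simpleXML tree navigation
echo config_to_kml($config);
```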
Language written: XML
Location
Purpose: Holds geographic location data, and data for the titles
Problems: Originally I held more specialised data, including images of the places, but I realised this data caused errors when I pulled it in using config_to_kml.php. I therefore cut the config file down to just the essential data, such as links and location data.
Language written: XML
Location
Purpose: Produces the code to generate KML from the XML data provided by the stations.
Problems: Early on in the project I hard-coded the KML file, but realised later that this did not fulfil the brief, so this file is no longer used.
Language written: HTML
Location
Purpose: Merges all components and sub-pages and displays formatted content.
Problems: I have been the lead on this project but unfortunately the least available, due to working; this has meant that we've each done our components separately. The result is that we have ended up with three config files, one in XML and two in PHP. By adding the Twitter feed I hope to demonstrate that I understand the feed method from Prakash that Rose and Mehreen used, as well as using the XML config file for the page titles and map.
Language written: HTML
Location
Purpose: Merges all components and sub-pages and displays formatted content.
Problems: This page's contents will change with a change of config file, but the Google Map will not. I would have liked to have gone a stage further and added the results of this page, http://maps.google.co.uk/maps?q=http://www.cems.uwe.ac.uk/~mfrancis/dsa/config_to_kml.php, as an embed on the index page, but I wasn't able to do it without the JavaScript code, so I had to settle for using the map link generated by Google.
Language written: PHP
Location
Purpose: Calls the four relevant Twitter feeds.
Problems: I tried several different ways to get this to work. At home, using my own server, I used the new SimpleXMLElement($xml) method to call in the feeds and it worked perfectly; however, at UWE it did not, because of a problem getting past the UWE proxy. I then tried an AJAX feed using code from www.dynamicdrive.com, which I have used for other assignments and which is not affected by this problem, but it only partially worked, and I found out that Twitter is blocked in some rooms at UWE. Finally I adapted Prakash's Yahoo news feed, added the component to Rose and Mehreen's files, and this worked.
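For reference, one way around the proxy problem is to route PHP's HTTP wrapper through an explicit proxy via a stream context. This is a hedged sketch, and the proxy address is a placeholder, not UWE's real proxy:

```php
<?php
// Fetch a feed through an explicit HTTP proxy and parse it with simpleXML.
// Returns null if the fetch fails. The $proxy value is a placeholder,
// e.g. 'tcp://proxy.example.ac.uk:8080'.
function fetch_feed_via_proxy($url, $proxy) {
    $context = stream_context_create(array(
        'http' => array(
            'proxy'           => $proxy,
            'request_fulluri' => true,  // some proxies require the full URI
        ),
    ));
    $raw = @file_get_contents($url, false, $context);
    return $raw === false ? null : simplexml_load_string($raw);
}
```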
Hours: 15
Report: The task the group gave me was to create a news feed for the DSA1 assignment, covering all four of the towns we had chosen. To get the news RSS feed I visited http://www.bbc.co.uk and used the news search form to find an XML news feed for the first town, Winchester in Hampshire: http://newsrss.bbc.co.uk/rss/local/hampshire/front_page/rss.xml. With the XML details at hand, I wrote a PHP script using simplexml_load_file to extract the news items required. As per the assignment instructions I also created a config file containing the links to the XML files, and got the PHP script to access that file and then extract items from it.
The second feed covers Winchester, VA. I visited many sites, such as google.com and yahoo.com, searching for a news XML link, and eventually arrived at http://feeds.bbci.co.uk/news/scienceandenvironment/rss.xml, where once again I used the PHP script to extract each item's link, title and description. All of this was done in Notepad++. The third feed is local news, so I went back to bbc.co.uk and searched for a Hackney news feed: http://newsrss.bbc.co.uk/rss/local/london/front_page/rss.xml. The PHP script acts as a template, so all I needed to do was change the feed URL and extract the items needed, and I used the same procedure for the remaining feeds.
As a group we had to make sure each member's work would come together as one project. We had a page made with an individual div allocated to each member to fill with their script, after which we tested everything both locally and remotely on a server to make sure it worked to specification.
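The news-feed template described above can be sketched as follows; only the feed URL changes between towns:

```php
<?php
// Pull each item's title, link and description out of a parsed RSS feed,
// up to $max items.
function extract_news_items(SimpleXMLElement $rss, $max = 5) {
    $items = array();
    foreach ($rss->channel->item as $item) {
        if (count($items) >= $max) {
            break;
        }
        $items[] = array(
            'title'       => (string) $item->title,
            'link'        => (string) $item->link,
            'description' => (string) $item->description,
        );
    }
    return $items;
}

// Feed URL taken from the report; @ suppresses the warning if offline.
$rss = @simplexml_load_file('http://newsrss.bbc.co.uk/rss/local/hampshire/front_page/rss.xml');
if ($rss !== false) {
    foreach (extract_news_items($rss) as $item) {
        echo '<h3><a href="' . htmlspecialchars($item['link']) . '">'
           . htmlspecialchars($item['title']) . '</a></h3>'
           . '<p>' . htmlspecialchars($item['description']) . '</p>';
    }
}
```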
Language written: PHP
Location
Purpose: To extract the RSS news feeds for our four towns in the DSA1 assignment and display the current news when the page is loaded.
Problems: The one main problem with creating the news-feed component was the difficulty of finding websites that provide RSS feeds by location. I could not find an RSS news feed for one of the towns in our project, as the listed feed was for the county. I solved this by putting the town name in the query string of the URL, which returned the XML data needed for the town.
Total Hours: 83
Total Team Members: 4