Developing a custom content system

We’ve worked with Practical Action for many years developing their site and online presence, but back in 2010 they took the bold step to completely overhaul their site. This naturally involved considering both changes to the design and the system on which the site was built. After considering a few different systems the team decided on a bespoke path that led to the creation of a custom Content Management System that really establishes new foundations for the site to develop over the coming years.

Starting with the redesign

The design process was exciting, and working on such a large site was always going to be a challenge. The core challenge, as ever, is to find a way to effectively deliver all the content through an interface that pleases the visitor, helping them to clearly find and read content.

The designs needed to streamline the flow of content, simplify the navigation and still present a lot of material. Lots of time had to be given to planning content layout, and we managed to nestle a lot of featured content into carousels on the home page and on all the category home pages, as well as embracing a tab system for breaking up in-page content into usable groups. It helped that Practical Action had commissioned a report on their old site which provided some great recommendations, including how menus should be constructed (one tier only for main menus). This helped direct early decisions, and after a number of design concepts the final design was born.

CMS Highlights

We can’t go into too much detail about all the tech behind the CMS (that will follow in future posts), but here’s a snapshot of some of the features that have been created as part of this solution:

XML at the core

The key objective was to build a system that was not restricted to the web and would stay flexible for the future. To achieve this we needed to move away from the standard web editors, so we devised a way to create content in XML. In essence we’ve developed a publishing system, rather than a web content management system. Editors write content which is stored as XML; our CMS, depending on the chosen template, can then translate the XML into whatever medium the end user requests. Currently the site is only delivering the content as web pages (HTML); however, given the XML system underneath, this content could just as easily be delivered to mobile browsers, used to create PDFs ready for print or screen, or even syndicated to third-party services.
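
To illustrate the idea (a minimal sketch, not the CMS’s actual code), the stored XML could be run through a different stylesheet per template and output medium. Here’s what that might look like using PHP’s XSLTProcessor; the file names and the use of XSLT itself are assumptions:

<?php
// A minimal sketch, assuming one XSL stylesheet per template and output medium.
// File names, paths and the XSLT approach are illustrative only.

$xml = new DOMDocument();
$xml->load('content/article-1234.xml');   // the content as the editor wrote it

$xsl = new DOMDocument();
$xsl->load('templates/web/article.xsl');  // swap for templates/pdf/article.xsl etc.

$processor = new XSLTProcessor();
$processor->importStylesheet($xsl);

echo $processor->transformToXML($xml);    // HTML for the web, or whatever the stylesheet emits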

Reusable Content

Content is an asset and shouldn’t be locked to a page, so given our ‘publishing’ mindset we needed to disconnect ‘pages’ from ‘content’. The system allows content creators to work their magic using the editor interface as with any CMS, but to publish a piece of content they must assign it to the site map so that it can be pulled through to the site. This ‘node’ architecture enables content to be repurposed in different parts of the site, allowing different ‘leaves’ or pages to use the same content. In future this same content could be used by a mobile sitemap, or even by a PDF system – allowing content to be written once and automatically distributed to multiple platforms.
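
To picture the ‘node’ idea, here is a rough sketch of pages holding only a reference to a shared content record; the array structure and field names are invented for illustration and are not the CMS’s actual schema:

<?php
// A minimal sketch: sitemap leaves reference a content record rather than owning it,
// so the same record can back several pages. Field names are invented.

$contents = array(
    501 => array('title' => 'Solar cookers', 'xml' => '<article>...</article>'),
);

$sitemap = array(
    '/our-work/solar-cookers'     => array('content_id' => 501, 'template' => 'article'),
    '/publications/solar-cookers' => array('content_id' => 501, 'template' => 'listing'),
);

function renderLeaf($path, array $sitemap, array $contents) {
    $node   = $sitemap[$path];
    $record = $contents[$node['content_id']];                  // the same record, wherever it is linked
    return '[' . $node['template'] . '] ' . $record['title'];  // stand-in for the real templating step
}

echo renderLeaf('/our-work/solar-cookers', $sitemap, $contents), "\n";
echo renderLeaf('/publications/solar-cookers', $sitemap, $contents), "\n";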

Cross Linking Content

There’s no point having loads of content unless visitors can easily get to it, so another key goal of the new system was to provide a flexible way for content creators to build lists of relevant content from their article.

We developed a pretty powerful query builder that allows editors to create a list of content based on all sorts of criteria. They can list content from a particular branch of the sitemap or list content that matches certain keywords.

Practical Action’s content is also tagged with information such as language, content type (case studies, technical briefs, annual reports etc.) and audience (development practitioners, teachers, MEPs/MPs etc.), all of which is available to the query builder, allowing the editor to build lists of content relevant to the reader.
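
As a sketch of how such criteria might translate into a query (the field names and the CakePHP-style conditions array below are illustrative; the real builder’s schema will differ):

<?php
// A minimal sketch of turning an editor's criteria into query conditions.
// Field names are invented; the commented find() call assumes a CakePHP model.

$criteria = array(
    'sitemap_branch' => '/our-work/energy',
    'keywords'       => array('solar', 'cooking'),
    'content_type'   => 'case-study',
    'audience'       => 'teachers',
    'language'       => 'en',
);

$conditions = array(
    'Content.path LIKE' => $criteria['sitemap_branch'] . '%',
    'Content.type'      => $criteria['content_type'],
    'Content.audience'  => $criteria['audience'],
    'Content.language'  => $criteria['language'],
);
foreach ($criteria['keywords'] as $keyword) {
    $conditions['OR'][] = array('Content.keywords LIKE' => '%' . $keyword . '%');
}

// Inside the CMS this would feed a model query, e.g.:
// $related = $this->Content->find('all', array('conditions' => $conditions, 'limit' => 10));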

This feature is also being developed to automatically suggest content based on what the visitor is reading, much like how a shop suggests other products that you might be interested in.

Previewing Media

It was important that fixed media, i.e. PDFs, Word documents etc., could be previewed from within the site, rather than requiring the visitor to download the file to view it.

We found a document viewer and built it into the CMS in a way that automatically converts media files into a format suitable for the viewer, allowing visitors to scroll through the media content from within the website, with the option to download it if it’s what they are after.
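
A rough sketch of the conversion step, assuming a command-line renderer such as pdftoppm (from poppler-utils) is available on the server; the real CMS’s viewer may use a different target format and tool entirely:

<?php
// A minimal sketch: render each page of an uploaded PDF to PNG for an in-page viewer.
// Assumes pdftoppm is installed; paths are placeholders.

function convertPdfForViewer($pdfPath, $outputDir) {
    if (!is_dir($outputDir)) {
        mkdir($outputDir, 0755, true);
    }
    $prefix = $outputDir . '/page';
    $cmd = sprintf('pdftoppm -png -r 96 %s %s', escapeshellarg($pdfPath), escapeshellarg($prefix));
    exec($cmd, $output, $status);                           // writes page-1.png, page-2.png, ...
    return $status === 0 ? glob($outputDir . '/page-*.png') : array();
}

$pages = convertPdfForViewer('/var/www/uploads/annual-report.pdf', '/var/www/viewer/annual-report');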

Currently only PDFs are previewed through the viewer, but plans are already underway to extend this to support other media files.


Delivering a scalable solution

The previous site was getting around 80,000 visitors a month and had thousands of pages, so this was a large site to begin with. Our solution needed to account for the growth plans of the client (i.e. delivering a system that could be easily managed and facilitate the new content) whilst also supporting the increased traffic and keeping the site quick to load. To do this we needed to embrace the latest tech and tools available to us to deliver a responsive site that would provide the foundations for future development. We knew from the outset that we’d be building the system on CakePHP, a leading rapid development framework. Well-formed code wasn’t enough though, so we embraced the cloud technologies at Amazon to deliver the site on a load-balanced hosting platform. A complex caching system was devised to keep request times down, and that cache is synchronized across all servers in the cluster. All amazing tech we will write more about, but that’s just the start…
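
As one example of the kind of setup involved (a sketch only, not the site’s actual configuration), CakePHP’s cache can be pointed at a shared backend such as memcached so every server in the cluster reads and writes the same cache:

<?php
// A minimal sketch of a cluster-friendly cache: a shared memcached backend
// configured through CakePHP's Cache::config(). Host names are placeholders.

Cache::config('default', array(
    'engine'   => 'Memcache',
    'duration' => '+1 hour',
    'prefix'   => 'site_',
    'servers'  => array('cache-1.internal:11211', 'cache-2.internal:11211'),
));

// Typical usage in the application:
// $page = Cache::read('page_' . $slug);
// if ($page === false) {
//     $page = renderPage($slug);        // hypothetical helper
//     Cache::write('page_' . $slug, $page);
// }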

Delivering content around the world

The site carries a great deal of media that is accessed by users across the world, so it seemed only sensible to build the CMS on a Content Delivery Network (CDN). When a user publishes media from within the CMS, the system pushes a copy of that file to the CDN. The website, when listing media, uses the CDN version of the file, allowing the visitor to download it from a web server in their region (rather than having to transfer that data all the way from the UK). Essentially the CDN means there are copies of the media on servers across the world, and when a user wants a file Amazon serves it from the location closest to them, so the download is quick. We’ve also embraced the CDN’s access control via the CMS, so certain documents are only available for download when a visitor is logged in – very slick. Needless to say, any images and files used by the site’s core code are all delivered via the same CDN to ensure load times are kept snappy.
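
A sketch of the publish-to-CDN flow using the AWS SDK for PHP (the bucket, distribution and key names are placeholders, and the original build predates this SDK version, so treat it purely as an illustration of the approach):

<?php
// A minimal sketch: push a published file to S3, and issue a short-lived signed
// CloudFront URL for documents restricted to logged-in visitors.

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\CloudFront\CloudFrontClient;

$s3 = new S3Client(array('region' => 'eu-west-1', 'version' => 'latest'));
$s3->putObject(array(
    'Bucket'     => 'example-media-bucket',
    'Key'        => 'documents/annual-report.pdf',
    'SourceFile' => '/var/www/uploads/annual-report.pdf',
));

// For restricted documents, hand the visitor a signed URL instead of the public CDN path.
$cloudFront = new CloudFrontClient(array('region' => 'eu-west-1', 'version' => 'latest'));
$signedUrl = $cloudFront->getSignedUrl(array(
    'url'         => 'https://d111111abcdef8.cloudfront.net/documents/annual-report.pdf',
    'expires'     => time() + 300,
    'private_key' => '/etc/keys/cloudfront-pk.pem',
    'key_pair_id' => 'APKAEXAMPLE',
));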

Creating little workers to bear the load

There’s so much going on behind the scenes of the CMS that we want to create a window into the world of “little helpers”, so users can see everything that gets carried out to keep the site blazing fast. The principle behind developing a process queue was that we didn’t want administrators to have to wait while the CMS finishes everything it needs to do when a page is published. So from the user’s perspective they click a button and see a response – job done. But behind the scenes the CMS’s little workers busy away to carry out the following (a simplified sketch of the queue follows the list):

  • Create the necessary thumbnail images (used on the listings) for the content
  • Add the content information to the search index
  • Push any associated media to the Content Delivery Network (CDN)
  • Convert any associated media to the format required for the site’s document viewer
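
A simplified sketch of the queue idea (the jobs table, credentials and task names below are placeholders, not the CMS’s actual implementation):

<?php
// A minimal sketch: publishing only records jobs, and a separate worker
// (run from cron or as a long-running process) works through them afterwards.

$db = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');   // placeholder credentials

// Called when the editor hits publish - this is all the browser waits for.
function enqueue(PDO $db, $task, $contentId) {
    $db->prepare("INSERT INTO jobs (task, content_id, status) VALUES (?, ?, 'pending')")
       ->execute(array($task, $contentId));
}

// The worker loop, run outside the web request.
function work(PDO $db) {
    while ($job = $db->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch()) {
        // The real tasks - thumbnails, search indexing, CDN push, viewer conversion -
        // would be dispatched here; echo stands in for them.
        echo "Processing {$job['task']} for content {$job['content_id']}\n";
        $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
    }
}

enqueue($db, 'thumbnails', 1234);
enqueue($db, 'cdn_push', 1234);
work($db);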

Migrating from old to new

What’s a shiny new site with all these great features if there’s no content? Probably one of the largest and most difficult tasks of the project was to import the existing content from the old site to the new, as well as import all the documents and carry across existing user accounts. To give you some idea of the scale, here are some approximate figures:

  • 4,500+ web pages
  • 3,500+ documents (PDFs, Word documents etc.)
  • 64,000+ user accounts

The process wasn’t without its glitches; cleaning the old data and preparing it in a format for the new system was quite time-consuming. Remember we were moving from a traditional CMS that had allowed users to create content in HTML, to a new system that wanted to remove that dependency and reformat content as XML. We also needed to ensure that old content would be presented in the new layouts, so we devised an import system that allowed the administrators to pick an old page and convert it into a new page. This involved selecting one of the new templates available and assigning old content to the various parts of the new page. Let’s not forget the sheer time taken to process all the documents and ensure they were uploaded and synchronized with the CDN correctly. We were even able to recreate the user accounts on the new CMS without having to reset any of their credentials, so everyone could log in with their existing details – removing a potential blocker to them accepting the new system.
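
To give a flavour of the reformatting step (a sketch only; the target element names are invented and the real content schema is far more involved):

<?php
// A minimal sketch: pull the text out of an old HTML page body and re-express it
// as structured XML for the new system. Element names are invented.

function htmlToContentXml($html) {
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);                       // old pages are rarely valid markup
    $doc->loadHTML('<?xml encoding="UTF-8">' . $html);
    libxml_clear_errors();

    $out  = new SimpleXMLElement('<article/>');
    $body = $doc->getElementsByTagName('body')->item(0);
    foreach ($body->childNodes as $node) {
        if ($node->nodeName === 'h2') {
            $out->addChild('heading', htmlspecialchars(trim($node->textContent)));
        } elseif ($node->nodeName === 'p') {
            $out->addChild('paragraph', htmlspecialchars(trim($node->textContent)));
        }
    }
    return $out->asXML();
}

echo htmlToContentXml('<h2>About us</h2><p>Practical Action works with communities worldwide.</p>');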


Check out the site for yourself: http://practicalaction.org

Building Magento around Booksonix

Practical Action Publishing approached FLUID7 to redesign their Development Bookshop. It was an exciting opportunity to move them off an ageing osCommerce site and onto a shiny new Magento install. To really spice things up we wanted to connect their shop with their book management software, Booksonix, creating a slick solution that pulls new and updated titles from Booksonix into Magento automatically and cuts out the need for the team to manage titles in both applications independently.

You can check out the new bookshop here: http://developmentbookshop.com/


Redesigning the Bookshop

The new design pulled in core styles that had been created for the recent group site redesign for Practical Action, presenting enough similarities in the layout and navigation to reinforce the group connection, whilst a unique colour scheme and content elements let it stand out from the group and sit comfortably alongside its bookshop peers.

The Magento store offers Practical Action Publishing a firm ecommerce platform from which to promote and distribute their titles. It includes specific listings for each of the publishers they sell titles on behalf of, such as Oxfam. Not only can titles be viewed exclusively by publisher, but standard category listings can also be filtered by publisher, allowing the visitor to navigate the site in a number of different ways to find the titles of interest.

With the designs in place we turned our attention to the build. Anyone working with Magento knows it’s a real beast to wrestle, but once tamed it yields exceptional results. Having developed a number of Magento sites for our clients, we’ve already overcome the hosting issues: Magento is very large and requires a hefty server to run quickly, so we’ve got a sweet server over at Amazon specifically tuned for our Magento sites. Yet the challenge of importing and managing data from Booksonix was to prove pretty tricky.


Importing data from Booksonix

Early on we discovered that Booksonix didn’t have a live XML feed; instead they FTP an XML file to your server on a nightly basis. So instead of being able to use a nice XML module already created for Magento imports, we had to write a custom importer that picks up any un-imported XML files (and their associated jacket images) and pushes all that data into Magento.

Booksonix XML consists of a range of shortcodes representing all the fields managed by their book management software. It took some time to determine which fields were relevant to the client’s store-front requirements, and with a little processing we were able to format the data and assign it to fields in our Magento site. Probably the hardest challenge was developing a way for the client to manage the store categories from within Booksonix. We were able to utilise Booksonix’s custom fields to allow the client to assign titles to categories; when the importer runs it will now create a new Magento category (if it doesn’t already exist) and assign all corresponding titles to it.
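
The shape of the importer, sketched against the Magento 1 API. The XML element names below are invented stand-ins for Booksonix’s shortcodes, and required product attributes (stock, tax, visibility) are glossed over, so treat it as an outline rather than the production importer:

<?php
// A minimal sketch of the nightly import: read a dropped Booksonix XML file and
// create or update one Magento product per title. Element names are placeholders.

require_once 'app/Mage.php';
Mage::app('admin');

$feed = simplexml_load_file('/var/imports/booksonix/latest.xml');

foreach ($feed->title as $title) {
    $isbn    = (string) $title->isbn;                       // placeholder element names
    $product = Mage::getModel('catalog/product')->loadByAttribute('sku', $isbn);

    if (!$product) {
        $product = Mage::getModel('catalog/product')
            ->setSku($isbn)
            ->setAttributeSetId(4)                          // default attribute set
            ->setTypeId('simple')
            ->setWebsiteIds(array(1));
    }

    $product->setName((string) $title->name)
            ->setPrice((float) $title->price)
            ->setDescription((string) $title->blurb)
            ->setStatus(1);

    // Categories driven from the Booksonix custom fields would be looked up or
    // created here (Mage::getModel('catalog/category')) and attached via
    // $product->setCategoryIds(...).

    $product->save();
}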

It took a little time to run the first few imports given the sheer volume of data (and image files) that had to be processed, but after a lot of tweaking our importer now runs daily, reading the files from Booksonix and importing anything that is new into Magento.


Managing the Magento Store

Whilst the importing of data into a Magento store dramatically enhances the efficiency of the site, there’s still a fair amount that needs to be handled manually within the Magento environment. Core Magento features allow the client to select featured titles and the title of the month, leaving bestsellers to be driven by sales. Practical Action Publishing are also responsible for managing offers and processing sales, but the beauty of this solution is that they can now focus on selling the titles, with far less time spent managing the data behind the products.


This article doesn’t do the coding behind this solution any justice. We ourselves certainly underestimated the challenge of connecting Booksonix with Magento, but we are extremely proud of our achievements (and grateful to the client for bearing with us as we worked through the issues). The benefit of managing data in one place (Booksonix) and having other systems (Magento) use that data automatically is a model we try to replicate wherever possible, using technology to simplify the work of our clients (and ourselves).


The Talent Business Redesign

Designer Ben Merrington got us on board to deliver his designs for thetalentbusiness.com. This redesign for The Talent Business, provider of senior talent to communications agencies and creative businesses around the globe, needed to utilise the latest web tech. The result is smooth HTML5 & CSS3 transitions and some pretty special Ajax content loading, all built on our in-house CMS.


On the outside

This really was an exciting contract to work on. Not only are the designs great, but the requirement to make the experience as slick as possible allowed us to get really geeky with the latest techniques. Take a look at the site, www.thetalentbusiness.com, to check out some of the features:

  • Smooth fades between new content, smooth rollovers on the home page, and smooth menus in the main site. It’s all very smooth…
  • When in the main site, we load all content via Ajax, meaning the browser doesn’t have to reload the entire page each time. Instead, when you click a link, you’ll notice that only the content that is changing fades in and out; the header and menu remain visible at all times. Getting really geeky, we even update the web address as the new content loads, so you always know where you are in the site.
  • We also created some neat contact cards to appear when you hover over a member of staff, allowing the core layout to remain clean whilst providing some extra contact details when required.

As well as using HTML5, CSS3 and Ajax to create the required experience, we also made sure as much of the experience as possible could be recreated using jQuery (if the visitor’s browser doesn’t support HTML5).


On the inside

Building this on our in-house CMS helped us deliver the client’s functional requirements, including:

  • Handling the profile images automatically. Our CMS auto-crops images to fit the designs and desaturates them so they appear in greyscale, meaning staff don’t need to get into Photoshop to update a profile (a rough sketch of this step follows the list).
  • Staff profiles and other core content can also be created once and used many times, allowing TTB admins to reuse content across the site without the hassle of duplicating and managing multiple copies.
  • Our CMS’s multi-theme system also allowed us to create the CREAM microsite from within the same CMS, allowing all content to be managed through one system whilst looking pretty different on the outside.
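
A rough sketch of the image step mentioned above, using PHP’s GD extension (the dimensions and paths are placeholders, not the CMS’s actual settings):

<?php
// A minimal sketch: crop an uploaded JPEG to the design's dimensions, then
// convert it to greyscale. Sizes and quality are placeholders.

function makeProfileThumb($sourcePath, $destPath, $width = 220, $height = 280) {
    list($srcW, $srcH) = getimagesize($sourcePath);
    $src = imagecreatefromjpeg($sourcePath);

    // Crop-to-fit: scale so the image covers the target box, then trim the overflow.
    $scale = max($width / $srcW, $height / $srcH);
    $cropW = (int) round($width / $scale);
    $cropH = (int) round($height / $scale);
    $cropX = (int) (($srcW - $cropW) / 2);
    $cropY = (int) (($srcH - $cropH) / 2);

    $dst = imagecreatetruecolor($width, $height);
    imagecopyresampled($dst, $src, 0, 0, $cropX, $cropY, $width, $height, $cropW, $cropH);

    imagefilter($dst, IMG_FILTER_GRAYSCALE);                // the desaturation step
    imagejpeg($dst, $destPath, 85);

    imagedestroy($src);
    imagedestroy($dst);
}

makeProfileThumb('uploads/profile.jpg', 'cache/profile-grey.jpg');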


The beauty of this project for us is that it uses advanced techniques to subtly improve the visitor’s experience without shouting too hard that, under the surface, it’s doing things pretty differently to conventional sites. If you like what you see and have a potential project, or want to get geeky and chat more about the tech involved, please throw your comments below!


Drupal – Theming to keep your modules modular

Drupal is a powerful CMS and allows us as developers to create very bespoke web sites and applications.

I tend to create a module for every website to handle its Page and Block declarations. But it’s messy, not to mention unconventional, to include HTML in your modules. I want to share how to theme your page declarations (and any other piece of HTML for that matter) to keep your modules tidy.
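
As a taste of the approach (a Drupal 6-style sketch; the module and hook names are illustrative):

<?php
// A minimal sketch: declare a theme hook in the module and render the page
// callback through it, so the markup lives in mymodule-listing.tpl.php
// (in the module folder) rather than in the module code itself.

/**
 * Implementation of hook_theme().
 */
function mymodule_theme() {
  return array(
    'mymodule_listing' => array(
      'arguments' => array('items' => array()),
      'template'  => 'mymodule-listing',
    ),
  );
}

/**
 * Page callback: no HTML here, just data handed to the theme layer.
 */
function mymodule_listing_page() {
  $items = array('First item', 'Second item');
  return theme('mymodule_listing', $items);
}
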
Continue reading “Drupal – Theming to keep your modules modular”

Magento – Add custom content layouts for CMS pages

Sometimes the design requires the main content block of your theme to have a complicated CMS page layout that strays from basic linear content (adding blocks below whatever is under the main content block). You may want various blocks placed anywhere you like, e.g. on the homepage.

This post shows a good clean way to use layout XML and PHP in your layout files to position blocks of content exactly where you want inside your main content block.
Continue reading “Magento – Add custom content layouts for CMS pages”

Drupal CSS aggregator

A couple of pointers for when you’re getting into theming Drupal the correct way, rather than just hacking around (as is most fun).

I seem to hit trouble getting the aggregator feature of Drupal working, and often end up just slapping an external CSS link into the page template.

The proper way to do it is a little long-winded, but gives us the speed optimisations offered by the aggregator facility. Instead of putting a <link … /> tag in the page.tpl.php file, use the drupal_add_css() function in your template.php file.

The best place to put it is in a function called <themename>_preprocess_page().

And here’s an example of what that function can contain…

function mytheme_preprocess_page(&$vars) {
  // Inject the theme's extra styles and scripts.
  $resetcss = drupal_get_path('theme', 'mytheme') . '/yui/build/reset-fonts-grids/reset-fonts-grids.css';
  $thickboxcss = 'misc/thickbox/thickbox.css';
  $thickboxjs = 'misc/thickbox/thickbox-compressed.js';

  drupal_add_css($resetcss, 'module', 'all', 1);
  drupal_add_css($thickboxcss, 'theme', 'all', 1);
  drupal_add_js($thickboxjs, 'theme', 'header');

  // Rebuild the aggregated CSS and JS so the additions above make it into the page.
  $css = drupal_add_css();
  $vars['styles'] = drupal_get_css($css);
  $vars['scripts'] = drupal_get_js();
}

Some other things to watch out for: make sure the path you provide to the aggregator is relative to the Drupal root, with no leading slash… I’m not helping much am I!
I mean this …
misc/thickbox/thickbox.css
as opposed to this …
/misc/thickbox/thickbox.css

Also make sure the web server has access to the files (correct permissions etc.).
I found that even pointing the aggregator at symlinks instead of the actual files caused a problem, probably to do with permissions on the real files.

Anyways .. hope that helps!

References:
http://api.drupal.org/api/function/drupal_get_css/6
http://api.drupal.org/api/function/drupal_add_js/6

osCommerce redirect error when SSL enabled and shop directory is different to NONSSL directory

I found the solution to this issue in a thread on the forums. It means making a correction to the tep_redirect() function in functions/general.php.

You need to add the DIR_WS_HTTPS_CATALOG constant into the URL transformation part…

Before:

// Redirect to another page or site
function tep_redirect($url) {
  $url = str_replace('&amp;', '&', $url); // Ampersand fix
  if ( (strstr($url, "\n") != false) || (strstr($url, "\r") != false) ) {
    tep_redirect(tep_href_link(FILENAME_DEFAULT, '', 'NONSSL', false));
  }

  if ( (ENABLE_SSL == true) && (getenv('HTTPS') == 'on') ) { // We are loading an SSL page
    if (substr($url, 0, strlen(HTTP_SERVER)) == HTTP_SERVER) { // NONSSL url
      $url = HTTPS_SERVER . substr($url, strlen(HTTP_SERVER)); // Change it to SSL
    }
  }

  header('Location: ' . $url);

  tep_exit();
}

After:

// Redirect to another page or site
function tep_redirect($url) {
  $url = str_replace('&amp;', '&', $url); // Ampersand fix
  if ( (strstr($url, "\n") != false) || (strstr($url, "\r") != false) ) {
    tep_redirect(tep_href_link(FILENAME_DEFAULT, '', 'NONSSL', false));
  }

  if ( (ENABLE_SSL == true) && (getenv('HTTPS') == 'on') ) { // We are loading an SSL page
    if (substr($url, 0, strlen(HTTP_SERVER)) == HTTP_SERVER) { // NONSSL url
      $url = HTTPS_SERVER . DIR_WS_HTTPS_CATALOG . substr($url, strlen(HTTP_SERVER)); // Change it to SSL, using the HTTPS catalog directory
    }
  }

  header('Location: ' . $url);

  tep_exit();
}

Note: I have modified the function from standard already to provide XHTML-compliant pages. You can strip the '// Ampersand fix' line.