

The huge undertaking that is Crossrail pushes the limits of engineering and technology alike, and this brings its own special set of challenges. One of these challenges was monitoring the existing city infrastructure to detect any ground movement caused by the tunneling operations. As a TBM (Tunnel Boring Machine) passes under the city, the earth around it moves, normally sinking by a small degree. This has to be closely monitored so that the engineers can react should there be a deviation from the expected movement. Certain assets were particularly sensitive, such as old protected buildings and other transport infrastructure.

Due to the scale of this task, an automated system was required to do most of the work. The team had installed a series of RTS (Robotic Total Stations) and attached tens of thousands of prisms. The RTS would automatically measure the distance to the prisms and send the data back over GPRS on a predefined schedule. This meant the engineers were always being fed live, up-to-date monitoring data for the city, which could be used to calculate any movement and trigger alarms if necessary.
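The movement check described above can be sketched in a few lines. This is an illustrative example only, not Crossrail's actual alarm logic: the prism IDs, coordinates, and the 5 mm threshold are all assumptions.

```python
import math

# Hypothetical alarm threshold: flag any prism that has moved more than
# this distance from its surveyed baseline position.
ALARM_THRESHOLD_MM = 5.0

def displacement_mm(baseline, reading):
    """Straight-line distance between two (x, y, z) points, assumed in metres."""
    return math.dist(baseline, reading) * 1000

def prisms_in_alarm(baselines, readings):
    """Return the IDs of prisms whose movement exceeds the alarm threshold."""
    return [pid for pid, pos in readings.items()
            if displacement_mm(baselines[pid], pos) > ALARM_THRESHOLD_MM]

# Illustrative data: P002 has sunk 8 mm, P001 only 2 mm.
baselines = {"P001": (0.0, 0.0, 0.0), "P002": (10.0, 0.0, 0.0)}
readings  = {"P001": (0.0, 0.0, -0.002), "P002": (10.0, 0.0, -0.008)}
print(prisms_in_alarm(baselines, readings))  # → ['P002']
```

In practice each structure would have its own trigger and alarm thresholds, but the principle is the same: compare the latest reading against the baseline and escalate when the deviation exceeds the expected movement.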


Crossrail had purchased software to ingest and analyze the data so that the engineers could view it on a map. This worked well until the sensor network grew to such an extent that the system could no longer process the amount of data it was fed in a timely manner. This meant the engineers were having to wait an increasing amount of time to view the most up-to-date data. It is worth mentioning that at its peak the Crossrail sensor network was the second largest in Europe; the only larger one was at the LHC at CERN.


Working with their team of geotechnical engineers and in-house developers, we were able to make a number of major improvements to their computing infrastructure and software performance.

  • Private Cloud

Crossrail were hosting all their own servers in a data center just outside of London, originally provisioned by the initial contractor. However, much of the hardware was already at end of life with another three years of the project left, so the decision was made to upgrade. We helped design and install a brand new, highly fault-tolerant private cloud setup.

  • Database Performance, Resilience and Scalability

We found a number of small improvements that could be made to the database to help performance. Most importantly, we were able to pool the database servers to provide high resilience and scalability, offloading heavy reads to a number of database replicas.
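The read-offloading pattern above can be sketched as a simple router: writes go to the primary, while read queries are spread round-robin across the replicas. This is a minimal illustration of the technique, not the actual configuration used; the connection names are placeholders.

```python
import itertools

class PooledDatabase:
    """Route SQL statements across a primary and a pool of read replicas."""

    def __init__(self, primary, replicas):
        self.primary = primary
        # Round-robin iterator spreads read load evenly across replicas.
        self._replicas = itertools.cycle(replicas)

    def connection_for(self, statement):
        """Send SELECTs to a replica; everything else must hit the primary."""
        if statement.lstrip().upper().startswith("SELECT"):
            return next(self._replicas)
        return self.primary

db = PooledDatabase("primary", ["replica-1", "replica-2"])
print(db.connection_for("SELECT * FROM readings"))       # → replica-1
print(db.connection_for("INSERT INTO readings VALUES ..."))  # → primary
```

A real deployment would add health checks and replication-lag awareness, since a replica can serve slightly stale reads, which is acceptable for map display but not for alarm state.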

  • Data Importing

One of the biggest bottlenecks we found in the existing system was its single-threaded approach to importing. The import mechanism was also susceptible to being blocked by a bad data file. We created a new import system using highly optimized methods of data conversion, which could be scaled horizontally so that during times of high demand new processes could be provisioned to meet it. As part of this new import system we also added full error handling should any bad files make it in.
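The two fixes described above, parallel workers and per-file error isolation, can be sketched together. This is a hedged illustration: `parse_file` stands in for the real data-conversion step, and a thread pool stands in for what would be separate processes or machines in a horizontally scaled deployment.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def parse_file(path):
    """Placeholder for the real conversion step; raises on a corrupt file."""
    if "bad" in path:
        raise ValueError(f"corrupt file: {path}")
    return f"imported {path}"

def import_files(paths, workers=4):
    """Import files in parallel; one bad file cannot block the rest."""
    ok, failed = [], []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(parse_file, p): p for p in paths}
        for fut in as_completed(futures):
            try:
                ok.append(fut.result())
            except Exception as exc:
                # Quarantine the bad file for later inspection and carry on.
                failed.append((futures[fut], str(exc)))
    return ok, failed

ok, failed = import_files(["a.csv", "bad.csv", "b.csv"])
print(len(ok), len(failed))  # → 2 1
```

The key design point is that a failure is recorded rather than propagated, so a single malformed sensor file no longer stalls the whole import queue.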


After these improvements, Crossrail was able to deliver near real-time data to their engineers. This meant that potential problems could be detected and dealt with faster, before they escalated. With the prospect of Crossrail 2, they will be able to take this optimized technology stack forward and use it straight off the bat.

  • About

Crossrail is a public-sector project to build a new 118-kilometer rail link across London, running from Reading in the west to Abbey Wood in the east, costing in excess of £15 billion and taking 10 years to complete.

  • Technologies Used