Aug 8, 2017

The value of advanced analytics in the construction sector

Critical Project Cost Accounting

The construction industry undertakes some of the biggest and most expensive projects on Earth. Huge amounts of resources and work go into major construction projects, which means that huge volumes of data are generated.

Number crunching has always been a big part of construction; a commonly heard phrase is that construction companies are accounting firms that happen to erect buildings. It is an industry in which material waste and lost productive days account for 35% of costs.

Predictive Analytics

Over the past few years, revolutionary advances in computing technology and the explosion of new digital data sources have expanded and reinvented the core disciplines of construction firms and managing agents. Today’s advanced analytics push far beyond the boundaries of traditional cost accounting.

Construction firms are starting to move into arenas such as real-time, cloud-powered analytics of large and unstructured datasets, including 2D and 3D data, financial data, documents, schedule elements and weather data. Modern analytics methods have the potential to redefine the traditionally fraught relationships between the interested parties: architects who want to unleash their creative energy, engineers who must make it all fit together and stay standing, and owners desperate to keep costs from spiralling out of control.

Building Information Modeling (BIM)

Now the combination of innovation, new applications and richer data sets means advanced analytics is emerging in all types of construction. The sector has lagged others such as financial services, but it is catching up in its adoption of predictive and optimisation models, delivering more accurate cost models and predictions, reducing materials and manpower waste, and shortening the pre-construction phase. From now on, the creative sourcing of data and the distinctiveness of analytics methods will be much greater sources of competitive advantage for constructors.

Weather Analytics

Constructors and planners have always looked at weather patterns to determine when, where and how to build whilst optimising energy efficiency and environmental impact. But at a time when climate change may be contributing to an increase in natural disasters, these companies are turning to weather analytics for greater insight, lowering build costs and reducing construction risk. Innovation in weather data has been long overdue and is now here.

Smart Buildings

Smarter buildings mean lower operating costs. Modelling real-time local climate data and feeding it to the HVAC (heating, ventilation and air conditioning) plant through the building management system gives dynamic control, spending the minimum amount of money to provide the desired comfort level in line with the occupancy pattern.

When further combined with signals from the electricity market dynamic power consumption can be deployed ensuring the smart building achieves the lowest possible energy costs and can generate revenue by selling load reductions back to the grid.
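The control logic this describes can be sketched very simply. The sketch below is illustrative only: the setpoints, occupancy threshold and curtailment price are invented assumptions, not values from any real building management system.

```python
# Toy control rule for a smart building: relax the HVAC setpoint when the
# building is near-empty, and shed load (selling the reduction back to the
# grid) when the live electricity price spikes. All constants are assumed.

COMFORT_SETPOINT_C = 21.0   # desired indoor temperature when occupied
SETBACK_C = 17.0            # relaxed setpoint when the building is empty
CURTAIL_PRICE = 0.30        # price (per kWh) above which we sell a load reduction

def hvac_decision(occupancy_fraction, price_per_kwh):
    """Return (setpoint_c, curtail) for one control interval."""
    # Relax the setpoint when few people are in the building.
    setpoint = COMFORT_SETPOINT_C if occupancy_fraction > 0.1 else SETBACK_C
    # When the grid price spikes and occupancy is low, shed load for revenue.
    curtail = price_per_kwh >= CURTAIL_PRICE and occupancy_fraction <= 0.5
    return setpoint, curtail

print(hvac_decision(0.8, 0.12))   # busy building, cheap power -> (21.0, False)
print(hvac_decision(0.05, 0.45))  # empty building, price spike -> (17.0, True)
```

A real system would of course blend weather forecasts, thermal inertia and the occupancy pattern rather than a single threshold, but the cost trade-off is the same.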

Data science across these multiple data sets, including government statistics and construction records going back over 150 years, allows companies to determine specific weather risk and combine it with the latest climate forecasts to deliver more accurate predictions.



Feb 10, 2018

A lesson from Carillion is the limitations of today’s financial statements, and the limitations of audit due to auditors’ lack of adoption of forensic analytics. Auditing firms have a responsibility to ensure financial statements give a true and fair view of the financial condition of a company, year by year. That way, all of the many stakeholders who rely on the financial statements – customers, suppliers, employees, investors – can make a timely assessment of the risks of dealing with it. Instead, accounting has become a game of financial hide and seek. Auditor data analytics needs to be adopted to deliver enhanced audit quality. There are different angles on what this means in practice, but audit quality has to become a common objective of auditors, regulators and standard-setters alike.

New Posts
  • Rail companies use data analytics daily to run everything from operations and maintenance to key business decisions based on intelligence from the collected data. In the last 10 years, the emergence of the digital railway has created a demand for Big Data techniques to respond to an overwhelming surge in data collection capability and speed. Largely, the UK rail sector has embraced this development and has begun implementing Big Data analysis and Data Science technologies to stay ahead of competitors and provide a better, modern service to customers while maintaining assets and efficiency.

The Digitisation of Rail

Historically, the rail industry has been criticised for a lack of innovation, falling behind the times in many areas and failing to capitalise on emerging technologies. Certainly, this has been improving in recent years. Today, companies collect vast amounts of data from rail stakeholders who provide intelligence via computer systems, the Internet of Things (IoT) and cloud computing, which constitutes the move towards ‘smart railways’. To ensure the data’s business potential is realised and optimised, companies analyse and work to unlock the potential of the collected data, as Porterbrook, the train leasing company, has done in the last year.

The digitalisation of rail also offers new services for customers. Mobile ticketing has undoubtedly revolutionised the travel experience, and new technologies such as Trainline’s voice-activated customer communication greatly enhance the way customers interact with companies. Undoubtedly, digital technology governs a great deal of rail customers’ behaviour, including purchasing, expectations, operator information and reservations. These leaps in digitisation have led to improved monitoring of assets, automation in operations and better customer interactions, and as the technology becomes readily available and cheaper, the digital railway nears reality.
Where Has the UK Rail Industry Been Failing?

While digitisation has provided the means for innovation and improvement of operations, recent reports indicate the performance of train services has not met expectations. Since the privatisation of Britain’s rail network, the government predicted that more competition between companies in the sector would lead to a better service. In fact, the opposite has been true: ticket prices have risen, and lateness hit a 13-year high in the UK in 2018. The issue is that while UK rail has certainly improved the maintenance and delivery of rolling stock assets, it is still well behind other countries in customer experience. Customers simply do not trust rail companies to deliver on time, and why should they, given recent statistics? According to the Office of Rail and Road, 86.3% of trains arrived on time in the UK. In comparison, the Spanish high-speed network achieves 98.5% punctuality. Dan Ascher’s BBC article suggests that future rail companies should be underpinned by punctuality and excellent customer experience.

Many argue the root of this problem is a reluctance to invest in smart technology combined with a lack of data science workers within rail companies. Indeed, it will require a change of mindset and business models, as well as significant financial investment in rail digitisation, to improve operations and rekindle customer faith. Improvements will also need to be made to customer interactions, utilising new technology to provide up-to-date, accurate and accessible information for passengers.

How Can Data Science Improve the Rail Industry?

Data scientists extract meaning from and interpret large quantities of data. In rail, this includes all the current and emerging data used by railways to help monitor infrastructure and equipment and optimise maintenance to improve safety.
“Big data analytics have the potential to influence several dimensions of the railway sector and can overcome organisational, operational and technical complexities, including economic and human effects and information handling.” – Professors D. Galar, U. Kumar and R. Karim at Luleå University of Technology.

Operations and maintenance are areas generating considerable excitement for rail companies, thanks to self-learning and smart systems that predict failure, make diagnoses and trigger maintenance actions. Companies looking to introduce smart technology systems require data scientists to build these predictive Machine Learning (ML) models and provide a service that helps uncover the potential of the data. ML models and data science services that predict delays across the network are popular because they can:

  • Help make savings on delay-repay compensation
  • Identify interventions to minimise disruption
  • Inform better train scheduling
  • Inform better maintenance scheduling
  • Improve public performance metrics

To utilise the insights from these models, the right platform needs to be matched to a company’s specific data sets and requirements. Data scientists and rail industry experts should collaborate to build the necessary algorithms and analysis tools to deliver the most effective solutions for the business’ needs. The UK rail leasing company Porterbrook recently consulted Elastacloud to realise these advantages. Read the full case study here. Follow us: @elastacloud
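To make the delay-prediction idea concrete, here is a minimal baseline of the kind such models improve on: predict the mean historical delay per route and departure hour. This is a sketch with invented route names and numbers, not Porterbrook’s or any operator’s actual model.

```python
# Toy baseline for network delay prediction: the prediction for a
# (route, departure hour) pair is the mean historical delay in that bucket.
# Real ML models would add weather, rolling stock and timetable features.
from collections import defaultdict

def fit_baseline(records):
    """records: iterable of (route, hour, delay_minutes) tuples."""
    totals = defaultdict(lambda: [0.0, 0])
    for route, hour, delay in records:
        bucket = totals[(route, hour)]
        bucket[0] += delay
        bucket[1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

def predict(model, route, hour, fallback=0.0):
    """Fall back to a network-wide default for unseen buckets."""
    return model.get((route, hour), fallback)

history = [
    ("LDS-MAN", 8, 6.0), ("LDS-MAN", 8, 10.0),  # hypothetical peak-hour runs
    ("LDS-MAN", 14, 1.0),                        # hypothetical off-peak run
]
model = fit_baseline(history)
print(predict(model, "LDS-MAN", 8))   # 8.0 minutes
print(predict(model, "LDS-MAN", 14))  # 1.0 minutes
```

Even a baseline like this is enough to inform the scheduling and delay-repay use cases listed above; the gains from ML come from beating it with richer features.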
  • Maintaining sufficient quantities of drinking water on board an underway vessel is a critical safety element for the crew members and general operation of the vessel. While sea water may be satisfactory for cleaning decks and flushing toilets, fresh water must be maintained for human consumption and power generation. Crew members face multiple challenges in optimising potable water quantity planning, including trip variables such as voyage length, speed, weather conditions, consumption rates, passenger volumes and fuel costs. When optimised, the savings can be substantial for ship operators. As an example, the weight and space created by eliminating excess water tanks can be used to store extra fuel or cargo, leading to significant savings per year. By analysing fleet onboard data sources such as ship type (e.g. bulk carrier, container ship), the expected number of crew and passengers, voyage length, anticipated speed and weather conditions combined with the onboard systems capability to generate potable water a prediction can be made for the volume of potable water required for the voyage. During the voyage, live onboard sensor data tracks actual water consumption, temperature patterns, and travel speed to adjust bunkering and docking locations as needed. When deviations occur, crew members receive early warning alerts and are rerouted to nearby distilling sites, ensuring the safety of all crew and passengers aboard. The machine learning algorithms continue to adapt and gain accuracy with each voyage.
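The pre-departure part of this planning can be sketched as a simple estimate: crew demand over the voyage, plus a safety margin, minus what the onboard watermaker can produce. The consumption rate, margin and distiller output below are illustrative assumptions, not figures from any real vessel.

```python
# Toy pre-departure estimate of potable water to bunker. All constants are
# assumed for illustration; a real planner would also factor in speed,
# weather and passenger profiles, and adjust from live sensor data en route.

LITRES_PER_PERSON_PER_DAY = 120.0  # assumed consumption rate
SAFETY_MARGIN = 0.20               # 20% reserve for weather and delays

def water_to_bunker(people, voyage_days, watermaker_litres_per_day=0.0):
    """Litres of potable water to load before departure."""
    demand = people * LITRES_PER_PERSON_PER_DAY * voyage_days
    demand *= 1.0 + SAFETY_MARGIN
    produced = watermaker_litres_per_day * voyage_days
    return max(demand - produced, 0.0)

# 20 crew, 10-day voyage, onboard distiller producing 1,500 L/day
print(water_to_bunker(20, 10, 1500.0))  # 13800.0 litres
```

The weight saved by not over-bunkering against the worst case is exactly the fuel-and-cargo saving the post describes; the ML element comes from learning better consumption rates per ship type and voyage.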
  • What if you need to present real-time data in a Power BI dashboard? In this post I will give you an overview of two different approaches to address the requirements for real-time reporting.

Power BI Streaming

The first and easier option is to create a streaming dataset in Power BI, and then write some small code to push data directly into this dataset. Notes and limitations:

  • With the “historic data analysis” option ON, data will be retained forever and available to reports and alerts from dashboard tiles. Rows can also be removed using the API.
  • With the “historic data analysis” option OFF, data will not be retained, and Power BI will keep up to 200k rows, FIFO.
  • POST requests are limited to 120 rows requests per minute.

Azure Stream Analytics

Another option is to use Azure Stream Analytics and ingest the data via an Event Hub. The output will be the Power BI dataset “cbiStreamAnalytics”.

And the code? Simple! A console C# application is the easiest way to test and push data to both datasets:

```csharp
// Push to Power BI (HttpPostAsync is a small helper around HttpClient)
var postToPowerBi = HttpPostAsync(measureTypeItem.powerBiPostUrl, "[" + jsonString + "]");

// Push to Azure Stream Analytics via the Event Hub
EventHubClient client = EventHubClient.CreateFromConnectionString(
    ConnectionStringStreamAnalytics, "cbistreaminghubpbi");
client.Send(new EventData(Encoding.UTF8.GetBytes(jsonString)));
```

Result

Here is an example of a Power BI dashboard showing real-time data from both datasets to compare latency.
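For the streaming-dataset route, the push itself is just an HTTP POST of a JSON array of row objects to the push URL Power BI generates for the dataset. Here is a minimal Python sketch of the same call; the URL and the row fields are placeholders you would replace with your own dataset’s.

```python
# Sketch of pushing rows to a Power BI streaming dataset's push URL.
# The push URL and the row schema below are placeholders; substitute the
# URL and column names Power BI generates for your own streaming dataset.
import json
import urllib.request

def build_payload(rows):
    """Power BI streaming datasets expect a JSON array of row objects."""
    return json.dumps(rows)

def push_rows(push_url, rows):
    """POST one batch of rows; returns the HTTP status code."""
    req = urllib.request.Request(
        push_url,
        data=build_payload(rows).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Build (but don't send) a one-row batch, matching the "[" + json + "]"
# wrapping in the C# snippet above.
print(build_payload([{"timestamp": "2018-02-10T12:00:00Z", "value": 42.0}]))
```

Remember the 120-requests-per-minute limit noted above: batch rows into each POST rather than sending them one at a time.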