ResponseVault Launches Autodesk Construction Cloud Integration

New integration streamlines custom form creation and workflow automation for construction projects

Conshohocken, PA – October 15, 2024 – ResponseVault, a leading provider of data collection and engineering tools for the construction industry, today announced a new integration with Autodesk Construction Cloud®, a portfolio of software and services that combines advanced technology, a builders network, and predictive insights for construction teams. This integration enables contractors to create custom form applications in ResponseVault that leverage project information from Autodesk® Build, improving the way construction teams manage and process critical project information.

The construction industry faces significant challenges in efficiently collecting, processing, and moving data between various systems and processes. ResponseVault’s integration with Autodesk Construction Cloud addresses these challenges by connecting ResponseVault’s rapid application development platform and Autodesk’s robust construction management solution.

Key Features

  • Custom Form Creation: Contractors can now build tailored forms using key project data, such as project name and site address, from Autodesk Construction Cloud to power dropdowns, tables, custom formulas, and validations.
  • AI-Powered Setup: Utilize artificial intelligence to convert complex forms from PDF, images, and Excel, significantly reducing setup time and costs.
  • Automated Workflows: Submitted forms can trigger automated approval processes, notifications, and PDF file uploads directly into Autodesk Build, Autodesk Docs, or BIM 360®.
  • Rich Media Capture: Easily incorporate photos, videos, signatures, and barcodes into forms for comprehensive documentation.

“With this integration, we’ve prioritized speed above all else: speed of setup, and speed of form completion in the field,” said Matt Monihan, President of ResponseVault. “By combining ResponseVault’s specialized form-building capabilities with Autodesk Construction Cloud’s comprehensive solutions, we’re empowering construction teams to streamline their data collection and processing like never before.”

The ResponseVault form builder is ideal for a wide range of construction management tasks, including Job Hazard Analyses, inspections, checklists, permits, and weekly progress reports.

“ResponseVault enhances collaboration among key stakeholders by making it easy to capture and share project data,” said James Cook, director of industry and technology partnerships at Autodesk. “This new integration further enhances team collaboration by incorporating key project details from Autodesk Construction Cloud and sending completed reports back, so builders can link them to the key construction processes.”

This integration marks a significant step forward in construction project management, offering unprecedented flexibility and efficiency in data collection and processing.

For more information about ResponseVault and its integration with Autodesk Construction Cloud, visit ResponseVault.com, or the Autodesk App Store.

About ResponseVault

ResponseVault makes data collection and data engineering tools for the construction industry. With a rapid application development approach, ResponseVault enables construction teams to create, test, and deploy data collection tools immediately, significantly reducing setup time and increasing project efficiency. ResponseVault’s suite of products empowers construction professionals to harness the power of data engineering, including ETL processes and business intelligence, without the need for specialized IT support.

Autodesk, Autodesk Construction Cloud, and BIM 360 are registered trademarks or trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries. All other brand names, product names, or trademarks belong to their respective holders.

Choosing Your Beginner Construction Analytics Stack: A Comprehensive Guide

In today’s data-driven construction industry, leveraging analytics can be a game-changer for improving project outcomes, increasing efficiency, and making informed decisions. As a beginner in construction analytics, building your initial stack might seem daunting, but with the right approach and tools, you can set a solid foundation for data-driven success. This guide will walk you through the process of choosing and setting up your beginner construction analytics stack, focusing on readily available tools and best practices.

Leveraging Existing Resources: Excel and Power BI

Before diving into new tools, it’s crucial to recognize the power of what you might already have at your disposal. If your organization uses Office 365, you’re in luck – Excel and Power BI are included in your subscription, providing a robust starting point for your analytics journey.

Excel: The Swiss Army Knife of Data Analysis
Excel remains one of the most versatile tools for data manipulation and basic analysis. Its familiar interface and wide range of functions make it an excellent starting point for:

  • Data cleaning and preparation
  • Basic data visualization
  • Quick calculations and pivot tables
  • Ad-hoc analysis and reporting

While Excel has limitations for handling large datasets or complex analytics, it’s an invaluable tool for getting started and performing initial data explorations.
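
Much of that Excel-style work can also be reproduced in plain Python, which is handy once a dataset outgrows a worksheet. As a rough sketch of the pivot-table idea above, using invented project and cost figures:

```python
import csv
import io
from collections import defaultdict

# Sample daily cost log, standing in for a CSV exported from Excel.
raw = """project,phase,cost
Tower A,Foundation,12500
Tower A,Framing,8200
Tower B,Foundation,9900
Tower A,Foundation,4300
Tower B,Framing,7600
"""

# Pivot: total cost per project per phase, like an Excel pivot table.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    totals[(row["project"], row["phase"])] += float(row["cost"])

for (project, phase), cost in sorted(totals.items()):
    print(f"{project} / {phase}: ${cost:,.0f}")
```

The same grouping logic carries over directly to Power Query or SQL once you move beyond spreadsheets.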

Power BI: Your Gateway to Advanced Analytics
Power BI takes data analysis to the next level, offering:

  • Advanced data visualization capabilities
  • Data modeling and relationship mapping
  • Real-time dashboards
  • Integration with various data sources
  • Sharing and collaboration features

As a beginner, Power BI might seem overwhelming at first, but its integration with Excel and its user-friendly interface make it an ideal next step in your analytics journey.

Identifying Your Systems of Record

Before you can analyze data, you need to know where it’s coming from. In construction, several key systems typically serve as the primary sources of valuable data:

Enterprise Resource Planning (ERP) System

Your ERP system is the backbone of your organization’s data infrastructure. It typically handles:

  • Financial data
  • Resource allocation
  • Inventory management
  • Procurement information

Identifying your ERP system and understanding its data structure is crucial for integrating it into your analytics stack.

Customer Relationship Management (CRM) System

While not always associated with construction, a CRM system can provide valuable insights into:

  • Client interactions and history
  • Sales pipeline and forecasting
  • Marketing campaign effectiveness
  • Customer satisfaction metrics

Project Management System

This is where the rubber meets the road in construction analytics. Your project management system likely contains:

  • Schedule data
  • Resource allocation
  • Task progress and milestones
  • Budget vs. actual cost tracking
  • Change orders and RFIs

Safety Compliance Platform

Safety is paramount in construction, and your safety compliance platform holds critical data on:

  • Incident reports
  • Safety training records
  • Compliance documentation
  • Risk assessments

Understanding these systems and the data they contain is essential for building a comprehensive analytics strategy.

Setting Up Your Data Infrastructure

With your data sources identified, the next step is to create a centralized repository for all this information. This is where a data lake or data warehouse comes into play.

Choosing Between Azure and Neon

For beginners, two popular options for setting up a cloud-based data infrastructure are Microsoft Azure and Neon.

Azure Data Lake Storage:

  • Seamless integration with other Microsoft tools
  • Scalable and secure
  • Supports various data types and formats
  • Offers advanced analytics capabilities

Neon (PostgreSQL Database):

  • Open-source and cost-effective
  • Serverless architecture for easy scaling
  • Familiar relational model suited to reporting workloads
  • Supports SQL queries for data analysis

The choice between Azure and Neon often depends on your existing infrastructure, budget, and specific needs. Azure might be the natural choice if you’re already heavily invested in the Microsoft ecosystem, while Neon could be more appealing if you’re looking for a lightweight, cost-effective solution.

Setting Up Your Data Lake/Warehouse

Regardless of your choice, the process of setting up your data lake or warehouse typically involves:

  1. Creating your cloud account and configuring security settings
  2. Designing your data structure (schema for a warehouse, or folder hierarchy for a lake)
  3. Setting up data ingestion processes from your systems of record
  4. Implementing data transformation and cleaning procedures
  5. Establishing data governance policies

This step might require assistance from IT professionals or data engineers, especially if you’re dealing with large volumes of data or complex integrations.
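
For the folder-hierarchy option in step 2, a common convention is to partition raw exports by source system and load date. Here is a minimal sketch of that landing step; the source name and sample record are invented for illustration:

```python
import json
import tempfile
from datetime import date
from pathlib import Path

def land_raw_export(lake_root: Path, source: str, records: list, load_date: date) -> Path:
    """Write one raw export into a source/date-partitioned data lake layout."""
    target_dir = lake_root / "raw" / source / load_date.isoformat()
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / "export.json"
    target.write_text(json.dumps(records, indent=2))
    return target

# Demo with a temporary directory standing in for cloud storage.
lake = Path(tempfile.mkdtemp())
path = land_raw_export(
    lake,
    "project_mgmt",  # hypothetical source-system name
    [{"task": "Pour slab", "pct_complete": 40}],
    date(2024, 10, 15),
)
print(path)
```

Keeping raw exports untouched in a layout like this makes it easy to re-run downstream transformations later without re-extracting from the source systems.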

Connecting to Power BI

With your data centralized in a lake or warehouse, the final step is connecting it to Power BI for analysis and visualization. Power BI offers native connectors for both Azure and PostgreSQL databases, making this process relatively straightforward:

  1. Open Power BI Desktop
  2. Click on “Get Data” and select your data source (Azure or PostgreSQL)
  3. Enter your connection details and credentials
  4. Select the tables or data you want to import
  5. Use Power Query to clean and transform your data as needed
  6. Create relationships between different data tables in the model view
  7. Start building your visualizations and dashboards

Best Practices for Beginners

As you embark on your construction analytics journey, keep these best practices in mind:

Start Small: Begin with a specific problem or question you want to solve. This focused approach will help you learn the tools without getting overwhelmed.

Prioritize Data Quality: Ensure your data is clean, consistent, and accurate. Poor data quality can lead to misleading insights.

Collaborate Across Departments: Work with project managers, finance teams, and field personnel to understand their data needs and challenges.

Invest in Learning: Take advantage of free resources, tutorials, and courses to improve your skills in Excel, Power BI, and data analysis.

Iterate and Improve: Your first attempts at analytics might not be perfect, and that’s okay. Continuously seek feedback and refine your approach.

Consider Data Security: Construction data often includes sensitive information. Ensure you’re following best practices for data security and compliance.

Conclusion

Building your beginner construction analytics stack doesn’t have to be complicated or expensive. By leveraging existing tools like Excel and Power BI, identifying your key data sources, setting up a centralized data repository, and following best practices, you can create a powerful foundation for data-driven decision-making in your construction business.

Remember, the goal is not to become a data scientist overnight, but to start harnessing the power of your data to improve project outcomes, increase efficiency, and gain a competitive edge. As you grow more comfortable with these tools and processes, you can expand your analytics capabilities, potentially incorporating more advanced techniques like machine learning and predictive analytics.

The construction industry is evolving, and those who can effectively leverage data will be best positioned for success. By taking these first steps in building your analytics stack, you’re not just improving your current operations – you’re future-proofing your business for the data-driven era of construction.

Data Engineering for Construction: Unlocking the Power of Your Information


As a construction company managing multiple SaaS tools for project management, scheduling, job site safety, and time and materials reporting, you’re sitting on a goldmine of data. But are you making the most of it? This is where data engineering comes in.

What is Data Engineering?

Data engineering is the practice of designing, building, and maintaining the systems and infrastructure that collect, store, and process data at scale. It’s the foundation that enables effective data analysis and business intelligence.

Why Does It Matter for Construction?

  1. Integration of Multiple Data Sources: Your various SaaS tools are likely generating data in different formats and storing it in separate systems. Data engineering can help you bring all this information together, giving you a complete picture of your operations.
  2. Real-Time Insights: With proper data engineering, you can transform raw data from your job sites into real-time insights, allowing for quicker decision-making and problem-solving.
  3. Predictive Analytics: By centralizing and structuring your data, you open the door to advanced analytics. This could help predict project delays, safety risks, or cost overruns before they occur.
  4. Data Quality and Consistency: Data engineering ensures that the information flowing through your systems is accurate, consistent, and reliable, which is crucial for making informed decisions.
  5. Scalability: As your company grows and takes on more projects, data engineering provides the framework to handle increasing volumes of data without sacrificing performance.

Key Components of Data Engineering

  1. Data Pipeline Development: Creating automated processes to extract data from your SaaS tools, transform it into a usable format, and load it into a central repository.
  2. Data Warehousing: Designing and implementing a centralized storage solution optimized for analysis and reporting.
  3. Data Modeling: Structuring your data in a way that reflects your business processes and enables efficient querying.
  4. ETL (Extract, Transform, Load) Processes: Developing routines to regularly update your data warehouse with the latest information from your various tools.
  5. Data Governance: Establishing policies and procedures to ensure data quality, security, and compliance with regulations.
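
To make the pipeline and ETL components above concrete, here is a toy extract-transform-load run using Python's built-in SQLite as a stand-in for a real warehouse. The table name and sample rows are hypothetical:

```python
import csv
import io
import sqlite3

# Extract: a CSV export from a field-reporting tool (sample data).
raw = """date,crew,hours
2024-10-01,Concrete,42
2024-10-01,Electrical,35.5
2024-10-02,Concrete,38
"""
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize crew names and convert types.
clean = [(r["date"], r["crew"].lower(), float(r["hours"])) for r in rows]

# Load: append into a warehouse table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE labor_hours (work_date TEXT, crew TEXT, hours REAL)")
db.executemany("INSERT INTO labor_hours VALUES (?, ?, ?)", clean)

# A downstream report query: total hours per crew.
for crew, total in db.execute(
    "SELECT crew, SUM(hours) FROM labor_hours GROUP BY crew ORDER BY crew"
):
    print(crew, total)
```

A production pipeline would point at a hosted PostgreSQL or cloud warehouse and run on a schedule, but the extract, transform, and load stages keep exactly this shape.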

Getting Started

Implementing a data engineering strategy doesn’t have to be overwhelming. Start by:

  1. Assessing your current data landscape
  2. Identifying key data sources and potential integration points
  3. Defining your data goals and requirements
  4. Exploring data engineering tools and technologies that fit your needs
  5. Considering partnering with data engineering experts to develop a tailored solution

By embracing data engineering, your construction company can turn the wealth of information generated by your various SaaS tools into a powerful asset, driving efficiency, safety, and profitability across your operations.

The Impact of AI on Construction: Pros and Cons Revealed

As artificial intelligence continues to transform industries across the board, the construction sector is no exception. We’ve been exploring AI’s potential to revolutionize our workflows. But like any powerful tool, AI comes with both benefits and challenges. Let’s break down some key pros and cons we’ve encountered:

Pros

Document Intelligence

By leveraging AI with Retrieval-Augmented Generation (RAG) on our SharePoint sites, we’ve unlocked a treasure trove of institutional knowledge. AI can now quickly sift through years of project reports, safety guidelines, and best practices to provide relevant insights on demand.

Predictive Maintenance

AI algorithms enable predictive maintenance by analyzing historical data and real-time performance metrics to flag potential equipment failures before they occur. That minimizes unplanned downtime, optimizes operational efficiency, and improves safety by catching hazards on job sites before machinery breaks down.

Design Optimization

Generative AI helps our architects and engineers explore design alternatives, optimizing not only for cost, sustainability, and structural integrity but also for energy efficiency, aesthetics, and user experience. It lets us push the boundaries of creativity and efficiency in pursuit of forward-thinking, sustainable designs.

Cons

Data Hallucinations

We’ve encountered instances where AI confidently presents false information as fact. This “hallucination” problem requires constant human oversight to catch and correct, along with robust fact-checking mechanisms from AI developers. As the technology advances, the need for human validation of AI output only becomes more evident; finding the right balance between autonomous AI capabilities and human intervention is key to deploying these systems responsibly.

Inconsistent Calculations

While AI excels at processing vast amounts of data, we’ve found its mathematical accuracy can be surprisingly unreliable for critical structural calculations. Structural work demands both mathematical precision and an appreciation of real-world consequences, so until the algorithms are refined for these specialized tasks, we don’t rely on them in critical scenarios.

Over-reliance Risks

There’s a danger of team members becoming too dependent on AI, potentially eroding crucial problem-solving skills and domain expertise. As AI automates more decision-making, analysis, and creative problem-solving, excessive reliance on it can dull critical thinking, innovation, and hands-on experience. Organizations need to balance AI-driven efficiency against preserving the cognitive capabilities and specialized knowledge of their teams.

Implementation Challenges

Integrating AI systems with our existing software stack, and training our workforce to use them effectively, has been a significant undertaking. It has meant restructuring internal processes, running specialized training programs, and fostering a culture of continuous learning. We’re now seeing AI’s impact on our operations and decision-making, but getting here was far from plug-and-play.

The Path Forward

Despite these challenges, we believe AI’s potential in construction is immense. The key lies in thoughtful implementation:

  • Establish clear processes for human verification of AI outputs
  • Invest in ongoing AI literacy training for all employees
  • Collaborate with AI providers to address industry-specific needs
  • Use AI as a complement to, not a replacement for, human expertise

By embracing AI’s strengths while mitigating its weaknesses, we can build safer, more efficient, and more innovative projects for our clients.

In an upcoming post, we’ll walk through the exact strategies we used to overcome the challenges presented here. Stay tuned.

What has your experience been with AI in construction? I’d love to hear your thoughts in the comments.

How One Contractor Uses NYC Department of Buildings (DOB) Permit Requirements to Drive Inspections in the Field with Procore

On your job site, especially if you’re in New York City, you’ll need to ensure your team complies with all of the requirements listed in each trade’s work permit. But getting the status of any given permit on a job is a cumbersome process, and keeping track of what’s left to address is even more challenging. It takes a highly diligent superintendent or project manager to stay on top of the requirements at any point in the job and communicate them to the field.

Below is how an NYC contractor used data integration to save several hours per week of coordination and data entry.

Challenge

The contractor was looking to reduce the amount of data entry required each week to pull updated permit status and required inspection items, and then relay that data to the field. Before this solution came online, their process was:

  • Check the NYC DOB NOW website for permit status and required items.
  • Create a new inspection template for each project in Procore and add the required items for each permit as sections.
  • Communicate with the field to complete the inspection.
  • If any of the required items changed, repeat the process.
Solution

The contractor worked with data engineering firm ResponseVault not only to create the integration between the NYC DOB NOW website and Procore Inspections, but also to build a management screen for every project.

The integration loads the DOB’s database of permit information into a data warehouse, then lets the contractor’s team choose which permits they want to track per project. When the team is ready, the system creates an inspection for the project in Procore, including each permit and a line item for each open requirement for the field to respond to.

How does this work?

First, the integration team uses the NYC Open Data Portal to periodically extract and load permit data from the DOB into a PostgreSQL data warehouse. This process runs each day, since the dataset is updated daily. Two columns (fields) contain the information we’re looking for in each permit: Special Inspection Requirement and Progress Inspection Requirements. The data in those columns is a comma-separated list of requirements for the permit to move forward. A typical value may look like “Fire-Resistant Penetrations and Joints, High-Pressure Fuel-Gas Piping (Welding).”

ResponseVault takes the data in its raw form, transforms each requirement into a separate record, and presents the contractor’s team with a management interface that includes the project information, a list of each of the permits relevant to that project, and a list of all requirements for each permit.
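
That transform step, turning each comma-separated requirement string into separate records, can be sketched in a few lines. The permit number below is invented, and the field value is the example from the paragraph above:

```python
def explode_requirements(permit_no: str, requirement_field: str) -> list:
    """Split a comma-separated requirements string into one record per requirement."""
    return [
        {"permit": permit_no, "requirement": item.strip()}
        for item in requirement_field.split(",")
        if item.strip()
    ]

records = explode_requirements(
    "M00123456",  # hypothetical permit number
    "Fire-Resistant Penetrations and Joints, High-Pressure Fuel-Gas Piping (Welding)",
)
for rec in records:
    print(rec)
```

One caveat with a naive comma split: a requirement whose own text contains a comma would be cut in two, so a production pipeline would want to validate results against a known list of requirement names.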

As the job is in progress, the contractor’s staff in the office can trigger a workflow that creates a customized Procore inspection with sections for each permit and the requirements as entries. The inspection is unique to the project and generated through an automated process that leverages the information in the data warehouse.

The inspection, with up-to-date permit data from the data warehouse and attachments uploaded by office staff, is now ready to be completed by the field. With this integration, the contractor reduced data entry and communication time with the field by 60% weekly.

If you’re looking to integrate permit information or other project data with your project management system, contact the ResponseVault team.

Why Start a Career in the Construction Industry as a Data Engineer?

The biggest draw is simply that, because modernization has lagged, there’s a ton of opportunities that are straightforward.

This post originally appeared on the ConstructionDataJobs.com Blog.

I’m a data engineer myself, and many of my colleagues are surprised that I work almost exclusively in the construction industry. They think, “Isn’t that industry stuck in the past? I bet you can’t get much done.” And while it is true that construction is among the least digitized industries, that is changing, fast.

When I was approached by my first customers in the industry to modernize their workflows, I was struck by a number of factors that made me decide to dedicate my time to helping construction companies exclusively.

1. There’s a lot of low-hanging fruit

The biggest draw is simply that, because modernization has lagged, there’s a ton of opportunities that are straightforward, even easy, for someone with a data engineering skill set. I look for any process that still lives on paper, in spreadsheets, or in dedicated software tools that have seen better days. The terms “data pipeline” and “data warehouse” are not ubiquitous in this industry, but many of the customers I work with are Excel masters, and they jump at the opportunity to pull up-to-date data from project management and scheduling software into their models.

2. Despite the industry lagging in technology adoption, that is changing rapidly

Yes, there’s low-hanging fruit for data engineers, and on top of that, there’s a young workforce with much higher expectations for user experience and data capabilities. This is the generation that grew up with social media, now moving into management positions at major companies and thinking, “Why is our technology so terrible, or nonexistent?” That is about to change, because a large group of motivated managers can now allocate budget to move their companies to the next level.

3. There is a big opportunity to design your work environment

So there’s obviously a lot of pent-up demand, but at the same time there’s a shortage of qualified and motivated data engineers. This means your skills are highly valued, and you have many options for employment: full-time, part-time, and consulting gigs all exist, and you can choose those that fit your lifestyle goals and ambition. There’s never been a better time to jump into this industry, and it’s only going to get better.

4. The industry is going to space

I say this semi-seriously. Space exploration has had a renaissance recently, and as more people go to space, more things will have to be built and sent there, constructed in a zero-G environment, or even built in space and sent home. If that prediction comes true, there will be high demand for robust, scalable data pipelines that connect systems both terrestrial and beyond. I’m looking forward to it!

Get out there and help this industry build the future!

For a list of data engineering jobs in the construction industry, check out ConstructionDataJobs.com.

3 Approaches to Data Engineering in the Construction Industry

For many firms, there are more questions than answers on how to begin analyzing and using data. A good place to start is understanding where you are in this process. Here’s a high-level look at three stages of progression most firms go through on the journey toward data analysis, which can serve as a road map to help you move forward.

Understanding where you’re at in capturing and using data is a good way to build a road map to the future.

Two years ago, the president of Primus Builders knew he had a data-analysis problem, but he had no idea how big that problem actually was. All he really knew was that none of his tech solutions were “talking” to one another, leaving him with islands of data and redundant workarounds. Then he called in an expert and realized his problem was much worse.

That’s because Primus Builders was where a lot of construction firms are on their data-analysis journey: siloed point solutions. In practice, that meant the company was wasting money on technology it no longer used, and hadn’t for years, without anyone realizing it. It also meant workers were duplicating efforts to produce reports, but the information in those reports often conflicted because there was no single source of truth. Finally, it meant the company was at risk of hacking, ransomware attacks, and other security breaches.

“The problem that a lot of people have is how do I get these applications to talk? How do I take my financial data and combine it with my schedule and my project management? There’s not really a clean way to do it,” said John Robba, chief technology officer for Primus Builders. “But collecting data is one thing. The real question is, what do you do with it? And that’s usually the hardest thing for companies to answer. So, the first thing is knowing what’s the goal you’re trying to accomplish? Then you can back into the details, and it will be very apparent if you selected the right partnerships and have the right internal resources to pull that off.”

Before any of that happens, though, Robba and others say it’s important to have a clear understanding of where your company is in its data-analysis journey, and how to move forward.

But for commercial construction firms looking to start a data-analysis journey, and capitalize on the promise of data analytics, the path is anything but clear. FMI reports that 96% of data collected in the engineering and construction industry goes unused.

That’s a big mistake.

“The organizations that take the time to gather the data, analyze it and turn it into actionable insights will gain a competitive advantage. The ones that bury their heads in the sand and hope it goes away will be quickly left behind,” warn the FMI report experts.


      Understanding Three Basic Approaches to Data Analysis — and Where You’re at on Your Journey

      Step 1: Data Collected in siloed point solutions

      What it looks like: Companies in this scenario typically use multiple apps and tech tools to gather data. Although processes are being digitized, the apps and tools don’t integrate with one another, and thus, data becomes siloed in the various solutions. That means that to get a report, someone has to manually go into the different tools or apps and export the data. In fact, 42% of companies use four to six apps on their construction jobs, and 27% report that none of those apps integrate with one another. As a result, data is transferred manually nearly 50% of the time, according to the 2020 JBKnowledge Construction Technology Report.

      Why it creates problems: It’s not just the siloing and manual data transfer that are problems. Using multiple apps and tools also means workers have to sign in and navigate multiple passwords, interfaces and hardware. Because of that friction, workers sometimes simply don’t use the tools at all. Even if there is consistent use of the tech tools, someone ends up spending a lot of time trying to figure out if the data is consistent and updated. Answering the question, “What happened on the job site today?” becomes a data wrangling challenge, instead of a quick and easy answer.

      How to move forward: Rather than manually pulling data from individual apps to run reports, companies can begin using a data warehouse or data lake to pull in data from the various apps and tools, which can then be used to run reports. This data integration can be almost instantaneous and automatic. But these products and services can also be pricey. Another option for firms with capable IT departments is to create an in-house data warehouse using services such as Amazon Web Services. Either way, companies can start small with incremental steps and grow from there.
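A minimal sketch of that first incremental step, using Python's built-in sqlite3 as a stand-in for a real warehouse. The app names, job IDs and hours below are invented for illustration; in practice the rows would come from each point solution's export or API:

```python
import sqlite3

# Hypothetical exports from two siloed point solutions:
# a field app and an office timesheet tool.
field_app_rows = [
    {"job_id": "J-101", "date": "2024-05-01", "hours": 8.0, "source": "field_app"},
]
office_rows = [
    {"job_id": "J-101", "date": "2024-05-01", "hours": 7.5, "source": "timesheet"},
]

# In-memory database as a stand-in for a warehouse or data lake.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE labor_hours (job_id TEXT, date TEXT, hours REAL, source TEXT)"
)
for row in field_app_rows + office_rows:
    conn.execute(
        "INSERT INTO labor_hours VALUES (:job_id, :date, :hours, :source)", row
    )

# Once the data lands in one place, one query answers
# "what happened on the job site today?"
total = conn.execute(
    "SELECT SUM(hours) FROM labor_hours WHERE job_id = 'J-101'"
).fetchone()[0]
print(total)  # 15.5
```

The point is not the tooling but the shape of the work: every app's data lands in one queryable store, so reports stop requiring manual exports.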

      Step 2: Arguments over conflicting data: “Who’s right?”

      What it looks like: Just as it sounds, this scenario finds two or more workers arguing over whose data is correct. For example, the field might have a different data point for hours worked than the office. Companies in this stage often have started to integrate and automate data capture, but they haven’t yet developed systems and approaches to ensure a single source of truth.
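As a hypothetical illustration, a simple automated reconciliation pass can surface field/office disagreements before they become arguments. The job IDs, hours and tolerance below are invented:

```python
# Flag jobs where field-reported and office-reported hours disagree
# beyond a tolerance, instead of leaving "who's right?" to a shouting match.
def reconcile(field_hours, office_hours, tolerance=0.25):
    """Return job_ids whose field and office hours disagree beyond tolerance."""
    discrepancies = {}
    for job_id, f in field_hours.items():
        o = office_hours.get(job_id)
        if o is not None and abs(f - o) > tolerance:
            discrepancies[job_id] = {"field": f, "office": o}
    return discrepancies

field = {"J-101": 8.0, "J-102": 6.0}
office = {"J-101": 7.5, "J-102": 6.0}
print(reconcile(field, office))  # {'J-101': {'field': 8.0, 'office': 7.5}}
```

A check like this doesn't say which number is correct; it simply makes the conflict visible so the single-source-of-truth process can resolve it.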

Why it creates problems: Without that single source of truth, trust in the system is lost, and buy-in — from executives in the C-suite all the way down to the field workers — is lost with it. When that happens, even the best tech tools and integration won’t produce useful data because people aren’t taking the time to input it. That’s more than just academic: poor project data and miscommunication cause fully 52% of rework, which added up to $31.3 billion of rework in the U.S. alone in 2018, according to an FMI/Autodesk report.

How to move forward: The solution to this problem is metadata — data that describes other data. In practice, this means providing definitions for data in the user interface or dashboard. For example, what exactly counts as “hours worked”? The question may seem obvious, but without clear definitions, workers make assumptions that corrupt the data. Once data is defined with metadata, companies should implement a data-documentation practice that records where each data point comes from and how it gets input. This practice is known as data lineage: it covers a data point’s origin, what happens to it and where it moves over time, providing visibility into the data and making it far easier to trace errors back to their root cause.
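One way to picture metadata and lineage together, sketched in Python: a metric carries both its definition (the metadata) and an append-only trail of where it has been (the lineage). The metric name, definition text and pipeline steps below are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MetricDefinition:
    name: str
    definition: str  # the metadata: what this number means, stated once
    lineage: list = field(default_factory=list)  # where it has been

    def record_step(self, step: str):
        """Append a capture/transform/load step to the lineage trail."""
        self.lineage.append(step)

hours_worked = MetricDefinition(
    name="hours_worked",
    definition="On-site hours between clock-in and clock-out, minus unpaid breaks",
)
hours_worked.record_step("captured: field app clock-in/clock-out")
hours_worked.record_step("transformed: unpaid breaks subtracted")
hours_worked.record_step("loaded: warehouse table labor_hours")

# Tracing an error back to its root cause is a walk through the trail:
print(" -> ".join(hours_worked.lineage))
```

With the definition attached to the metric itself, the field and the office are at least arguing about the same number; with the lineage trail, a wrong number can be traced to the step that broke it.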

      Step 3: Enterprise-wide data accessibility

What it looks like: Companies in this phase of the data-analysis journey can run data reports and view data dashboards quickly and easily. The data is secure and reliable, and there is a trusted single source of truth. That in turn allows companies to use tools they otherwise couldn’t, because all the inputs are readily available. For example, with BIM, changes can be made in a 3D model using real-time job site data. That means if an architect or owner wants to make a change, a cost analysis can be done in real time and reflect how the project is actually progressing at that moment. And most importantly, there is a clear process for resolving discrepancies: the best companies combine documentation, monitoring tools and a dedicated support team to maintain data accuracy and to keep stakeholders from making judgments based on inaccurate data.

Why it works: Because these companies see real-time data, they can use that insight to make actionable decisions about day-to-day operations that lead to greater efficiency and profitability. Questions that used to take hours or days to answer can now be cleared up in seconds. This ability allows companies to be more aggressive and creative with different building scenarios that lead to still greater efficiency and cost savings. More time gets spent on high-value efforts rather than trying to find and massage data. Everyone in the organization is able to confidently and accurately answer the question, “What happened on the job site today?” The benefits of being able to answer that question (and others like it) are obvious to most contractors. In fact, contractors believe that advanced technologies can increase productivity (78%), improve schedules (75%) and enhance safety (79%), according to the 2019 USG + U.S. Chamber of Commerce Commercial Construction Index.

      How to fully capitalize on it: Companies that have moved into this realm haven’t had to reinvent the wheel. Most other industries have made headway on how to move to enterprise-wide data accessibility, and a number of tools, services and products exist to help. Like Primus Builders, these companies have also prioritized data analysis by hiring experts to help them untangle the different apps and tools they use and develop a cohesive data-analysis strategy. No one tool or hire will solve this problem; it has to be an organization-wide initiative that plays to everyone’s strengths.

      Moving Forward on Your Journey

      Clearly, enterprise-wide data accessibility could benefit all contractors. But getting there requires a new way of thinking about building — and knowing when and from whom to ask for help along the way. The good news is that contractors don’t have to figure it all out immediately. They can move in the direction of enterprise-wide data accessibility in incremental steps. They can also look to other industries and experts for support and guidance.

      The most important thing, though, is to start moving in the direction of enterprise-wide data accessibility — before it’s too late. Some topics to research moving forward:

      • Hiring a data team.
      • Understanding where data came from with data lineage.
      • Understanding why data may appear inaccurate with data observability.
      • Understanding how data is defined with a data dictionary.
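As a rough sketch of the data-observability item above, a basic check might flag a dataset as suspect when it is stale or has too many missing values. The thresholds, field names and rows below are invented for illustration:

```python
from datetime import date

def check_dataset(rows, last_updated, max_age_days=1, max_null_rate=0.05):
    """Return a list of observability issues found in a dataset."""
    issues = []
    # Freshness: has the data been updated recently enough to trust?
    age = (date.today() - last_updated).days
    if age > max_age_days:
        issues.append(f"stale: last updated {age} days ago")
    # Completeness: are too many rows missing the key field?
    nulls = sum(1 for r in rows if r.get("hours") is None)
    if rows and nulls / len(rows) > max_null_rate:
        issues.append(f"incomplete: {nulls}/{len(rows)} rows missing hours")
    return issues

rows = [{"hours": 8.0}, {"hours": None}, {"hours": 7.5}]
print(check_dataset(rows, last_updated=date.today()))
# ['incomplete: 1/3 rows missing hours']
```

Checks like this explain *why* data may appear inaccurate — stale, incomplete, or both — before anyone acts on it.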

      “We’ve seen many customers come in with pieces of the puzzle, but they need help with staffing and tooling to put the full picture together,” explained Matt Monihan, CEO of ResponseVault, a data engineering firm that specializes in the construction industry.

Ultimately, experts like Robba warned that getting the full picture is becoming more and more vital — not only to remaining competitive, but to surviving.

      “Wake up, because you’re already living the nightmare. You’re already in the weeds, you just don’t know how tall the weeds are or even where the road is, or how far it is to the road to get out of the weeds,” Robba said. “Most people running organizations are going a million miles an hour. And data analytics is one of those things where you have to have the expertise. And then you actually have to take the time to sit down and go, ‘Yes, I’m going to focus on this and put an action plan together.’ But the bottom line is, you need to invest in your future now, because your competition definitely is.”

      Construction Data Engineer Job Description 2021

      Job Overview

      We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

      Responsibilities for Data Engineer

• Create and maintain optimal data pipeline architecture.
      • Assemble large, complex data sets that meet functional / non-functional business requirements.
      • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
      • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
      • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
      • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
      • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
      • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
      • Work with data and analytics experts to strive for greater functionality in our data systems.

      Qualifications for Data Engineer

• Advanced working knowledge of SQL, including query authoring, plus experience with relational databases and working familiarity with a variety of database systems.
      • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
      • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
      • Strong analytic skills related to working with unstructured datasets.
• Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
      • A successful history of manipulating, processing and extracting value from large disconnected datasets.
      • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
      • Strong project management and organizational skills.
      • Experience supporting and working with cross-functional teams in a dynamic environment.
      • 5+ years of experience in a Data Engineer role, including experience with the following software/tools:
        • Experience with big data tools: Hadoop, Spark, Kafka, etc.
        • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
        • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
        • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
        • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
        • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

      70% of contractors have a hard time finding qualified craft workers

      A new survey from The Associated General Contractors of America and Autodesk reports that 70% of firms surveyed are having a hard time filling jobs with qualified, hourly craft workers.

The survey’s 1,600 respondents also reported:

      • 46% are increasing the amount of in-house training they provide.
      • 47% are increasing overtime.
      • 41% are hiring subcontractors to fill the roles.
      • 22% are using labor-saving equipment (automation!).
      • 11% are using prefab parts and structures.
      • 7% are using BIM (Building Information Modeling) to pick up the slack.

      What’s the broader impact?

      “In the short-term, fewer firms will be able to bid on construction projects if they are concerned they will not have enough workers to meet demand,” said Stephen Sandherr, chief executive officer for the Associated General Contractors.  “Over the long-term, either construction firms will find a way to do more with fewer workers or public officials will take steps to encourage more people to pursue careers in construction.”

      As America’s infrastructure ages — and as we’ve seen from the damage Hurricane Harvey caused in Texas — demand for skilled craft workers will only grow. Technology and automation will pick up some of the slack, but we’re still a long way from needing fewer workers in the field.

      Link: Seventy percent of contractors have had a hard time finding qualified craft workers to hire amid growing construction demand