1.0 Purpose and Scope
1.1 Purpose
This standard defines the validation and verification requirements for digital commissioning (DCx) of Building Automation Systems (BAS). It specifies what aspects of BAS digital representation and network infrastructure must be tested, verified, and validated to ensure machine-readable, interoperable systems that enable advanced applications such as automated fault detection and diagnostics (AFDD), performance analytics, and seamless integration with other building systems.
This standard is structured as both:
- Commissioning Specification: Defining what must be validated and the acceptance criteria
- Verification Procedures: Providing standardized methods for conducting validation tests
1.2 Scope
This standard applies to the validation and verification of:
- BACnet network infrastructure configurations
- Digital metadata structure and completeness
- System interoperability and data accessibility
- Cybersecurity controls for operational technology networks
- Software licensing and vendor lock-in prevention
- All control systems and devices associated with mechanical (HVAC) and plumbing systems, and integrated building systems that exchange data with the BAS
The validation process covers five primary areas:
- General Requirements: Project roles, responsibilities, coordination, and cybersecurity validation
- BACnet Network Infrastructure Validation: Verification of network architecture, addressing, device configuration, communication protocols, and software interoperability
- Integrated Systems Validation: Validation of lighting controls, utility metering, and other systems that interface with the BAS
- Metadata Validation: Three progressive levels of digital representation validation, from naming conventions to semantic modeling
- Data Accessibility: Performance requirements for data access, visualization, and system responsiveness
1.3 Relationship to System Design Standards
This standard does not prescribe how BAS systems should be designed or configured. Instead, it defines validation and verification methods that can be applied regardless of the specific design standard, naming convention, or configuration approach used for a project. External design standards, project specifications, and configuration requirements are referenced as validation criteria, enabling this standard to be applied across different project types and design methodologies.
2.0 General Requirements
2.1 Submission Format
All required documentation and metadata shall be submitted to the Owner's Representative in a non-proprietary, machine-readable format (e.g., RDF, CSV, JSON, JSON-LD) prior to substantial completion.
2.2 Documentation
The Controls Contractor shall provide comprehensive documentation detailing all network configurations, addressing schedules, naming conventions, tagging libraries, and semantic models used. This documentation is a final commissioning deliverable.
2.3 Validation Framework
2.3.1 Validation Approach
This standard employs a dual-component validation framework:
Commissioning Specifications
Define what must be validated including:
- Required deliverables and documentation
- Validation coverage requirements (e.g., 100% of devices, sampling percentages)
- Acceptance criteria and pass/fail thresholds
- References to external standards and project-specific requirements
Verification Procedures
Define how to conduct validation including:
- Standardized testing methodologies
- Required tools and equipment
- Step-by-step verification processes
- Data collection and analysis methods
2.3.2 Implementation Independence
The validation requirements are designed to be implementation-independent, meaning they can be applied regardless of:
- Specific system design standards used
- Equipment manufacturers or models selected
- Project-specific naming conventions or configuration approaches
- Network architectures or addressing schemes
2.3.3 Tool Requirements
Validation will be performed using industry-standard diagnostic tools and automated verification scripts, including but not limited to:
- Network analysis tools (e.g., Wireshark, network scanners)
- BACnet diagnostic tools (e.g., YABE, BACnet browsers)
- Automated validation scripts and database queries
- Documentation review and cross-reference tools
2.3.4 Access Requirements
The system contractor must provide necessary access for validation activities:
- Network access for diagnostic tools and automated scripts
- System database access for metadata validation
- Robust metadata available in non-proprietary interoperable formats
- Documentation and configuration data as specified in each section
- Technical support during validation testing periods
2.4 Roles and Responsibilities
Overview
Clear definition of project roles and responsibilities is essential for successful digital commissioning. This section establishes the organizational structure, accountability, and coordination requirements necessary to deliver validated, interoperable building automation systems.
Requirements
2.4.1 Lead System Integrator
A Lead System Integrator shall be designated and identified in the project documentation. The Lead System Integrator shall be responsible for:
- Overall coordination of all control systems integration activities
- Network architecture design and implementation oversight
- Ensuring interoperability between all integrated systems
- Coordination of digital commissioning validation activities
- Resolution of integration conflicts between systems and vendors
The Lead System Integrator shall possess demonstrated experience with multi-vendor BAS integration and digital commissioning processes.
2.4.2 Coordination Matrix
A Coordination Matrix shall be submitted documenting the division of responsibilities for all integration activities. The matrix shall identify:
- Primary and secondary responsibility for each deliverable
- Review and approval authorities for each system component
- Coordination requirements between trades (mechanical, electrical, lighting, IT)
- Communication protocols and escalation procedures
- Digital commissioning validation responsibilities by system
2.4.3 Digital Commissioning Authority
The project shall identify a Digital Commissioning Authority (DCA) responsible for:
- Review and acceptance of all digital commissioning deliverables
- Verification of validation test results
- Approval of network architecture and addressing schemes
- Validation of metadata compliance at all levels
- Final acceptance of the digital representation
2.4.4 Owner Representation
The Owner shall designate a technical representative with authority to:
- Approve network architecture and system integration approach
- Make decisions regarding cybersecurity and licensing requirements
- Accept or reject digital commissioning validation results
- Serve as final arbiter for integration disputes
Examples
Example 1: Mechanical Contractor as Lead Integrator
Scenario: A project where the mechanical contractor provides the BAS as part of their base scope and integrates lighting and metering systems.
Coordination Matrix:
- BAS Design & Programming: Mechanical Contractor (Primary), Electrical/Lighting (Review)
- Network Architecture: Mechanical Contractor (Primary), IT Consultant (Review)
- Lighting Integration: Electrical Contractor (Primary), Mechanical (Coordination)
- Metering Integration: Electrical Contractor (Primary), Mechanical (Coordination)
- Digital Commissioning: Commissioning Agent (Primary), Mechanical (Execution)
Result: Clear accountability prevents coordination failures and ensures all systems meet validation requirements.
Example 2: Third-Party System Integrator
Scenario: A project with a dedicated controls contractor serving as Lead System Integrator, coordinating between mechanical, electrical, and IT contractors.
Implementation:
- Controls contractor holds master contract for all BAS integration
- Mechanical and electrical contractors provide points lists and interface requirements
- Controls contractor designs unified network architecture
- All digital commissioning deliverables flow through the controls contractor for validation
Result: Centralized coordination ensures consistent metadata, addressing, and interoperability.
Verification
Coordination Documentation Review
- Objective: Verify that project roles and coordination procedures are clearly defined
- Method: Review project documentation for:
- Lead System Integrator designation and qualifications
- Complete Coordination Matrix with all required elements
- Digital Commissioning Authority identification
- Owner's technical representative designation
- Acceptance Criteria: All required roles identified with clear scope of responsibility
Integration Conflict Resolution
- Objective: Verify that coordination procedures are effective in practice
- Method: Review project coordination records for:
- Documentation of integration issues and their resolution
- Evidence of coordination meetings and action items
- Timely resolution of technical conflicts
- Adherence to defined escalation procedures
- Acceptance Criteria: All integration conflicts documented and resolved according to defined procedures
Required Deliverables
- Lead System Integrator designation letter with qualifications
- Coordination Matrix covering all integration activities
- Digital Commissioning Authority designation
- Owner's technical representative designation
- Coordination meeting schedules and procedures
- Integration conflict escalation procedures
Acceptance Criteria
The roles and responsibilities requirements are met when all required roles are formally designated, a complete coordination matrix is submitted and accepted, and evidence demonstrates effective coordination throughout the project lifecycle.
2.5 Cybersecurity Validation
Overview
Building automation systems constitute critical operational technology (OT) infrastructure that requires comprehensive cybersecurity measures. This section establishes validation requirements for cybersecurity controls to ensure BAS networks are protected against unauthorized access, data breaches, and operational disruptions.
Requirements
2.5.1 User Account Management
User account policies shall be documented and verified. The following shall be validated:
- All default passwords have been changed on all devices and systems
- User accounts follow principle of least privilege
- Account naming conventions prevent identification of privilege levels
- Inactive accounts are disabled or removed
- Password length requirements meet current security standards (minimum 15 characters for user accounts, minimum 20 characters for administrative accounts)
- Multi-factor authentication is implemented for administrative access where supported
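The account requirements above lend themselves to automated auditing. The following is an illustrative sketch, assuming account records have been exported from the BAS into a list of dictionaries; the field names (`username`, `role`, `password_length`, `mfa_enabled`) are assumptions for illustration, not a vendor API.

```python
# Sketch of an automated account-policy audit per section 2.5.1.
# Minimum lengths (15 user / 20 admin) come from this standard; the
# record schema is an assumed export format.

MIN_LEN = {"user": 15, "admin": 20}
GENERIC_NAMES = {"admin", "root", "operator", "user"}

def audit_accounts(accounts):
    """Return (username, finding) tuples for each policy violation."""
    findings = []
    for acct in accounts:
        required = MIN_LEN.get(acct["role"], MIN_LEN["user"])
        if acct["password_length"] < required:
            findings.append((acct["username"],
                             f"password shorter than {required} characters"))
        if acct["role"] == "admin" and not acct.get("mfa_enabled", False):
            findings.append((acct["username"],
                             "MFA not enabled on administrative account"))
        if acct["username"].lower() in GENERIC_NAMES:
            findings.append((acct["username"], "generic or default username"))
    return findings
```

A compliant inventory yields an empty findings list, which maps directly to the "zero violations" acceptance criteria used later in this section.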
2.5.2 Network Encryption
Data encryption standards shall be verified for all network communications:
- TLS 1.2 or higher is enabled for all HTTPS communications
- TLS 1.3 is implemented where supported by devices
- Older encryption protocols (SSL, TLS 1.0, TLS 1.1) are disabled
- Certificate validity and trust chains are verified
- Encryption is enforced for all supervisory and management interfaces
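A minimal sketch of a TLS-version probe using only the Python standard library is shown below. The policy table mirrors section 2.5.2 (SSL and TLS 1.0/1.1 prohibited, TLS 1.2 minimum); `check_interface` requires live network access to the interface under test and is illustrative, not a substitute for a full scanner such as SSL Labs.

```python
import socket
import ssl

# Versions permitted by section 2.5.2; SSLv2/SSLv3/TLSv1/TLSv1.1 are prohibited.
PERMITTED = {"TLSv1.2", "TLSv1.3"}

def protocol_compliant(negotiated_version: str) -> bool:
    """True only if the negotiated protocol meets the 2.5.2 minimum."""
    return negotiated_version in PERMITTED

def check_interface(host: str, port: int = 443) -> str:
    """Connect to a live interface and return the negotiated TLS version."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # enforce the floor client-side
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()
```

A legacy-protocol rejection test would additionally attempt a connection with `ctx.maximum_version = ssl.TLSVersion.TLSv1_1` and verify the handshake fails.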
2.5.3 Network Segmentation
Network architecture shall implement proper segmentation:
- BAS network is isolated from enterprise IT network
- VLANs or physical separation is used to segment OT traffic
- Firewall rules explicitly define allowed traffic between zones
- Remote access requires VPN connection to isolated management network
- Internet-facing systems are prohibited unless explicitly approved with compensating controls
2.5.4 Remote Access Controls
Remote access methods shall be validated:
- VPN is the only permitted method for remote access
- Direct internet exposure of BAS devices is prohibited
- VPN endpoints require multi-factor authentication
- Remote access sessions are logged and auditable
- Remote access accounts are individual (no shared credentials)
2.5.5 Backup and Recovery
System backup and disaster recovery procedures shall be documented and tested:
- Automated backup schedules are configured and operational
- Backup storage is segregated from production systems
- Backup integrity verification is performed regularly
- Disaster recovery procedures are documented
- Recovery Time Objective (RTO) and Recovery Point Objective (RPO) are defined
- Recovery procedures have been tested and validated
2.5.6 Audit Logging
Security event logging shall be enabled and validated:
- User authentication events (success and failure) are logged
- Administrative actions are logged with user attribution
- Network access attempts are logged
- Log retention meets organizational requirements (minimum 90 days)
- Logs are stored on separate system or exported regularly
- Log review procedures are documented
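The retention requirement can be spot-checked against an exported log set. The sketch below assumes each record carries an ISO-8601 timestamp string; the 90-day floor comes from section 2.5.6 and should be raised to match any stricter organizational requirement.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # minimum from section 2.5.6

def retention_satisfied(timestamps, now=None):
    """True if the oldest retained record is at least RETENTION old,
    demonstrating that logs have not been purged early. An empty export
    fails by definition."""
    if not timestamps:
        return False
    now = now or datetime.now()
    oldest = min(datetime.fromisoformat(t) for t in timestamps)
    return (now - oldest) >= RETENTION
```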
Examples
Example 1: User Account Validation
Scenario: Validation of user accounts on a BACnet/IP network with Niagara supervisory system.
Validation Process:
1. Inventory all user accounts across all systems (controllers, supervisory stations, network devices)
2. Verify no default accounts remain active (admin/admin, root/root, etc.)
3. Confirm password complexity meets requirements
4. Verify MFA is enabled for all administrative accounts
5. Check for shared accounts or generic usernames (admin, user, operator)
6. Validate least-privilege implementation (operators cannot modify programs)
Result: 100% of accounts comply with security policy; no default credentials; MFA enforced.
Example 2: Network Encryption Validation
Scenario: HTTPS/TLS validation for web-based BAS interfaces.
Implementation:
- Use SSL Labs or a similar tool to scan all web interfaces
- Verify TLS 1.2 minimum, TLS 1.3 preferred
- Confirm no SSL or TLS 1.0/1.1 support
- Validate certificate chains and expiration dates
- Test that HTTP redirects to HTTPS
- Verify strong cipher suites only
Result: All interfaces achieve "A" rating; modern encryption enforced; legacy protocols disabled.
Example 3: Backup and Recovery Testing
Scenario: Validation of backup procedures for BAS supervisory system.
Test Procedure:
1. Document current system state (graphics, programs, trends)
2. Verify automated backups are running on schedule
3. Perform a test restore to a separate system
4. Validate the restored system matches the documented state
5. Measure recovery time against the RTO requirement
6. Document any gaps or failures
Result: Backup restoration successful in 45 minutes (meets 1-hour RTO); minor configuration gaps documented and corrected.
Verification
Security Policy Review
- Objective: Verify cybersecurity policies are documented and comprehensive
- Method: Review cybersecurity documentation for:
- User account management policy
- Network encryption standards
- Network segmentation architecture
- Remote access procedures
- Backup and disaster recovery plan
- Audit logging configuration
- Acceptance Criteria: All required policies documented with specific technical requirements
Account Security Audit
- Objective: Verify user account controls are properly implemented
- Method:
- Inventory all user accounts across all systems
- Attempt login with known default credentials (should fail)
- Verify password complexity enforcement
- Validate MFA implementation for admin accounts
- Check for inactive or unnecessary accounts
- Acceptance Criteria: Zero default credentials; 100% MFA on admin accounts; no policy violations
Encryption Validation
- Objective: Verify encryption is properly configured and enforced
- Method:
- Scan all HTTPS interfaces with SSL testing tools
- Verify TLS version enforcement
- Validate certificate integrity and trust chains
- Attempt connection with legacy protocols (should fail)
- Acceptance Criteria: TLS 1.2 minimum on all interfaces; no weak ciphers; valid certificates
Network Segmentation Testing
- Objective: Verify network isolation and segmentation controls
- Method:
- Review firewall rules and VLAN configuration
- Attempt unauthorized cross-zone access (should be blocked)
- Verify BAS network isolation from IT network
- Test remote access requires VPN
- Acceptance Criteria: All unauthorized access blocked; VPN required for remote access; proper segmentation verified
Backup and Recovery Validation
- Objective: Verify backup systems are operational and recovery procedures work
- Method:
- Verify automated backups running on schedule
- Perform test restoration to alternate system
- Measure recovery time against RTO
- Validate restored system functionality
- Acceptance Criteria: Successful restoration within RTO; full functionality verified; procedures documented
Required Deliverables
- Cybersecurity policy document
- User account inventory and access matrix
- Network segmentation diagram with firewall rules
- Encryption configuration documentation
- Remote access procedures and VPN configuration
- Backup schedule and disaster recovery plan
- Audit logging configuration and retention policy
- Security validation test results
Acceptance Criteria
Cybersecurity validation is complete when all security policies are documented, technical controls are verified through testing, backup and recovery procedures are validated, and all findings are documented with corrective actions implemented.
3.0 BACnet Network Infrastructure Validation
Overview
This section defines the validation and verification requirements for BACnet network infrastructure implementations. The validation process ensures that the implemented network infrastructure meets the project-specific design requirements and provides a stable, scalable, and interoperable foundation for building automation system operations.
Structure
Each subsection provides:
- Validation Requirements: What aspects of the network infrastructure must be validated
- Validation Criteria: Examples of criteria that would be applied based on project-specific requirements
- Verification Procedures: Standardized methods for conducting validation tests
- Acceptance Criteria: Clear pass/fail thresholds for each validation test
Applicability
These validation procedures can be applied regardless of the specific network design standard, addressing scheme, or equipment selection specified for a project. The validation criteria are defined relative to project-specific requirements, enabling this standard to be used across different design approaches and system architectures.
3.1 Network Architecture Validation
3.1.1 Validation Requirements
Architecture Documentation Validation
The network architecture implementation must be validated against the project-specific design requirements through documentation review and network discovery.
Required Deliverable: Complete network architecture diagram showing all IP and MS/TP segments, routers, controllers, and server locations.
Network Topology Validation
The implemented network topology must be verified against the approved design specifications, including:
- Network segmentation approach (hierarchical vs. flat architectures)
- Communication path verification between network segments
- Router placement and connectivity validation
- Compliance with project-specific topology requirements
3.1.2 Verification Procedures
Documentation Review Method
Architecture Diagram Verification
- Objective: Confirm submitted diagrams accurately represent implemented network
- Method: Cross-reference diagram against discovered network devices and connectivity
- Acceptance Criteria: 100% correlation between diagram and physical implementation
Network Discovery Validation
- Objective: Verify all network segments and devices are discoverable
- Method: Automated network scanning and BACnet device discovery
- Acceptance Criteria: All devices shown in documentation are discoverable on their designated network segments
Performance Testing Method
Inter-Segment Communication Testing
- Objective: Verify data can flow between all network segments as designed
- Method: Automated communication testing between devices on different segments
- Acceptance Criteria: 100% of cross-segment device pairs can successfully communicate
Network Performance Baseline
- Objective: Establish performance metrics for ongoing monitoring
- Method: Measure throughput, latency, and response times under normal operating conditions
- Acceptance Criteria: Performance meets or exceeds project-specific requirements
Reliability Testing Method (If Applicable)
Redundancy Verification
- Objective: Verify failover capabilities function as designed
- Method: Controlled failover testing of redundant components
- Acceptance Criteria: Automatic failover occurs within specified time limits with no data loss
3.2 IP Addressing Validation
3.2.1 Validation Requirements
IP Address Assignment Validation
The IP addressing implementation must be validated against the project-specific addressing scheme and network design requirements.
Required Deliverable: Complete IP Address Schedule documenting all BAS devices with their assigned IP addresses, subnet assignments, and addressing method (static or DHCP reservation).
Subnet Configuration Validation
Network subnet implementation must be verified for:
- Compliance with documented subnet design
- Adequate address space for current and planned devices
- Proper subnet isolation and VLAN configuration
- Coordination with facility IT infrastructure
3.2.2 Validation Criteria Examples
The following examples illustrate validation criteria that would be applied based on project-specific requirements:
Address Assignment Validation
- Static IP Verification: Confirm devices configured with static IP addresses match the IP Address Schedule
- DHCP Reservation Verification: Verify MAC-address-based DHCP reservations are properly configured and functional
- Dynamic Address Prohibition: Confirm no devices are using randomly assigned or dynamic DHCP addresses
Subnet Capacity Validation
Examples of subnet capacity validation against project requirements:
- Subnet A Example: 10.100.10.0/24 (254 usable host addresses) - verify device count does not exceed the capacity threshold per project requirements
- Subnet B Example: 10.100.11.0/26 (62 usable host addresses) - validate current device count against design capacity
- Subnet C Example: 10.100.11.64/28 (14 usable host addresses) - confirm subnet size meets minimum requirements or has a documented exception
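Capacity figures of this kind can be computed mechanically with the standard-library `ipaddress` module. The sketch below excludes only the network and broadcast addresses; project-specific reservations (gateway, management addresses) and the 80% utilization threshold shown here are illustrative assumptions, not values mandated by this standard.

```python
import ipaddress

def usable_hosts(cidr: str) -> int:
    """Usable host addresses in an IPv4 subnet (excludes network and
    broadcast; does not subtract gateway or other project reservations)."""
    net = ipaddress.ip_network(cidr)
    return net.num_addresses - 2

def capacity_ok(cidr: str, device_count: int, threshold: float = 0.80) -> bool:
    """True if device_count stays within the assumed utilization threshold."""
    return device_count <= usable_hosts(cidr) * threshold
```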
3.2.3 Verification Procedures
IP Address Schedule Verification Method
Documentation Completeness Check
- Objective: Verify IP Address Schedule includes all BAS devices
- Method: Cross-reference schedule against network discovery results
- Acceptance Criteria: 100% of discovered devices are documented in schedule
Address Assignment Verification
- Objective: Confirm devices are configured per the documented addressing scheme
- Method: Network scanning to verify actual IP configurations match schedule
- Acceptance Criteria: 100% correlation between documented and actual IP assignments
Network Configuration Testing Method
Subnet Verification
- Objective: Validate subnet configuration and device placement
- Method: Network topology discovery and subnet boundary testing
- Acceptance Criteria: All devices are on correct subnets with proper network masks and gateways
VLAN Configuration Testing (If Applicable)
- Objective: Verify VLAN tagging and membership
- Method: VLAN discovery and traffic flow analysis
- Acceptance Criteria: All BAS devices are on designated VLANs with proper isolation from other networks
Connectivity Validation
- Objective: Confirm all devices can communicate as required by system design
- Method: Automated ping testing and BACnet communication verification
- Acceptance Criteria: 100% of devices respond to network communication tests
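The documentation-completeness and address-assignment checks above reduce to a set comparison between the submitted schedule and discovery results. A minimal sketch follows; both inputs are modeled as `{device_name: ip_address}` dictionaries, an assumed format, with the discovery side populated in practice from a network scanner or BACnet Who-Is sweep.

```python
def cross_reference(schedule: dict, discovered: dict):
    """Return discrepancies between documented and discovered devices."""
    undocumented = sorted(set(discovered) - set(schedule))   # on network, not in schedule
    missing = sorted(set(schedule) - set(discovered))        # in schedule, not found
    mismatched = sorted(
        name for name in set(schedule) & set(discovered)
        if schedule[name] != discovered[name]                # documented IP differs
    )
    return {"undocumented": undocumented,
            "missing": missing,
            "ip_mismatch": mismatched}

def passes(schedule: dict, discovered: dict) -> bool:
    """100% correlation required: every discrepancy list must be empty."""
    return all(not v for v in cross_reference(schedule, discovered).values())
```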
3.3 BACnet Network Numbering
Each BACnet network segment (both IP and MS/TP) shall be assigned a unique BACnet Network Number. Duplicate network numbers are not permitted.
Network numbers shall be allocated logically, following a documented scheme (e.g., 1xx for IP backbone, 2xx for Floor 2 MS/TP networks, etc.).
The contractor shall submit a schedule of all BACnet Network Numbers used in the project.
Verification
BACnet Routing Table
- Network Number Assignment: Verify unique BACnet network numbers for each routed segment
- Routing Table Entries: Confirm all downstream networks are properly advertised
- Network Reachability: Test that devices on different network segments can communicate
- BBMD Configuration: If applicable, verify Broadcast Distribution Table entries
3.4 BACnet Device Instance Numbers (Device IDs)
Every BACnet device on the internetwork shall have a globally unique Device Instance Number. Duplicate Device IDs are not permitted.
Device IDs shall follow a structured, documented scheme that allows for easy identification of the device.
Example Scheme: <Site Code><Equipment Type Code><Sequential Number>
Example ID: 102501 (Site 10, AHU Type 25, Instance 01)
Device IDs shall not be re-used, even if a device is decommissioned. A previously assigned Device Instance Number may remain in service only when a failed device is replaced on a like-for-like basis and the replacement assumes the same documented device identity.
The contractor shall submit a complete schedule of all Device Instance Numbers.
Verification
Device Instance Verification
- Global Uniqueness: Verify that no duplicate Device Instance Numbers exist on the internetwork
- Scheme Consistency: Confirm Device Instance Numbers follow the documented project numbering scheme
- Schedule Completeness: Verify the submitted Device Instance schedule includes all commissioned BACnet devices
- Replacement Traceability: Where a failed device was replaced like-for-like, confirm the retained Device Instance Number is documented in turnover records
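The uniqueness and scheme-consistency checks can be scripted against the submitted Device Instance schedule. The sketch below assumes the example scheme shown above with two-digit site, equipment-type, and sequence fields; the field widths are an illustrative assumption and should be replaced by the project's documented scheme. The valid instance range check reflects the BACnet limit of 0 to 4194302 (4194303 is the wildcard).

```python
from collections import Counter

def parse_device_id(device_id: int):
    """Split e.g. 102501 into (site=10, equip_type=25, sequence=1),
    assuming the two-digit field widths of the example scheme."""
    text = f"{device_id:06d}"
    return int(text[0:2]), int(text[2:4]), int(text[4:6])

def find_duplicates(device_ids):
    """Return any Device Instance Numbers that appear more than once."""
    return sorted(i for i, n in Counter(device_ids).items() if n > 1)

def in_valid_range(device_id: int) -> bool:
    """True if the instance is within the BACnet-permitted range."""
    return 0 <= device_id <= 4194302
```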
3.5 BACnet/IP Network Services
BBMDs (BACnet/IP Broadcast Management Devices)
For networks spanning multiple IP subnets, a clear BBMD strategy is required. One primary and at least one backup BBMD shall be configured. The contractor must provide the Broadcast Distribution Table (BDT) for each BBMD.
The contractor shall also provide a BACnet data flow diagram illustrating the logical path of communications through routers and BBMDs during both normal operation and in failover scenarios.
Foreign Device Registration
The use of Foreign Device Registration shall be minimized and approved by the Owner's Representative. It is not an acceptable substitute for a properly configured BBMD architecture.
UDP Port
All BACnet/IP communications shall use the official IANA-registered UDP port 47808 (0xBAC0). Any deviation must be explicitly approved in writing.
3.6 BACnet MS/TP & ARC156 Network Properties
MAC Addresses
Each device on a given MS/TP or ARC156 segment must have a unique MAC address. Addresses shall be assigned in contiguous blocks to simplify troubleshooting and device management and to maintain token-passing performance.
Baud Rate
The baud rate must be consistent across all devices on a single segment (e.g., 38400 or 76800 for MS/TP).
Max Masters (MS/TP)
The Max-Info-Frames and Max-Master properties shall be configured correctly on all master devices to ensure stable token passing. The configuration for each trunk shall be documented as a project deliverable. The number of master devices on a single segment should not exceed 64.
Validation and Verification
The configurations of at least three (3) fieldbus trunks, or 10% of all fieldbus trunks, whichever is greater, must be verified. If the project includes multiple device models serving as fieldbus routers, at least one example of each device model must be included in the verification sample.
MS/TP Network Verification
Data Link Layer Verification
- MAC Address Uniqueness: Scan all devices on trunk to verify no duplicate MAC addresses
- MAC Address Range: Confirm addresses are within 0-127 range and assigned in logical blocks
- Baud Rate Consistency: Verify all devices on trunk operate at same baud rate (typically 38400 or 76800)
- Max-Master Configuration: Check that Max-Master value accommodates all devices on trunk
- Max-Info-Frames: Verify appropriate Max-Info-Frames setting for network performance
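The data link layer checks above can be run against a per-trunk device table. The sketch below assumes each record (`mac`, `baud`, `max_master`) comes from a trunk capture or a router diagnostic export; the record schema is illustrative. The Max-Master test enforces the common rule that every master's Max-Master value must be at least the highest MAC on the trunk so no device is excluded from token passing.

```python
from collections import Counter

def audit_trunk(devices):
    """Return findings for a non-empty list of MS/TP device records."""
    findings = []
    macs = [d["mac"] for d in devices]
    for mac, count in Counter(macs).items():
        if count > 1:
            findings.append(f"duplicate MAC {mac}")
    for mac in macs:
        if not 0 <= mac <= 127:
            findings.append(f"MAC {mac} outside 0-127 master range")
    if len({d["baud"] for d in devices}) > 1:
        findings.append("mixed baud rates on trunk")
    highest = max(macs)
    for d in devices:
        if d["max_master"] < highest:
            findings.append(
                f"device MAC {d['mac']}: Max-Master {d['max_master']} "
                f"below highest MAC {highest}")
    return findings
```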
Token Passing Verification
- Token Passing Sequence: Monitor network to confirm proper token circulation
- Response Time: Measure token rotation time under normal and peak load conditions
- Error Recovery: Verify network recovers properly from token loss scenarios
ARC156 Network Verification Examples
Protocol Verification
- ARC156 Compliance: Confirm devices properly implement ARC156 protocol extensions
- Interoperability: Verify mixed-vendor device communication on same trunk
- Configuration Consistency: Check that all devices have compatible ARC156 settings
- Diagnostics: Verify that the network experiences no reconfiguration events over a 24-hour period while all devices are in steady-state operation
3.7 Physical Layer Cabling
The network design and installation shall adhere to the specified physical layer for each segment.
EIA-485 (MS/TP)
All MS/TP cabling shall be low-capacitance, shielded twisted-pair (STP) wiring, properly terminated at both ends of each segment with end-of-line (EOL) resistors according to manufacturer's specifications.
ARC156
Cabling shall adhere to the ARC156 physical layer specifications.
Ethernet (BACnet/IP)
Shall utilize Category 6 (Cat6) or better cabling for 100/1000BASE-T networks.
Single Pair Ethernet (BACnet/IP)
Where specified, shall adhere to the IEEE 802.3cg-2019 (10BASE-T1L/S) standard.
Physical Layer Verification
MS/TP
- Cable Termination: Verify manufacturer-specified termination resistors are installed at both ends of each trunk
- Cable Type: Confirm twisted-pair cable meets the device manufacturer's specifications
- Cable Length: Verify total trunk length does not exceed the manufacturer's specifications
- Drop Length: Confirm individual device drops do not exceed the manufacturer's specifications
ARC156
- Connector Types: Verify proper ARC156 connectors are used throughout network
- Cable Specifications: Confirm cable meets ARC156 requirements
- Network Topology: Verify star or tree topology implementation
3.9 Software Interoperability and Licensing
Overview
True system openness requires both protocol-level interoperability and freedom from proprietary software licensing restrictions. This section establishes requirements to prevent vendor lock-in at the software and licensing level, ensuring long-term system maintainability and owner control.
Requirements
3.9.1 Open Protocol Implementation
All integration platforms and supervisory systems shall use open, non-proprietary communication protocols:
- BACnet/IP (ASHRAE 135) as primary protocol
- Modbus TCP/IP for compatible devices
- SNMP for network equipment monitoring
- Standard IT protocols (HTTPS, REST API, MQTT) where appropriate
- REST APIs should follow industry-standard conventions and be documented with OpenAPI or an equivalent specification standard
- Exceptions to normative HTTP verb use must be noted and approved by owner
- Proprietary protocols are prohibited for device-to-supervisor or system-to-system communication
3.9.2 Network Integration Controller (NIC) Openness
For systems using Niagara Framework or similar integration platforms, the following shall be verified:
- Open NIC Statement provided by the contractor documenting that:
- All programming and configuration is accessible via standard Niagara Workbench
- No vendor-specific tools or dongles are required for system access
- No encrypted or obfuscated program files that prevent editing
- All graphics, logic, and configuration are fully editable by any qualified Niagara integrator
3.9.3 Software License Ownership
Software licensing shall ensure owner control and prevent lock-in:
- Owner as Named License Holder: All software licenses for supervisory systems, integration platforms, and graphical interfaces shall be registered to the building owner, not the contractor
- License Transferability: Licenses shall be transferable to any qualified service provider chosen by the owner
- No Subscription Lock-in: Ongoing subscription fees shall be for optional services only; core system operation shall not be dependent on active subscriptions
- Source File Access: Owner shall receive all source files, programs, graphics, and configuration databases
3.9.4 Third-Party Access
The system shall support unrestricted access by qualified third-party service providers:
- No vendor-specific passwords or access codes that cannot be shared with owner
- No license restrictions preventing third-party programming or modification
- Standard tools (Workbench, standard browsers, BACnet tools) sufficient for all system access
- No encrypted files or proprietary formats that prevent modification
3.9.5 Documentation of Interoperability
The contractor shall provide documentation demonstrating software-level openness:
- Software architecture diagram showing all communication paths and protocols
- Listing of all software licenses with owner as named holder
- Open NIC Statement (where applicable)
- Confirmation that no proprietary tools are required for system maintenance
- List of standard tools required for system programming and access
Examples
Example 1: Niagara Framework Open NIC Validation
Scenario: A BAS using Tridium Niagara as the supervisory platform.
Open NIC Statement Verification:
- Contractor provides written statement confirming no encrypted program files
- All graphics created in standard Niagara PX or AX editors
- No vendor-specific modules or extensions that prevent third-party access
- Logic programs are standard .bog files, fully editable
- Owner receives Workbench licenses in their name
- Any integrator with standard Niagara Workbench can access system
Result: Owner can hire any qualified Niagara integrator for future service without contractor involvement.
Example 2: License Ownership Verification
Scenario: Validation that software licenses are properly registered to the owner.
Verification Process:
1. Obtain a copy of all software license agreements
2. Verify the owner organization is the named licensee
3. Confirm licenses are perpetual or have clearly documented subscription terms
4. Validate that licenses allow third-party service providers
5. Ensure the owner receives all license keys and activation codes
6. Document all software version numbers and support terms
Result: Owner has direct relationship with software vendors; can maintain licenses independent of contractor.
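The verification steps in this example lend themselves to a scripted audit. A minimal sketch, assuming a hypothetical license-record format (the field names and owner name are illustrative):

```python
# Sketch: machine-checkable license audit. The record fields and owner name
# are assumptions for illustration, not a required data format.

OWNER = "Example Facilities Corp"  # hypothetical owner organization

def audit_license(lic):
    """Return a list of findings for one software license record."""
    findings = []
    if lic["named_licensee"] != OWNER:
        findings.append("licensee is not the owner")
    if not lic["perpetual"] and not lic.get("subscription_terms_documented"):
        findings.append("subscription terms not documented")
    if not lic["third_party_service_allowed"]:
        findings.append("license restricts third-party service providers")
    if not lic["keys_delivered_to_owner"]:
        findings.append("license keys not delivered to owner")
    return findings

lic = {
    "product": "Supervisory Platform",
    "named_licensee": "Contractor LLC",   # should be the owner organization
    "perpetual": True,
    "third_party_service_allowed": True,
    "keys_delivered_to_owner": False,
}
print(audit_license(lic))
```

An empty findings list for every record corresponds to the "100% of licenses registered to owner" acceptance criterion.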
Example 3: Protocol-Only Interoperability (Insufficient)
Scenario: A system using BACnet/IP but with proprietary licensing restrictions.
Issue Identified:
- All devices communicate via open BACnet/IP protocol ✓
- Supervisory software requires annual subscription to vendor for system access ✗
- Programming tools are vendor-specific and cannot be purchased by owner ✗
- Graphics are in proprietary encrypted format ✗
Corrective Action:
- Require contractor to provide open-licensed supervisory software
- Owner receives perpetual licenses in their name
- All graphics converted to standard formats
- Subscription for support only, not for access
Result: Protocol openness combined with software/licensing openness prevents lock-in.
Verification
Open Protocol Validation
- Objective: Verify all system communications use open, standard protocols
- Method:
- Review network architecture for protocol usage
- Confirm BACnet/IP as primary protocol
- Identify any proprietary protocol usage
- Validate standard IT protocols for supervisory access
- Acceptance Criteria: Zero use of proprietary protocols for device communication or supervisory access
NIC Openness Verification
- Objective: Verify integration platform is accessible to any qualified integrator
- Method:
- Obtain Open NIC Statement from contractor
- Verify no encrypted or obfuscated program files
- Confirm standard tools (Workbench) sufficient for all access
- Test that programming is editable without vendor-specific tools
- Validate no vendor-specific extensions prevent third-party access
- Acceptance Criteria: Complete Open NIC Statement provided; all files editable with standard tools
License Ownership Audit
- Objective: Verify owner is the named holder of all software licenses
- Method:
- Review all software license agreements
- Verify owner organization is named licensee
- Confirm license terms allow third-party service
- Validate owner has received all license keys
- Check that no subscription is required for basic system operation
- Acceptance Criteria: 100% of licenses registered to owner; no restrictions on third-party service
Third-Party Access Testing
- Objective: Verify unrestricted access for qualified third-party service providers
- Method:
- Document all tools required for system access
- Confirm no vendor-specific passwords or dongles
- Validate standard tools are sufficient
- Test access using generic credentials and standard tools
- Acceptance Criteria: Full system access achievable with standard industry tools; no vendor-specific access restrictions
Required Deliverables
- Software architecture diagram with protocols documented
- Open NIC Statement (for Niagara or similar platforms)
- All software license agreements with owner as named holder
- Complete list of tools required for system access and maintenance
- All source files, programs, graphics, and databases
- Written confirmation of no proprietary access restrictions
- License key inventory and registration documentation
Acceptance Criteria
Software interoperability requirements are met when all communications use open protocols, integration platforms are confirmed as open via NIC statements, all software licenses are held by the owner, and third-party access is unrestricted by proprietary tools or licensing.
3.10 BACnet/SC Security and PKI Ownership
Overview
BACnet Secure Connect (BACnet/SC) provides secure, authenticated communication using Transport Layer Security (TLS) and a Public Key Infrastructure (PKI). To prevent vendor lock-in and ensure long-term system maintainability, the building owner must control the PKI and have unrestricted ability to provision certificates for any authorized device or service provider. Additionally, networks that route between BACnet/SC and traditional BACnet/IP or MS/TP must provide diagnostic access points for troubleshooting.
Requirements
3.10.1 PKI Ownership and Control
The building owner shall be the Certificate Authority (CA) or shall control the Certificate Authority for the BACnet/SC network:
- Owner-Controlled CA: The owner shall operate the Certificate Authority or designate a trusted third party to operate it on the owner's behalf (not the BAS contractor or vendor)
- Root Certificate Ownership: The owner shall hold the private key for the root certificate
- No Vendor CA Dependency: The system shall not depend on certificates issued by vendor-controlled or cloud-based CAs that restrict owner's provisioning ability
- CA Documentation: Complete CA setup documentation, including certificate issuance procedures, shall be provided to the owner
3.10.2 Certificate Provisioning Rights
The owner shall have unrestricted rights to provision certificates for BACnet/SC devices and participants:
- Self-Provisioning Capability: The owner shall be able to generate and sign certificates for new devices without vendor involvement
- Third-Party Provisioning: Any qualified service provider chosen by the owner shall be able to provision devices with owner-signed certificates
- No License Restrictions: Software licenses shall not restrict the owner's ability to issue certificates
- Revocation Authority: The owner shall have authority to revoke certificates for any device or participant
3.10.3 Certificate Management Tools
Certificate management tools shall be accessible to the owner:
- Standard Tools: Certificate generation and signing shall be achievable using standard PKI tools (OpenSSL, Microsoft CA, or open-source equivalents)
- Vendor Tool Transfer: If vendor-specific tools are used during installation, they shall be transferred to the owner with perpetual licenses
- Documentation: Complete procedures for certificate generation, signing, installation, and revocation shall be documented
- No Proprietary Formats: Certificates shall use standard formats (X.509) with no proprietary extensions that prevent third-party provisioning
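The "standard tools" requirement can be exercised during validation by issuing a test certificate with the stock OpenSSL CLI. A sketch that composes typical OpenSSL invocations for a device CSR and owner-CA signing (file names, subject, and lifetime are illustrative assumptions; the commands are composed but not executed here):

```python
# Sketch: compose standard OpenSSL commands for issuing a device certificate
# from an owner-controlled signing CA. File names, the subject CN, and the
# 365-day lifetime are illustrative; the flags are standard OpenSSL usage.

def csr_command(key_file, csr_file, common_name):
    """Generate a certificate signing request for a device key."""
    return ["openssl", "req", "-new",
            "-key", key_file, "-out", csr_file,
            "-subj", f"/CN={common_name}"]

def sign_command(csr_file, ca_cert, ca_key, cert_file, days=365):
    """Sign the CSR with the owner's signing CA certificate and key."""
    return ["openssl", "x509", "-req",
            "-in", csr_file,
            "-CA", ca_cert, "-CAkey", ca_key, "-CAcreateserial",
            "-out", cert_file, "-days", str(days)]

print(" ".join(csr_command("device1.key", "device1.csr", "BSC-DEV-001")))
print(" ".join(sign_command("device1.csr", "signing-ca.pem", "signing-ca.key", "device1.pem")))
```

If these two commands (or their Microsoft CA equivalents) complete without any vendor software, the provisioning requirement is demonstrably met.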
3.10.4 Certificate Validation and Direct Trust Model
BACnet/SC uses a direct trust model where devices validate certificates against locally installed signing certificates. The following shall be verified:
- Signing Certificate Trust: All device certificates shall be signed by a certificate that chains to the owner's root CA
- Two-Slot Trust Model: BACnet/SC devices support two trusted signing certificate slots; the owner's signing certificate(s) shall be installed in these slots
- No Chain Validation in Devices: Individual devices do not verify full certificate chains; they only validate that certificates are signed by one of their two trusted signing certificates
- Owner Signing Certificate Authority: The signing certificate used to issue device certificates shall be controlled by the owner and chain to the owner's root CA
- No External Trust Dependencies: Devices shall not require trust of vendor CAs, public CAs, or any signing certificate not controlled by the owner
- Certificate Expiration Policy: Certificate lifetimes shall be documented; automated renewal procedures shall be implemented where supported
- Expired Certificate Handling: System behavior with expired certificates shall be documented and tested
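The direct trust check described above, an issuer match against two locally installed signing certificates with no chain walking, can be sketched as follows (slot contents and issuer names are hypothetical):

```python
# Sketch of the BACnet/SC two-slot direct trust model: a device accepts a
# peer certificate only if its issuer matches one of the two trusted signing
# certificates installed locally. Names are illustrative assumptions.

TRUST_SLOTS = ("CN=Owner Device Signing CA",
               "CN=Owner Device Signing CA")   # both slots owner-controlled

def peer_accepted(peer_issuer, trust_slots=TRUST_SLOTS):
    """Direct trust: no chain validation, just an issuer match against the slots."""
    return peer_issuer in trust_slots

print(peer_accepted("CN=Owner Device Signing CA"))  # owner-signed peer -> True
print(peer_accepted("CN=Vendor Cloud CA"))          # untrusted signing CA -> False
```

Because devices never walk the full chain, the link from the Device Signing CA up to the owner's root must be verified out-of-band, as the verification section below requires.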
3.10.5 BACnet/SC Hub and Router Configuration
BACnet/SC infrastructure components shall meet ownership requirements:
- Hub Configuration Access: BACnet/SC hubs shall be fully configurable by the owner using standard interfaces (web, CLI, BACnet services)
- Router Ownership: Devices that route between BACnet/SC and BACnet/IP or MS/TP shall be owned and controlled by the building owner (not locked to vendor cloud services)
- Connection Authorization: The owner shall control which devices and hubs can participate in the BACnet/SC network via certificate policy
- No Cloud Lock-in: Hub or router functionality shall not depend on vendor cloud services or external platforms
3.10.6 Diagnostic Access for Routed Networks
Where BACnet/SC networks route to traditional BACnet/IP or MS/TP networks, diagnostic access shall be provided:
- Service Ports on Routers: BACnet/SC to BACnet/IP routers shall provide a service port on the local BACnet/IP network segment for diagnostic packet capture
- MS/TP Diagnostic Access: BACnet/SC to MS/TP routers shall provide a method to capture MS/TP traffic (service port, mirror port, or tap connection)
- Unencrypted Local Traffic: Traffic on the local BACnet/IP or MS/TP side of routers shall remain unencrypted to allow standard diagnostic tools (Wireshark, VTS, etc.)
- No Interference: Diagnostic service ports shall not interfere with normal network operations or create security vulnerabilities
3.10.7 Third-Party Diagnostic Tool Compatibility
Network architecture shall support third-party diagnostic tools:
- Packet Capture Support: Network infrastructure shall allow packet capture on local network segments using standard tools (Wireshark, tcpdump)
- BACnet Protocol Analysis: Captured traffic shall be analyzable using standard BACnet protocol analyzers (VTS, YABE, etc.)
- No Encryption of Local Traffic: BACnet/IP and MS/TP networks that connect to BACnet/SC shall not encrypt local segment traffic
- Documentation of Capture Points: Network diagrams shall identify all available diagnostic capture points
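As an illustration of why unencrypted local segments matter for diagnostics, the fixed 4-octet BVLC header of a BACnet/IP frame can be decoded directly from a packet capture, which is exactly what tools like Wireshark do. The function names follow the BACnet/IP BVLL function table; the sample payload bytes are fabricated:

```python
import struct

# Sketch: decode the 4-octet BVLC header of a BACnet/IP (Annex J) frame.
# Only two common function codes are mapped here; the payload is a placeholder.

BVLC_FUNCTIONS = {
    0x0A: "Original-Unicast-NPDU",
    0x0B: "Original-Broadcast-NPDU",
}

def parse_bvlc(frame):
    """Return (function name, total frame length) from a BACnet/IP frame."""
    bvlc_type, function, length = struct.unpack("!BBH", frame[:4])
    if bvlc_type != 0x81:                 # 0x81 identifies the BACnet/IP BVLL
        raise ValueError("not a BACnet/IP frame")
    return BVLC_FUNCTIONS.get(function, hex(function)), length

# 0x81 = BACnet/IP, 0x0A = Original-Unicast-NPDU, total length 0x000C (12 octets)
sample = bytes.fromhex("810a000c") + b"\x01\x20\xff\xff\x00\xff\x10\x08"
print(parse_bvlc(sample))   # -> ('Original-Unicast-NPDU', 12)
```

On the encrypted BACnet/SC side of a router this decode is impossible without TLS keys, which is why the service ports above are placed on the local, unencrypted segments.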
Examples
Example 1: Owner-Operated Certificate Authority
Scenario: Building owner establishes local CA for BACnet/SC network.
Implementation:
- Owner installs Microsoft Active Directory Certificate Services on local server
- Root CA certificate generated and secured by owner IT department
- BAS contractor receives subordinate CA certificate for device provisioning during installation
- After project completion, owner IT assumes all certificate provisioning responsibilities
- Any future service provider can request subordinate CA cert from owner to provision new devices
Validation:
1. Verify owner holds root CA private key
2. Confirm contractor used owner's CA for all device certificates
3. Test certificate provisioning using owner's tools (no vendor software required)
4. Validate any qualified integrator can obtain subordinate cert from owner
5. Test certificate revocation by owner
Result: Owner has complete control of PKI; no vendor dependency for future device additions.
Example 2: Certificate Provisioning Testing with Direct Trust Model
Scenario: Validation that new devices can be provisioned without vendor involvement using BACnet/SC's two-slot direct trust model.
Trust Architecture:
- Owner operates root CA (secured, offline)
- Owner creates subordinate "Device Signing CA" that chains to root
- Device Signing CA certificate installed in both trust slots on all BACnet/SC devices
- Individual device certificates signed by Device Signing CA
Test Procedure:
1. Obtain new BACnet/SC device (controller, hub, etc.)
2. Install owner's Device Signing CA certificate in device's two trust slots (typically via web interface or provisioning tool)
3. Using owner's CA tools, generate certificate request for new device
4. Sign device certificate using owner's Device Signing CA (no vendor tools)
5. Install device certificate on device using standard methods (web interface, BACnet service, USB)
6. Verify device connects to BACnet/SC network and communicates
7. Confirm device validates peer certificates signed by the same Device Signing CA
8. Validate device appears in network roster and is fully functional
Acceptance:
- Trust slots contain only owner-controlled signing certificates ✓
- Device certificate generated using standard tools (OpenSSL, Microsoft CA) ✓
- No vendor-specific software required for signing ✓
- Certificate installation achievable via standard interfaces ✓
- Device validates peers using owner's signing certificate ✓
- Device fully operational with owner-issued certificate ✓
Result: Owner can add devices at will using direct trust model; no vendor lock-in for certificate provisioning.
Example 3: Diagnostic Access Validation for Routed Networks
Scenario: BACnet/SC network with router to existing BACnet/IP building network.
Configuration:
- BACnet/SC hub connects to secure WAN
- Router bridges BACnet/SC to local BACnet/IP network (192.168.10.0/24)
- Router provides service port on BACnet/IP side for diagnostics
Validation:
1. Connect laptop to BACnet/IP service port on router
2. Launch Wireshark and capture BACnet/IP traffic
3. Verify unencrypted BACnet packets are visible and decodable
4. Use VTS to discover devices on local BACnet/IP network
5. Confirm diagnostic access does not interfere with normal operations
6. Verify BACnet/SC traffic remains encrypted on WAN side
Network Diagram:
[BACnet/IP Devices] ←→ [Service Port] ←→ [BACnet/SC Router] ←→ [Encrypted WAN]
                            ↓
                   [Diagnostic Laptop]
                    (Wireshark/VTS)
Result: Technicians can troubleshoot local network using standard tools; BACnet/SC security maintained.
Example 4: MS/TP to BACnet/SC Router Diagnostics
Scenario: Legacy MS/TP field controllers routed to BACnet/SC infrastructure.
Implementation:
- MS/TP network at 76.8 kbps on RS-485 bus
- Router provides diagnostic tap or mirror port for MS/TP traffic
- Local diagnostics use MS/TP sniffer or USB interface
Diagnostic Options:
- Option 1: Router provides USB port for MS/TP traffic capture
- Option 2: Router includes physical tap points on MS/TP terminals
- Option 3: Router mirrors MS/TP traffic to Ethernet service port (encapsulated)
Validation:
1. Connect MS/TP diagnostic tool to router's diagnostic interface
2. Capture MS/TP token passing and data frames
3. Verify MS/TP addressing and timing is correct
4. Troubleshoot any field controller communication issues
5. Confirm diagnostic connection does not load MS/TP network or cause errors
Result: MS/TP troubleshooting possible using standard tools; BACnet/SC security unaffected.
Verification
PKI Ownership Validation
- Objective: Verify owner controls the Certificate Authority and can provision certificates independently
- Method:
- Verify owner holds root CA private key
- Confirm no dependency on vendor or cloud-based CAs
- Review certificate issuance procedures documentation
- Test that owner can generate and sign certificates using standard tools
- Validate third-party service providers can obtain subordinate certificates from owner
- Acceptance Criteria: Owner operates or controls CA; no vendor CA dependencies; certificate provisioning possible with standard tools
Certificate Provisioning Testing
- Objective: Verify unrestricted certificate provisioning capability
- Method:
- Obtain test device and provision certificate using owner's CA and tools
- Verify no vendor-specific software required
- Test certificate installation via standard methods
- Validate device operates normally with owner-issued certificate
- Test certificate revocation by owner
- Acceptance Criteria: Successful device provisioning without vendor involvement; revocation authority confirmed
Direct Trust Model Validation
- Objective: Verify BACnet/SC direct trust model is correctly implemented with owner-controlled signing certificates
- Method:
- Verify owner's Device Signing CA certificate is installed in both trust slots on all devices
- Confirm Device Signing CA chains to owner's root CA (validated out-of-band, not by devices)
- Inspect all device certificates and verify they are signed by owner's Device Signing CA
- Confirm devices do not trust vendor signing certificates or external CAs
- Test that devices accept peers with certificates signed by owner's Device Signing CA
- Test that devices reject peers with certificates from untrusted signing CAs
- Validate certificate renewal procedures
- Test system behavior with expired certificates
- Acceptance Criteria: All devices trust only owner-controlled signing certificates; device certificates signed by owner's CA; no vendor or external CA dependencies; peer validation functions correctly; expiration handling documented
BACnet/SC Infrastructure Control
- Objective: Verify owner controls hubs and routers
- Method:
- Access hub and router configuration interfaces
- Verify owner credentials provide full administrative access
- Confirm no cloud service dependencies for core functionality
- Test connection authorization controls
- Acceptance Criteria: Full administrative access to all BACnet/SC infrastructure; no cloud lock-in
Diagnostic Access Verification for Routed Networks
- Objective: Verify diagnostic access is available on routed networks
- Method:
- Identify all BACnet/SC to BACnet/IP routers and verify service port availability
- Identify all BACnet/SC to MS/TP routers and verify diagnostic access method
- Connect diagnostic tools to service ports
- Capture and analyze traffic using standard tools (Wireshark, VTS)
- Verify diagnostic access does not interfere with operations
- Acceptance Criteria: Service ports available on all routers; packet capture successful; traffic analyzable with standard tools; no operational interference
Third-Party Tool Compatibility
- Objective: Verify standard diagnostic tools can analyze network traffic
- Method:
- Capture packets on local BACnet/IP segments using Wireshark
- Analyze BACnet protocol using VTS or YABE
- Verify MS/TP traffic is accessible and analyzable
- Confirm traffic is unencrypted on local segments
- Document all diagnostic capture points on network diagrams
- Acceptance Criteria: Standard tools successfully capture and analyze traffic; all diagnostic points documented
Required Deliverables
- Certificate Authority setup documentation with owner as operator
- Root certificate and private key secured by owner
- Certificate issuance and revocation procedures
- Trust chain verification for all device certificates
- BACnet/SC hub and router configuration documentation
- Network diagrams showing diagnostic service ports and capture points
- Third-party certificate provisioning test results
- Diagnostic access validation test results
- Certificate management tool documentation and licenses
Acceptance Criteria
BACnet/SC security and PKI ownership requirements are met when the owner controls the Certificate Authority, can provision certificates independently using standard tools, has unrestricted revocation authority, controls all BACnet/SC infrastructure, and diagnostic access is available on all routed networks for third-party troubleshooting tools.
5.0 Integrated Systems Validation
5.1 Overview
Modern building automation systems do not operate in isolation. To achieve operational efficiency, occupant comfort, and energy performance goals, BAS networks routinely integrate with lighting control systems, utility metering infrastructure, and other building subsystems. This section establishes validation requirements for systems that exchange data with the BAS.
5.1.1 Scope of Integration Validation
This section addresses validation of systems that interface and exchange data with the building automation system, specifically:
- Lighting Control Systems: Where lighting integrates with BAS for scheduling, occupancy coordination, or daylight harvesting
- Utility Metering: Where electric, water, thermal, or gas metering data feeds into the BAS for monitoring, trending, or demand response
- Data Accessibility: Performance requirements for data access, visualization, and system responsiveness
What This Section Does
- Validates that integration points between systems are correctly configured
- Verifies that data exchange between systems is functional and accurate
- Ensures integrated systems meet performance requirements for data access
- Confirms metadata for integrated points follows the same standards as native BAS points
What This Section Does Not Do
- Prescribe which systems must be integrated (determined by project requirements)
- Define the operational requirements of the integrated systems themselves (e.g., lighting levels, meter accuracy)
- Replace the commissioning requirements for the integrated systems in their own specifications
5.1.2 Integration Validation Approach
Where integrated systems are present and exchange data with the BAS, validation shall verify:
- Data Connectivity: Integration points are configured and communication is functional
- Data Accuracy: Values exchanged between systems are accurate and update at required intervals
- Metadata Compliance: Integrated points include proper naming, tagging, and semantic modeling per metadata validation levels
- Performance Requirements: Data access and visualization meet specified performance criteria
5.1.3 Applicability
Integration validation requirements apply when:
- The project specifications require integration between BAS and other building systems
- Data from non-BAS systems is made available through the BAS supervisory interface
- Coordinated control sequences involve multiple systems (e.g., HVAC and lighting)
- Utility metering data is required to be trended or visualized in the BAS
Where no integration is specified or required, this section does not apply.
5.2 Lighting Controls Integration
Overview
Lighting control systems often integrate with BAS for coordinated scheduling, occupancy-based control, and energy management. This section establishes validation requirements for lighting control system integration points where lighting data is exchanged with or made available through the BAS.
Requirements
5.2.1 Integration Scope Documentation
Where lighting controls integrate with the BAS, the following shall be documented:
- List of all lighting control points exposed to the BAS (zones, loads, occupancy sensors, daylight sensors, schedules)
- Communication protocol used for integration (BACnet, Modbus, proprietary API, etc.)
- Integration architecture diagram showing connection between lighting and BAS networks
- Intended use of lighting data in BAS (monitoring only, coordinated control, scheduling, energy analytics)
5.2.2 Data Connectivity Validation
Communication between lighting and BAS systems shall be verified:
- All specified lighting control points are visible and accessible from BAS supervisory station
- Data updates at specified refresh rate (typically 1-5 seconds for status, 30-60 seconds for energy data)
- Communication failures or errors are logged and reported
- Network addressing for lighting controllers follows project IP addressing scheme
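The refresh-rate criterion can be checked directly from observed update timestamps at the supervisory station. A sketch assuming a 5-second status specification; the sample timestamps are fabricated:

```python
# Sketch: verify an integrated point's refresh rate from observed update
# timestamps (in seconds). The 5-second criterion follows the requirement
# above for status points; the sample data is fabricated.

def max_update_gap(timestamps):
    """Largest interval between consecutive observed updates."""
    return max(b - a for a, b in zip(timestamps, timestamps[1:]))

status_updates = [0.0, 1.2, 2.4, 3.1, 4.9, 6.0, 7.3]   # observed update times
gap = max_update_gap(status_updates)
print(f"max gap {gap:.1f} s -> {'PASS' if gap <= 5.0 else 'FAIL'}")
```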
5.2.3 Lighting Point Accuracy
Lighting control data accuracy shall be validated:
- Lighting zone status (on/off, dimming level) matches actual field conditions
- Occupancy sensor states accurately reflect space occupancy
- Daylight sensor values correlate with measured light levels
- Energy/power consumption data is within ±5% of independent measurement
- Scheduled events execute at correct times and affect correct zones
5.2.4 Coordinated Control Sequences
Where BAS and lighting systems execute coordinated control, validation shall verify:
- Occupancy-based HVAC setback triggers when lighting confirms space unoccupied
- Daylight harvesting adjustments coordinate with perimeter HVAC zoning
- After-hours HVAC requests properly enable associated lighting zones
- Coordinated shutdowns execute in proper sequence (lighting confirmation before HVAC setback)
5.2.5 Metadata Compliance
Lighting control points integrated into the BAS shall comply with metadata validation requirements:
- Level 1 (Naming): Lighting point names follow the same naming convention as native BAS points
- Level 2 (Tagging): Lighting equipment and points include appropriate tags (equip: Lighting, point: Switch, Sensor, etc.)
- Level 3 (Semantic): Lighting zones are properly related to served spaces in the semantic model
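Level 1 compliance can be spot-checked with a simple pattern match over the exported point list. A sketch assuming a hypothetical `Building_Floor_LTG_Point` convention; substitute the project's actual naming standard:

```python
import re

# Sketch: Level 1 naming check for integrated lighting points. The pattern
# encodes a hypothetical Building_Floor_LTG_Point convention, used here only
# for illustration; replace it with the project's naming standard.

NAME_PATTERN = re.compile(r"^[A-Z0-9]+_[A-Z0-9]+_LTG_[A-Za-z0-9]+$")

def check_names(points):
    """Return point names that violate the naming convention."""
    return [p for p in points if not NAME_PATTERN.match(p)]

points = ["B1_FL02_LTG_ZoneStatus", "B1_FL02_LTG_OccSensor", "lighting zone 3"]
print(check_names(points))   # -> ['lighting zone 3']
```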
Examples
Example 1: Occupancy-Based HVAC Integration
Scenario: Office spaces with lighting occupancy sensors that trigger HVAC setback.
Integration Configuration:
- Lighting occupancy sensors communicate via BACnet to lighting panel
- Lighting panel exposes aggregated zone occupancy to BAS via BACnet/IP
- BAS logic monitors occupancy status and implements 30-minute setback delay
- If occupancy=false for 30 minutes, HVAC setpoint setback initiated
Validation:
- Simulate occupancy (motion in space) → verify sensor status changes to occupied
- Verify BAS receives occupied status within 5 seconds
- Simulate vacancy → verify 30-minute timer starts correctly
- Confirm HVAC setback occurs only after timer expires
- Test re-occupancy cancels setback timer
Result: Coordinated sequence reduces energy use while maintaining comfort; proper integration validated.
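The 30-minute setback logic under test in this example can be sketched as a small state machine (time in minutes is simulated; the names are illustrative, not a prescribed implementation):

```python
# Sketch of the occupancy-based setback logic: vacancy starts a timer,
# re-occupancy cancels it, and setback activates only after the delay expires.

SETBACK_DELAY_MIN = 30

class SetbackTimer:
    def __init__(self):
        self.vacant_since = None

    def update(self, now_min, occupied):
        """Return True when HVAC setback should be active."""
        if occupied:
            self.vacant_since = None        # re-occupancy cancels the timer
            return False
        if self.vacant_since is None:
            self.vacant_since = now_min     # vacancy starts the timer
        return (now_min - self.vacant_since) >= SETBACK_DELAY_MIN

t = SetbackTimer()
print(t.update(0, occupied=False))    # timer starts -> False
print(t.update(29, occupied=False))   # 29 min vacant -> False
print(t.update(30, occupied=False))   # 30 min vacant -> True
print(t.update(31, occupied=True))    # re-occupied -> False, timer cleared
```

The validation steps in the example map one-to-one onto these transitions, which makes the sequence easy to exercise with simulated inputs before field testing.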
Example 2: Lighting Load Monitoring
Scenario: BAS trending lighting energy consumption by floor.
Implementation:
- Lighting panels provide kW data via Modbus TCP
- BAS polls Modbus registers every 60 seconds
- Lighting loads tagged with floor designation
- BAS trends and displays energy by floor
Validation:
1. Use temporary power meter to measure lighting panel output
2. Compare BAS-displayed kW value to measured value (should be within ±5%)
3. Verify trending intervals are 60 seconds as specified
4. Confirm floor designation tagging is correct
5. Validate historical trend data accumulates properly
Result: Accurate lighting energy data enables load profiling and anomaly detection.
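The ±5% accuracy criterion in this example reduces to a percent-error comparison against the reference meter. A sketch with fabricated readings:

```python
# Sketch: apply the +/-5% acceptance criterion by comparing the BAS-displayed
# kW value against a temporary reference meter. Readings are fabricated.

def percent_error(bas_value, reference):
    """Percent deviation of the BAS value from the reference measurement."""
    return abs(bas_value - reference) / reference * 100.0

bas_kw, meter_kw = 41.2, 40.0
err = percent_error(bas_kw, meter_kw)
print(f"error {err:.1f}% -> {'PASS' if err <= 5.0 else 'FAIL'}")
```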
Example 3: Daylight Harvesting Coordination
Scenario: Perimeter zones with daylight harvesting affecting HVAC load.
Integration:
- Daylight sensors in lighting system report lux levels to BAS
- BAS logic adjusts perimeter cooling anticipating reduced lighting heat gain
- Coordination prevents overcooling when lights dim
Validation:
- Verify daylight sensor values accessible in BAS
- Compare BAS lux reading to calibrated light meter (within ±10%)
- Simulate high daylight → confirm lights dim → verify HVAC cooling reduces
- Monitor space temperature to confirm comfort maintained
- Test cloudy condition → lights increase → HVAC cooling responds
Result: Coordinated control optimizes comfort and energy by accounting for dynamic lighting loads.
Verification
Integration Documentation Review
- Objective: Verify lighting integration is completely documented
- Method: Review integration documentation for:
- Complete list of integrated lighting points
- Communication protocol and architecture diagram
- Network addressing compliance with project standards
- Intended use cases for lighting data in BAS
- Acceptance Criteria: All integration points documented; architecture clearly defined
Communication Validation
- Objective: Verify data connectivity between lighting and BAS systems
- Method:
- Verify all lighting points are accessible from BAS supervisory station
- Monitor data refresh rates (status, energy, sensors)
- Simulate communication failure and verify error reporting
- Check network addressing follows project scheme
- Acceptance Criteria: 100% of points accessible; refresh rates meet specification; errors are logged
Data Accuracy Testing
- Objective: Verify lighting data values are accurate
- Method:
- Compare lighting status in BAS to field observations
- Verify occupancy sensor states with actual occupancy
- Measure light levels and compare to BAS sensor values (±10%)
- Validate energy data against independent metering (±5%)
- Test scheduled events execute at correct times
- Acceptance Criteria: All data values within specified accuracy; schedules execute correctly
Coordinated Sequence Testing
- Objective: Verify coordinated control between lighting and HVAC systems
- Method:
- Test occupancy-based setback sequences (simulate occupied/vacant)
- Verify daylight coordination sequences
- Test after-hours lighting enable with HVAC request
- Validate shutdown sequences execute in correct order
- Acceptance Criteria: All coordinated sequences function as designed; proper delays and interlocks verified
Metadata Compliance Check
- Objective: Verify lighting points comply with metadata standards
- Method:
- Validate naming convention compliance (Level 1)
- Check tagging accuracy and completeness (Level 2)
- Verify semantic relationships in model (Level 3)
- Acceptance Criteria: Lighting points meet same metadata standards as native BAS points
Required Deliverables
- Lighting integration point list
- Communication protocol and architecture documentation
- Coordinated control sequence descriptions
- Data accuracy validation test results
- Metadata compliance verification for lighting points
- As-built integration diagrams
Acceptance Criteria
Lighting controls integration validation is complete when all integration points are documented and accessible, data accuracy is verified, coordinated control sequences function correctly, and lighting points comply with metadata validation requirements.
5.3 Utility Metering Integration
Overview
Utility metering data (electric, water, thermal energy, gas) is increasingly integrated with building automation systems for energy monitoring, demand response, and operational optimization. This section establishes validation requirements for utility metering integration where meter data is made available through the BAS.
Requirements
5.3.1 Metering System Documentation
Where utility meters integrate with the BAS, the following shall be documented:
- Complete inventory of all meters integrated to BAS (electric, water, thermal, gas)
- Meter communication protocols (Modbus, BACnet, M-Bus, pulse, etc.)
- Network architecture showing meter-to-BAS connectivity
- Data points exposed (kW, kWh, flow rate, volume, temperature, etc.)
- Intended use of metering data (trending, analytics, demand limiting, submetering)
- Data retention and trending intervals
5.3.2 Meter Communication Validation
Communication between meters and BAS shall be verified:
- All specified meters are accessible from BAS supervisory station
- Meter data updates at specified intervals (typically 5-60 seconds for real-time, 15-minute intervals for energy)
- Communication protocol configuration is correct (Modbus register mapping, BACnet PICS compliance)
- Network addressing follows project IP addressing scheme
- Communication failures are detected, logged, and alarmed
5.3.3 Metering Data Accuracy
Utility metering data accuracy shall be validated:
- Real-time values (kW, GPM, Btu/h) are within ±2% of independent measurement
- Accumulated values (kWh, gallons, Btu) track correctly over time (±1% over 24-hour period)
- Pulse counters increment correctly and match totalizer displays
- Calculated values ($/kWh, efficiency metrics) use correct formulas and constants
- Demand values correctly track peak values over demand interval (typically 15 minutes)
5.3.4 Data Trending and Retention
Metering data trending shall be validated:
- Trend logs are configured for all required meter data points
- Trending intervals match specification (e.g., 15-minute intervals for energy, 1-minute for real-time)
- Trend data retention meets requirements (minimum 13 months recommended)
- Trend data is exportable in standard formats (CSV, Excel, BACnet Trend Log)
- Historical data is retrievable and complete (no gaps in critical data)
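The "no gaps" check above can be automated against an exported trend log. A minimal sketch, assuming timestamps have already been parsed from a CSV export; `find_trend_gaps` is a hypothetical helper, not a vendor API:

```python
from datetime import datetime, timedelta

def find_trend_gaps(timestamps, interval_minutes=15):
    """Return (previous, next) timestamp pairs where the spacing between
    consecutive trend samples exceeds the configured interval."""
    gaps = []
    step = timedelta(minutes=interval_minutes)
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > step:
            gaps.append((prev, curr))
    return gaps

# Hypothetical 15-minute kWh trend export with one missing sample
samples = [datetime(2024, 1, 1, 0, 0) + timedelta(minutes=15 * i)
           for i in range(6)]
del samples[3]  # simulate a lost interval at 00:45

gaps = find_trend_gaps(samples)
print(gaps)  # one 30-minute gap between 00:30 and 01:00
```

Any non-empty result flags missing intervals that must be investigated before the trending validation is accepted.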
5.3.5 Demand Response and Alarming
Where metering supports demand response or alarming, validation shall verify:
- Demand limiting functions correctly at specified threshold
- Demand predictions are accurate and provide adequate warning time
- Load shedding sequences execute in correct priority order
- Alarms trigger at specified thresholds (peak demand, consumption limits)
- Alarm notifications reach appropriate personnel
5.3.6 Metadata Compliance
Utility meter points integrated into the BAS shall comply with metadata validation requirements:
- Level 1 (Naming): Meter point names follow project naming convention
- Level 2 (Tagging): Meters tagged with appropriate classifications (Elec-Meter, Water-Meter, point types: Power, Energy, Flow, Volume)
- Level 3 (Semantic): Meters properly related to equipment, systems, and spaces they serve in semantic model
Examples
Example 1: Electric Submetering Integration
Scenario: Building with electric submeters for tenant billing integrated to BAS.
Implementation:
- Submeters communicate via Modbus TCP
- Each meter reports: kW (real-time), kWh (accumulated), Volts, Amps, PF
- BAS trends kWh at 15-minute intervals for utility bill verification
- Tenant dashboards display real-time and historical consumption
Validation:
1. Verify all meter data points accessible in BAS
2. Compare BAS kW reading to portable power analyzer (within ±2%)
3. Record kWh value, wait 24 hours, verify accumulation matches utility meter (within ±1%)
4. Verify 15-minute trend intervals are consistent
5. Export trend data and verify no missing intervals
6. Validate tenant dashboard displays correct meter data
Result: Accurate submetering enables reliable tenant billing and energy cost allocation.
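The tolerance comparisons in validation steps 2 and 3 reduce to a percentage check against a reference instrument. A minimal sketch with hypothetical readings; the ±2% and ±1% limits come from section 5.3.3:

```python
def within_tolerance(bas_value, reference_value, pct):
    """True if the BAS reading is within ±pct% of the reference measurement."""
    if reference_value == 0:
        return bas_value == 0
    return abs(bas_value - reference_value) / abs(reference_value) * 100.0 <= pct

# Real-time kW vs. portable power analyzer (±2% per 5.3.3) — hypothetical readings
print(within_tolerance(48.7, 48.0, pct=2.0))  # True (≈1.5% deviation)

# 24-hour kWh accumulation vs. utility meter (±1% per 5.3.3)
bas_kwh_delta = 1012.0      # hypothetical BAS totalizer change over 24 h
utility_kwh_delta = 1005.0  # hypothetical utility meter change over 24 h
print(within_tolerance(bas_kwh_delta, utility_kwh_delta, pct=1.0))  # True (≈0.7%)
```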
Example 2: Thermal Energy Metering
Scenario: Campus with distributed thermal energy meters for chilled water and heating.
Integration:
- BTU meters provide: Flow (GPM), Delta-T (°F), Energy Rate (Btu/h), Total Energy (MMBtu)
- Meters communicate via BACnet/IP
- Central plant BAS trends all buildings for load profiling
Validation:
1. Verify all four data points from each meter accessible
2. Compare flow reading to ultrasonic flow meter (±2%)
3. Validate Delta-T matches independent temperature measurement (±0.5°F)
4. Confirm energy calculation: GPM × Delta-T × 500 = Btu/h (within ±2%)
5. Verify MMBtu accumulation over 24 hours matches meter local display (±1%)
6. Test trend data export and analysis for load profiling
Result: Thermal metering enables plant optimization and building energy benchmarking.
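The energy-calculation check in validation step 4 can be reproduced directly from the flow and delta-T readings. A sketch using the water-side constant from the example; the meter readings shown are hypothetical:

```python
def expected_btu_per_hour(gpm, delta_t_f):
    """Water-side energy rate: Btu/h = GPM × ΔT(°F) × 500
    (500 ≈ 8.33 lb/gal × 60 min/h × 1 Btu/lb·°F for water)."""
    return gpm * delta_t_f * 500.0

def energy_calc_ok(meter_btu_h, gpm, delta_t_f, pct=2.0):
    """Confirm the BTU meter's reported energy rate agrees with the
    calculated value within ±2% (step 4 of the example)."""
    expected = expected_btu_per_hour(gpm, delta_t_f)
    return abs(meter_btu_h - expected) / expected * 100.0 <= pct

# Hypothetical chilled-water BTU meter: 240 GPM at 12.1 °F ΔT
print(expected_btu_per_hour(240, 12.1))      # 1,452,000 Btu/h
print(energy_calc_ok(1_460_000, 240, 12.1))  # True (≈0.55% deviation)
```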
Example 3: Demand Response Integration
Scenario: Peak demand limiting using electric meter integration.
Control Sequence:
- Main electric meter reports 15-minute rolling demand
- At 90% of demand limit, BAS begins shedding non-critical loads
- At 95%, additional loads shed in priority order
- System returns to normal when demand drops below 85%
Validation:
1. Monitor normal building demand via BAS
2. Simulate high load condition approaching limit
3. Verify BAS detects 90% threshold and initiates first load shed
4. Confirm loads shed in correct priority sequence
5. Verify demand stays below limit (alarm if exceeded)
6. Test return to normal when demand reduces
7. Validate alarm notifications sent to facilities staff
Result: Automated demand response prevents costly demand charges; validated sequence functions correctly.
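The shed/restore thresholds in the control sequence above form a simple hysteresis rule, which is worth modeling before field testing. A sketch of that logic; the 1,000 kW demand limit is hypothetical, while the 90/95/85% thresholds come from the example:

```python
def demand_response_action(demand_kw, limit_kw, shedding):
    """Hysteresis logic from the example sequence: begin shedding at 90%
    of the demand limit, shed additional priority loads at 95%, and
    return to normal only after demand falls below 85%."""
    pct = demand_kw / limit_kw * 100.0
    if pct >= 95.0:
        return "shed_priority_loads"
    if pct >= 90.0:
        return "shed_noncritical_loads"
    if shedding and pct >= 85.0:
        return "hold"  # stay shed until demand drops below 85%
    return "normal"

limit = 1000.0  # hypothetical kW demand limit
print(demand_response_action(920, limit, shedding=False))  # shed_noncritical_loads
print(demand_response_action(960, limit, shedding=True))   # shed_priority_loads
print(demand_response_action(880, limit, shedding=True))   # hold
print(demand_response_action(840, limit, shedding=True))   # normal
```

The "hold" band between 85% and 90% prevents rapid cycling of shed loads as demand hovers near the threshold.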
Verification
Metering System Documentation Review
- Objective: Verify metering integration is comprehensively documented
- Method: Review documentation for:
- Complete meter inventory with specifications
- Communication protocols and network architecture
- Data point list with trending requirements
- Intended applications for meter data
- Acceptance Criteria: All meters documented; integration architecture clearly defined
Communication Validation
- Objective: Verify meter-to-BAS communication is functional
- Method:
- Verify all meters accessible from BAS supervisory station
- Monitor data refresh rates for all point types
- Validate protocol configuration (Modbus registers, BACnet objects)
- Simulate communication failure and verify error detection/alarming
- Acceptance Criteria: 100% of meters communicating; refresh rates meet specification; errors detected and alarmed
Data Accuracy Testing
- Objective: Verify metering data values are accurate
- Method:
- Use calibrated reference instruments to measure actual values
- Compare BAS real-time values to reference (within ±2%)
- Track accumulated values over 24 hours (within ±1%)
- Verify pulse counter accuracy
- Validate calculated values use correct formulas
- Acceptance Criteria: All data points within specified accuracy tolerances
Trending Validation
- Objective: Verify trend logging configuration and data quality
- Method:
- Confirm trend intervals match specification
- Verify trend retention meets requirements (13+ months)
- Export trend data and check for gaps or missing intervals
- Validate historical data retrieval
- Acceptance Criteria: Trending configured correctly; data complete with no gaps; exportable in standard formats
Demand Response Testing
- Objective: Verify demand limiting and load shedding functions correctly
- Method:
- Monitor demand in normal operation
- Simulate high demand approaching limits
- Verify threshold detection and load shedding execution
- Confirm priority sequence is correct
- Test alarm notification delivery
- Validate return to normal operation
- Acceptance Criteria: Demand response functions as designed; loads shed in correct sequence; demand stays below limit
Metadata Compliance Check
- Objective: Verify meter points comply with metadata standards
- Method:
- Validate naming convention compliance (Level 1)
- Check tagging accuracy and completeness (Level 2)
- Verify semantic relationships to equipment and spaces (Level 3)
- Acceptance Criteria: Meter points meet same metadata standards as native BAS points
Required Deliverables
- Complete meter inventory with specifications
- Communication protocol and network architecture documentation
- Meter data point list with trending configuration
- Data accuracy validation test results
- Trend log configuration and retention verification
- Demand response sequence testing results (if applicable)
- Metadata compliance verification for meter points
- As-built metering diagrams and register maps
Acceptance Criteria
Utility metering integration validation is complete when all meters are documented and accessible, data accuracy is verified within tolerances, trending is configured correctly, demand response functions are validated (where applicable), and meter points comply with metadata validation requirements.
5.4 Data Accessibility and Performance
Overview
The ultimate value of validated digital infrastructure lies in its usability for building operations and advanced applications. This section establishes performance requirements for data access, visualization responsiveness, and system usability to ensure that correctly structured data is also performant and accessible.
Requirements
5.4.1 Graphical User Interface Performance
The BAS supervisory system graphical interface shall meet the following performance criteria:
- Graphic Display Time: Graphics containing up to 20 dynamic data points shall display in ≤5 seconds
- Graphic Update Rate: Dynamic values on displayed graphics shall update in ≤8 seconds after field value change
- Navigation Responsiveness: Navigation between graphics, menus, or system views shall complete in ≤3 seconds
- Large Graphic Performance: Complex graphics (50+ dynamic points) shall display in ≤10 seconds
These criteria shall be measured on the minimum specified workstation hardware and under normal network loading conditions.
5.4.2 Alarm Annunciation Performance
The alarm and event notification system shall meet the following criteria:
- Alarm Display Latency: Critical alarms shall appear in the supervisory interface within ≤15 seconds of the triggering condition
- Alarm Notification Delivery: Email/SMS alarm notifications shall be delivered within ≤60 seconds (subject to external email system performance)
- Alarm Acknowledgment: Operator alarm acknowledgment shall register in the system within ≤3 seconds
- Alarm History Retrieval: Alarm history queries shall return results within ≤10 seconds for typical date ranges (30 days)
5.4.3 Trend Data Retrieval
Historical trend data access shall meet the following performance requirements:
- Real-time Trend Display: Current trend data (last 24 hours) shall display in ≤5 seconds
- Historical Trend Queries: Historical trend retrieval (30-day range) shall complete in ≤15 seconds
- Trend Data Export: Export of trend data to CSV/Excel shall complete in ≤30 seconds for up to 10 points over 30 days
- Multi-point Trending: Displaying overlay trends of up to 10 points shall complete in ≤10 seconds
5.4.4 Data Query and Reporting
System data queries and report generation shall meet performance requirements:
- Point Search: Searching for points by name or tag shall return results in ≤3 seconds for typical queries
- Equipment Lists: Displaying equipment lists filtered by type or location shall complete in ≤5 seconds
- Custom Reports: Standard operational reports (runtime summaries, alarm summaries, setpoint schedules) shall generate in ≤30 seconds
- Energy Dashboards: Energy monitoring dashboards with multiple meters and graphics shall load in ≤10 seconds
5.4.5 Mobile and Remote Access
Where mobile or remote access is provided, performance shall be validated:
- Mobile App Performance: Mobile applications shall meet the same graphic display and navigation performance as workstation interfaces (±2 seconds acceptable variance)
- VPN Access Performance: Performance over VPN shall degrade by no more than 20% compared to local network access
- Web Browser Performance: Web-based interfaces shall meet all performance criteria when accessed via supported browsers
5.4.6 Concurrent User Performance
System performance shall be validated under concurrent user load:
- Multiple Users: System shall maintain performance standards with up to the maximum specified concurrent users
- Performance Degradation Limit: With maximum concurrent users, performance degradation shall not exceed 25% of single-user benchmarks
- Load Testing: System shall be tested at 125% of maximum concurrent users to verify graceful degradation
5.4.7 API and Integration Performance
Where programmatic data access is provided via APIs:
- API Response Time: REST API calls shall return data within ≤2 seconds for typical point queries
- Bulk Data Retrieval: API requests for bulk data (100+ points) shall complete within ≤10 seconds
- API Availability: API endpoints shall maintain ≥99.5% availability during normal operations
- Rate Limiting: API rate limits shall be documented and sufficient for intended applications (minimum 60 requests/minute recommended)
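Response-time criteria like these can be verified by timing the call itself. A minimal sketch with a simulated client; `fake_point_query` is a stand-in for a real API call, not an actual library function:

```python
import time

def timed_call(fn, *args, limit_s=2.0):
    """Invoke a data-access call and report (result, elapsed_seconds,
    within_limit). Per 5.4.7, typical point queries should return in
    ≤2 s and bulk requests (100+ points) in ≤10 s."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed, elapsed <= limit_s

# Hypothetical stand-in for a real REST client call
def fake_point_query(point_name):
    time.sleep(0.01)  # simulated network round trip
    return {"point": point_name, "value": 72.4}

result, elapsed, ok = timed_call(fake_point_query, "AHU01_SAT", limit_s=2.0)
print(ok)  # True for the simulated 10 ms round trip
```

In practice the same wrapper would be run repeatedly against the real API endpoint and the elapsed times aggregated into the performance test report.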
Examples
Example 1: Graphics Performance Validation
Scenario: Testing GUI performance for HVAC graphics with 20 dynamic points.
Test Procedure:
1. Clear browser cache and restart supervisory workstation
2. Start timer and navigate to complex AHU graphic (20 dynamic points: temperatures, status, setpoints)
3. Record time until graphic fully displays with all values populated
4. Change field setpoint and measure time until graphic updates
5. Navigate to different graphic and measure transition time
6. Repeat test 3 times and average results
Acceptance:
- Graphic display: ≤5 seconds (tested: 3.2 seconds average) ✓
- Value update: ≤8 seconds (tested: 5.1 seconds average) ✓
- Navigation: ≤3 seconds (tested: 1.8 seconds average) ✓
Result: GUI performance meets all specified criteria.
Example 2: Alarm Annunciation Testing
Scenario: Validating critical alarm notification performance.
Test Procedure:
1. Configure test point with critical alarm (high temperature)
2. Start timer and force point into alarm condition
3. Measure time until alarm appears in supervisory interface
4. Record time until email notification received
5. Acknowledge alarm and measure acknowledgment response time
6. Query alarm history for last 30 days and measure retrieval time
Results:
- Alarm display latency: 8 seconds (meets ≤15 second requirement) ✓
- Email notification: 22 seconds (meets ≤60 second requirement) ✓
- Acknowledgment response: <1 second (meets ≤3 second requirement) ✓
- History retrieval (30 days): 6 seconds (meets ≤10 second requirement) ✓
Result: Alarm system performance verified; operators receive timely notifications.
Example 3: Concurrent User Load Testing
Scenario: Validating system performance with multiple simultaneous users.
Test Setup:
- System specified for 10 concurrent users
- Baseline single-user performance established
- 12 users simultaneously access system (125% of maximum)
Test Activities:
- All users navigate to different graphics simultaneously
- Multiple users query trend data
- Several users acknowledge alarms
- Some users generate reports
Performance Results:
- Single-user graphic display: 3.5 seconds
- 12 concurrent users graphic display: 4.8 seconds (37% degradation - FAILS)
Corrective Action:
- Database query optimization implemented
- Graphics caching enabled
- Re-test shows 12-user performance: 4.2 seconds (20% degradation - PASSES)
Result: System maintains acceptable performance under peak load after optimization.
Verification
Graphics Performance Testing
- Objective: Verify graphical interface meets response time requirements
- Method:
- Test graphic display time for standard and complex graphics
- Measure dynamic value update rates
- Validate navigation responsiveness
- Test under minimum specified hardware configuration
- Acceptance Criteria: All graphics meet specified display and update time requirements
Alarm System Performance Validation
- Objective: Verify alarm annunciation meets latency requirements
- Method:
- Force alarm conditions and measure display latency
- Test notification delivery times (email, SMS)
- Measure alarm acknowledgment response
- Validate alarm history query performance
- Acceptance Criteria: All alarm functions meet specified time requirements
Trend Data Performance Testing
- Objective: Verify trend data retrieval meets performance criteria
- Method:
- Display real-time trends and measure load time
- Query historical data ranges and measure retrieval time
- Test trend data export performance
- Validate multi-point overlay trend performance
- Acceptance Criteria: Trend access and export meet all specified time requirements
Report and Query Performance
- Objective: Verify data query and reporting performance
- Method:
- Test point search response times
- Measure equipment list display times
- Generate standard reports and measure completion time
- Load energy dashboards and measure display time
- Acceptance Criteria: All queries and reports complete within specified times
Concurrent User Load Testing
- Objective: Verify system performance under multi-user load
- Method:
- Establish single-user performance baseline
- Test with maximum specified concurrent users
- Test at 125% of maximum users
- Measure performance degradation percentage
- Verify graceful degradation (no crashes)
- Acceptance Criteria: Performance degradation ≤25% at maximum users; system remains stable at 125% load
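The degradation percentage used in this acceptance criterion is straightforward arithmetic; a sketch using the figures from Example 3 in section 5.4:

```python
def degradation_pct(baseline_s, loaded_s):
    """Performance degradation of the loaded measurement relative to the
    single-user baseline, as a percentage."""
    return (loaded_s - baseline_s) / baseline_s * 100.0

# Figures from Example 3 in section 5.4:
before = degradation_pct(3.5, 4.8)  # ≈37% — exceeds the 25% limit, fails
after = degradation_pct(3.5, 4.2)   # 20% — passes after optimization
print(round(before), round(after))  # 37 20
```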
Mobile and Remote Access Validation
- Objective: Verify mobile and VPN access meet performance requirements
- Method:
- Test mobile app performance against workstation baseline
- Measure VPN access performance degradation
- Test web browser interface performance
- Validate across supported devices and browsers
- Acceptance Criteria: Mobile/remote performance within acceptable variance of local access
API Performance Testing
- Objective: Verify programmatic data access meets performance requirements
- Method:
- Test API response times for typical queries
- Measure bulk data retrieval performance
- Monitor API availability over test period
- Validate rate limiting configuration
- Acceptance Criteria: API calls meet response time requirements; availability ≥99.5%; rate limits documented and adequate
Required Deliverables
- GUI performance test results with screenshots and timing data
- Alarm annunciation performance validation report
- Trend data retrieval performance test results
- Concurrent user load testing report with baseline and multi-user results
- Mobile/remote access performance validation
- API performance and availability test results
- Performance optimization documentation (if corrective actions required)
Acceptance Criteria
Data accessibility and performance validation is complete when all graphical interface, alarm, trend, query, and API performance requirements are met under both single-user and maximum concurrent user loads, with mobile and remote access validated where applicable.
4.0 Metadata Validation Framework
Overview
This section defines the validation and verification requirements for digital metadata representation of building automation systems. The validation process is structured in three progressive levels, each building upon the previous to create increasingly sophisticated machine-readable representations.
Three-Level Validation Structure
Level 1: Naming Convention Validation
Validates that all device and point names conform to the project-specific naming standard through automated pattern matching and syntax verification.
Level 2: Tagging and Labeling Validation
Validates that metadata tags and labels are consistently applied and semantically correct through systematic verification of classification and descriptive attributes.
Level 3: Semantic Model Validation
Validates that system relationships and hierarchies are properly modeled and machine-readable through graph analysis and semantic verification.
Validation Independence
Each level of metadata validation is designed to work with any project-specific:
- Naming convention or standard (e.g., project appendices, industry standards)
- Tagging taxonomy or vocabulary
- Semantic modeling approach or ontology
- Database structure or format
The validation procedures verify conformance to the specified approach rather than prescribing a particular approach.
Progressive Requirements
Each level has distinct validation requirements:
- 100% Coverage: All elements at each level must pass syntax and format validation
- Sampling-Based Correctness: Strategic sampling ensures semantic accuracy and proper implementation
- Cross-Level Consistency: Higher levels must be consistent with validated lower levels
4.1 Level 1: Naming Convention Validation
Objective
To enforce a strict, standardized naming convention for all assets and points.
Requirements
All BAS controllers, equipment, and points (software and hardware) must be named according to the project-specific naming convention outlined in the Project Naming Standard (e.g., Appendix A).
Validation and Acceptance Criteria
Validation Requirements (100% Coverage)
All names must programmatically validate against the regular expression (regex) defined in the project naming standard. This validation must achieve 100% coverage of all BAS controllers, equipment, and points. Any name failing the validation script must be corrected in the BAS database before this level is considered complete.
Correctness Verification (Sampling-Based)
To verify that validated names are not only syntactically correct but also accurate, the following sampling requirements must be met:
- Controller Models: At least one controller of each unique model type must be 100% checked for naming correctness on both the device level metadata and the point level metadata.
- Equipment Functions: At least one piece of equipment representing each functional type (e.g., AHU, VAV, chiller, boiler) must be 100% checked for naming correctness at the device level and the point level
Device Level Correctness Checks:
- Device Name: Each token from the project naming standard definition (e.g., Appendix A) shall match the expected value. Examples:
  - If building identifier is a token: `B01` for Building 1, `B02` for Building 2
  - If floor identifier is a token: `F01` for Floor 1, `F02` for Floor 2
  - If equipment type is a token: `AHU` for Air Handling Unit, `VAV` for Variable Air Volume box
  - If equipment ID is a token: `AHU-01` for Air Handling Unit #1, using the same numbering scheme shown on the drawings for that specific equipment, or an owner-provided equipment naming scheme
  - If sequence number is a token: `001`, `002`, `003` for sequential numbering
- Device Location: Physical location matches the naming convention location tokens
- Device Function: Equipment function identifier accurately represents the actual equipment type and purpose
Point Level Correctness Checks:
- Point Name: Each token in the point name shall correspond to the correct naming convention definition. Examples:
  - Equipment Reference: Point prefix matches parent equipment name (e.g., `AHU01_SAT` for supply air temperature from AHU01)
  - Measurement Type: Point suffix accurately describes the measurement:
    - `_SAT` for Supply Air Temperature
    - `_RAT` for Return Air Temperature
    - `_SF_CMD` for Supply Fan Command
    - `_DAM_POS` for Damper Position
    - `_SP` for Setpoint values
    - `_ALM` for Alarm points
- Data Type Verification: Point data types match the expected type for the measurement:
  - Temperature points: `Real` or `Float` data type
  - Status/Command points: `Boolean` or `Binary` data type
  - Position feedback: `Real` or `Analog` data type (0-100%)
  - Alarm points: `Boolean` or `Binary` data type
- Units of Measure: Engineering units match the point type and local standards:
  - Temperature: `°F`, `°C`, or `K` as appropriate
  - Pressure: `psi`, `Pa`, `kPa`, or `inWC` as appropriate
  - Flow: `CFM`, `GPM`, `L/s`, or `m³/h` as appropriate
  - Percentage: `%` for damper positions, valve positions
  - Power/Energy: `kW`, `kWh`, `BTU/h` as appropriate
This correctness verification ensures that names follow the intended naming logic and accurately represent the physical assets and their functions within the building automation system.
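The 100%-coverage regex validation described in this level can be scripted in a few lines. A sketch with a hypothetical pattern for names like `B01_F02_AHU-01_SAT`; the actual regular expression is defined by the project naming standard (e.g., Appendix A), not by this example:

```python
import re

# Hypothetical pattern: building, floor, equipment type and ID, point suffix.
# The real regex comes from the project naming standard (e.g., Appendix A).
NAME_PATTERN = re.compile(r"^B\d{2}_F\d{2}_(AHU|VAV|CHW|HHW)-\d{2}_[A-Z_]+$")

def validate_names(names):
    """Return the names that fail the naming-standard regex.
    Level 1 requires 100% coverage: an empty failure list passes."""
    return [n for n in names if not NAME_PATTERN.match(n)]

points = ["B01_F02_AHU-01_SAT", "B01_F02_AHU-01_RAT", "B1_F2_ahu1_sat"]
failures = validate_names(points)
print(failures)  # ['B1_F2_ahu1_sat'] — must be corrected in the BAS database
```

The same script, run against a full point export, produces the failure list that must be empty before Level 1 is considered complete.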
4.2 Level 2: Tagging and Labeling Validation
4.2.1 Validation Requirements
Tagging Implementation Validation
The tagging and labeling implementation must be validated against the project-specific tagging standard and data model requirements. This validation ensures that all BAS points and equipment have consistent, standardized metadata tags that enable automated querying and analysis.
Required Deliverable: Complete tagged data model export in the format specified by project requirements (e.g., Haystack JSON, Brick TTL, custom JSON schema).
Tag Completeness Validation
All BAS components must be validated for:
- Presence of required base tags as defined by the project tagging standard
- Appropriate equipment classification tags
- Point type and measurement classification tags
- Relationship tags linking points to parent equipment
- Location and system hierarchy tags
Tag Consistency Validation
Tagging implementation must demonstrate:
- Consistent application of the designated tagging vocabulary
- Proper tag combinations and relationships
- Absence of conflicting or contradictory tag assignments
- Compliance with the specified data model schema
4.2.2 Validation Criteria Examples
The following examples illustrate validation criteria that would be applied based on project-specific tagging standards:
Industry Standard Tagging Examples
- Project Haystack: Validation against Haystack tag definitions and entity relationships (e.g., `site`, `equip`, `point`, `sensor`, `cmd`, `sp`, `air`, `temp`)
- Brick Schema: Validation against Brick ontology classes and relationships
- Custom Taxonomy: Validation against project-specific tag dictionary and relationship rules
Tag Coverage Requirements Examples
- Base Tags: All entities have required foundational tags (site, equipment type, point classification)
- Measurement Tags: All sensor and setpoint entities include units, measurement type, and substance tags
- Relationship Tags: All points are properly linked to parent equipment through relationship tags
- Location Tags: All entities include appropriate spatial hierarchy tags
Schema Compliance Examples
- Required Tag Combinations: Verify mandatory tag patterns (e.g., temperature sensors must have `temp` + `sensor` + substance tag)
- Prohibited Tag Conflicts: Identify mutually exclusive tag combinations
- Enumeration Validation: Confirm enumerated tags use only allowed values from the specified vocabulary
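Vocabulary and required-combination checks like these are readily automated over an exported data model. A sketch with a hypothetical tag dictionary loosely modeled on Project Haystack terms; the real vocabulary and rules come from the project tagging standard:

```python
# Hypothetical approved vocabulary; the real dictionary is defined by
# the project tagging standard (e.g., Haystack, Brick, custom taxonomy).
APPROVED_TAGS = {"site", "equip", "point", "sensor", "cmd", "sp",
                 "air", "water", "temp", "flow", "discharge"}

def unknown_tags(entity_tags):
    """Tags not found in the approved vocabulary (must be empty)."""
    return entity_tags - APPROVED_TAGS

def is_valid_temp_sensor(entity_tags):
    """Required-combination rule: a temperature sensor needs
    temp + sensor + a substance tag (air or water in this sketch)."""
    return {"temp", "sensor"} <= entity_tags and bool(
        entity_tags & {"air", "water"})

sat = {"point", "sensor", "temp", "air", "discharge"}
bad = {"point", "sensor", "temperature"}  # "temperature" not in vocabulary

print(unknown_tags(sat))          # set()
print(unknown_tags(bad))          # {'temperature'}
print(is_valid_temp_sensor(sat))  # True
```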
4.2.3 Verification Procedures
Data Model Parsing Method
Schema Validation
- Objective: Verify submitted data model conforms to specified format and schema
- Method: Automated parsing and schema validation against project-specified data model format
- Acceptance Criteria: 100% successful parsing with zero schema validation errors
Tag Dictionary Compliance
- Objective: Confirm all tags are from approved vocabulary
- Method: Cross-reference all tags against the project-specified tag dictionary or ontology
- Acceptance Criteria: 100% of tags are recognized terms from the approved vocabulary
Tag Completeness Testing Method
Required Tag Coverage
- Objective: Verify all entities have mandatory tags per project requirements
- Method: Automated queries to identify entities missing required tag categories
- Acceptance Criteria: 100% of entities have all required base tags as defined by project tagging standard
Tag Relationship Validation
- Objective: Confirm proper parent-child and system relationships through tags
- Method: Graph analysis and relationship queries to verify tagged connections
- Acceptance Criteria: 100% of points are properly linked to parent equipment and system hierarchies
Semantic Correctness Verification Method (Sampling-Based)
Equipment Tag Accuracy
- Objective: Verify equipment tags accurately represent physical equipment types and functions
- Method: Sample-based verification of at least one instance of each unique equipment type
- Acceptance Criteria: 100% of sampled equipment have semantically correct tag combinations
Point Tag Accuracy
- Objective: Verify point tags accurately represent measurement types, units, and functions
- Method: Sample-based verification covering all point type categories (sensors, commands, setpoints, alarms)
- Acceptance Criteria: 100% of sampled points have semantically accurate tag combinations and proper units of measure
Query Validation Method
Functional Query Testing
- Objective: Verify tagged data supports intended analytical and operational queries
- Method: Execute a suite of predefined validation queries designed to test tag completeness and correctness
- Acceptance Criteria: All validation queries return expected results without errors
Data Accessibility Testing
- Objective: Confirm tagged data enables automated discovery and filtering
- Method: Test ability to programmatically find and filter entities by tag combinations
- Acceptance Criteria: All major equipment types and point categories can be successfully discovered and filtered using tag-based queries
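The discovery-and-filter test above amounts to tag-set containment queries. A minimal sketch over hypothetical tagged entities; a real test would run against the project's exported data model:

```python
# Hypothetical tagged entities; real data would come from the exported
# data model (Haystack JSON, Brick TTL, or custom schema).
ENTITIES = [
    {"id": "AHU01_SAT", "tags": {"point", "sensor", "temp", "air"}},
    {"id": "AHU01_SF_CMD", "tags": {"point", "cmd", "air"}},
    {"id": "VAV101_DAM_POS", "tags": {"point", "sensor", "damper"}},
]

def find_by_tags(entities, required):
    """Tag-based discovery: return ids of entities carrying all required tags."""
    return [e["id"] for e in entities if required <= e["tags"]]

print(find_by_tags(ENTITIES, {"sensor", "temp"}))  # ['AHU01_SAT']
print(find_by_tags(ENTITIES, {"point"}))           # all three point ids
```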
4.3 Level 3: Semantic Model Validation
4.3.1 Validation Requirements
Semantic Model Implementation Validation
The semantic model implementation must be validated against the project-specific modeling standard and relationship requirements. This validation ensures that all BAS components, spaces, and their functional and physical relationships are accurately represented in a machine-readable format that supports advanced analytical applications.
Required Deliverable: Complete semantic model export in the format specified by project requirements (e.g., Brick Schema TTL, custom RDF/OWL, JSON-LD, or other specified semantic format).
Relationship Completeness Validation
All system relationships must be validated for:
- Equipment-to-equipment connections and dependencies
- Equipment-to-space service relationships
- System hierarchy and containment relationships
- Energy and substance flow pathways
- Control and monitoring relationships
Model Integrity Validation
The semantic model must demonstrate:
- Syntactic validity according to the specified modeling language/schema
- Semantic consistency with the designated ontology or vocabulary
- Complete representation of all physical and logical system components
- Traceable pathways from spaces to building services
4.3.2 Validation Criteria Examples
The following examples illustrate validation criteria that would be applied based on project-specific semantic modeling standards:
Semantic Modeling Standard Examples
- Brick Schema: Validation against Brick ontology classes and relationships (e.g., `isFedBy`, `isLocatedIn`, `serves`, `hasPoint`)
- Custom Ontology: Validation against project-specific semantic model and relationship definitions
- Industry Standards: Validation against IFC, BOT (Building Topology Ontology), or other specified semantic standards
- Hybrid Approaches: Validation of models combining multiple ontologies or standards
Relationship Validation Examples
- Service Relationships: Equipment serving spaces through appropriate relationship chains
- Energy Pathways: Complete energy flow chains from building services to end-use equipment
- Spatial Relationships: Equipment located in appropriate spaces with proper containment hierarchies
- Control Relationships: Controllers linked to controlled equipment and monitored points
Query Capability Requirements Examples
Based on project-specific analytical needs, the model must support queries such as:
- Energy Chain Tracing: From any space, trace complete energy supply chain to building service entrance
- Equipment Discovery: Find all equipment of specific types serving designated areas
- Impact Analysis: Identify all spaces affected by equipment outages or system changes
- Performance Analysis: Group related equipment and points for analytics applications
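Energy chain tracing of the kind listed above reduces to graph traversal over feed relationships. A sketch using Brick-style `isFedBy` edges; the entities and edges shown are hypothetical, and a real model would be queried via SPARQL or a semantic toolkit:

```python
from collections import deque

# Hypothetical isFedBy edges (Brick-style): each entity lists what feeds
# it, terminating at a building service entrance.
IS_FED_BY = {
    "Room_101": ["VAV_101"],
    "VAV_101": ["AHU_01"],
    "AHU_01": ["Chilled_Water_Service", "Electrical_Service"],
}

def trace_energy_chain(space, edges, services):
    """Breadth-first traversal of isFedBy relationships from a space;
    returns the building services reached. Per 4.3.3, every conditioned
    space must reach at least one appropriate building service."""
    reached, queue, seen = set(), deque([space]), {space}
    while queue:
        node = queue.popleft()
        if node in services:
            reached.add(node)
            continue
        for upstream in edges.get(node, []):
            if upstream not in seen:
                seen.add(upstream)
                queue.append(upstream)
    return reached

services = {"Chilled_Water_Service", "Electrical_Service", "Gas_Service"}
print(sorted(trace_energy_chain("Room_101", IS_FED_BY, services)))
# ['Chilled_Water_Service', 'Electrical_Service']
```

An empty result for any sampled space indicates an incomplete service chain and fails the traceability check.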
4.3.3 Verification Procedures
Model Syntax and Schema Validation Method
Format Validation
- Objective: Verify submitted model conforms to specified semantic format and syntax
- Method: Automated parsing and validation using appropriate semantic web tools (e.g., RDF validators, OWL reasoners)
- Acceptance Criteria: 100% successful parsing with zero syntax errors or schema violations
Ontology Compliance
- Objective: Confirm model uses only approved classes and relationships from designated ontology
- Method: Automated validation against the project-specified ontology or vocabulary
- Acceptance Criteria: 100% of model elements are valid according to the specified semantic standard
Relationship Completeness Testing Method
System Connectivity Validation
- Objective: Verify all system components are properly connected through semantic relationships
- Method: Graph analysis to identify orphaned entities and incomplete relationship chains
- Acceptance Criteria: 100% of equipment and spaces are connected to appropriate system hierarchies
Service Chain Completeness
- Objective: Confirm complete service pathways from building services to served spaces
- Method: Automated traversal of service relationships to verify end-to-end connectivity
- Acceptance Criteria: 100% of conditioned spaces have traceable pathways to appropriate building services
Functional Query Validation Method
- Required Query Execution
  - Objective: Verify model supports all project-specified analytical query requirements
  - Method: Execute suite of validation queries designed to test semantic model completeness
  - Acceptance Criteria: All required query types return complete and accurate results
- Energy Chain Tracing Validation
  - Objective: Confirm ability to trace complete energy supply chains from spaces to building services
  - Method: Sample-based testing using automated queries from representative spaces to building service entrances
  - Acceptance Criteria: 100% of tested spaces return complete energy chain traces terminating at appropriate building services (e.g., electrical service, gas service, chilled water service)
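Energy chain tracing as described above amounts to walking "is fed by" relationships upstream from a space until a building service entrance is reached; an incomplete chain is a validation failure. A minimal sketch, where the `fed_by` mapping and all names are illustrative assumptions:

```python
# Each entity points to its upstream energy source; building services
# terminate the chain. All names are hypothetical.
fed_by = {
    "Zone-201": "VAV-2-1",
    "VAV-2-1": "AHU-1",
    "AHU-1": "Chilled_Water_Service",
}
BUILDING_SERVICES = {"Chilled_Water_Service", "Electrical_Service", "Gas_Service"}

def trace_energy_chain(space):
    """Return the upstream chain from a space, ending at a building service,
    or None if the chain breaks before reaching a service entrance."""
    chain, node = [space], space
    while node not in BUILDING_SERVICES:
        node = fed_by.get(node)
        if node is None:
            return None  # incomplete chain: fails the 100% criterion
        chain.append(node)
    return chain

print(trace_energy_chain("Zone-201"))
# → ['Zone-201', 'VAV-2-1', 'AHU-1', 'Chilled_Water_Service']
```

Against an actual RDF model the same traversal is typically expressed as a SPARQL property-path query rather than a Python loop, but the acceptance test is identical: every sampled space must yield a complete trace.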
Semantic Accuracy Verification Method (Sampling-Based)
- Relationship Accuracy Testing
  - Objective: Verify semantic relationships accurately represent physical and functional system relationships
  - Method: Sample-based verification of critical relationships against design documents and field conditions
  - Acceptance Criteria: 100% of sampled relationships are semantically accurate and physically correct
- Model-Reality Correspondence
  - Objective: Confirm semantic model accurately represents actual system implementation
  - Method: Cross-validation of model against as-built drawings, control sequences, and system commissioning data
  - Acceptance Criteria: No material discrepancies between semantic model and actual system implementation
Advanced Analytics Readiness Testing Method
- Application Query Performance
  - Objective: Verify model supports efficient execution of analytical applications
  - Method: Performance testing of complex queries representative of intended system applications
  - Acceptance Criteria: Query performance meets project-specified response time requirements
- Integration Capability Testing
  - Objective: Confirm model can integrate with specified analytics platforms and applications
  - Method: Test model loading and querying in designated analytics tools or platforms
  - Acceptance Criteria: Successful integration and operation with all specified target applications
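The response-time acceptance test above can be harnessed with a simple timing wrapper around each representative query. This sketch uses Python's `time.perf_counter`; the query body and the two-second budget are hypothetical placeholders for project-specified queries and thresholds:

```python
import time

def run_query():
    # Stand-in for executing a representative analytical query
    # against the semantic model.
    return sum(i * i for i in range(10_000))

def timed(fn, max_seconds):
    """Execute fn and report (result, elapsed, passed) against a time budget."""
    start = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - start
    return result, elapsed, elapsed <= max_seconds

# Budget is illustrative; use the project-specified response time requirement.
result, elapsed, passed = timed(run_query, max_seconds=2.0)
print(f"query took {elapsed:.4f}s; within budget: {passed}")
```

A full test suite would run each required query type several times and compare worst-case (not just average) latency against the specified requirement.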
License
This document is part of the Digital Commissioning of Building Automation Systems Standard, licensed under CC BY-SA 4.0.
You are free to:
- Share — copy and redistribute the material in any medium or format
- Adapt — remix, transform, and build upon the material for any purpose, even commercially

Under the following terms:
- Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made
- ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license