ORIGINAL RESEARCH ARTICLE | Feb. 3, 2026
Development and Testing of California Bearing Ratio Machine for Evaluating the Strength of Materials for Use in Roads and Pavements
Isaac O. Olaniyan, David A. Opeyemi
Page no 62-78 |
https://doi.org/10.36348/sjet.2026.v11i02.001
An electrically operated California Bearing Ratio (CBR) machine was designed, fabricated from locally sourced materials, calibrated and tested, with the aim of providing a high-precision machine at lower cost. Materials were selected for their ability to withstand mechanical loads, stiffness and dimensional stability, wear resistance, corrosion resistance, machinability, and cost-effectiveness. The major component parts were designed using standard equations. Mild steel was used for components such as the loading frame, CBR moulds and reaction rings; hardened medium-carbon steel was used for the plunger; and high-grade spring steel was used for the load-measuring components because of its high elastic recovery. Calibration gave a proving ring constant of 0.02 kN/div. CBR tests on soil samples under unsoaked conditions gave values ranging from 4.85% to 6.91%, indicating poor to fair soils that require stabilization or treatment before use as subgrade material. Under soaked conditions, the lowest CBR value of 0.82% indicated a poor subgrade soil requiring substantial stabilization, while the sample with the highest CBR value of 3.15% still requires significant improvement. Statistical analysis of the data in Minitab (2018 version) using Fisher pairwise comparisons of means at the 95% confidence level showed significant differences between the soaked 2.5 mm and unsoaked 5 mm samples (P = 0.007) and between the soaked 5 mm and unsoaked 5 mm samples (P = 0.14) for the top, and between the soaked 2.5 mm and unsoaked 5 mm samples (P = 0.028) for the bottom. CBR values for all other top and bottom samples were not significantly different.
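The conversion from proving-ring dial readings to a CBR percentage can be sketched as follows. The ring constant (0.02 kN/div) is the calibration result reported above; the standard loads (13.2 kN at 2.5 mm penetration, 20.0 kN at 5.0 mm, per BS 1377) and the dial reading in the example are illustrative assumptions, not the paper's data.

```python
# Sketch: dial divisions -> load via ring constant -> CBR percentage.
RING_CONSTANT_KN_PER_DIV = 0.02              # from the calibration result above
STANDARD_LOAD_KN = {2.5: 13.2, 5.0: 20.0}    # assumed BS 1377 standard loads

def cbr_percent(dial_divisions: float, penetration_mm: float) -> float:
    """CBR = (test load / standard load) * 100 at the given penetration."""
    test_load_kn = dial_divisions * RING_CONSTANT_KN_PER_DIV
    return 100.0 * test_load_kn / STANDARD_LOAD_KN[penetration_mm]

# Illustrative unsoaked reading: 45 divisions at 2.5 mm penetration
print(round(cbr_percent(45, 2.5), 2))  # 45 * 0.02 = 0.9 kN -> 6.82 %
```

The reported CBR is normally the larger of the 2.5 mm and 5.0 mm results, which is why the abstract distinguishes the two penetrations in its statistical comparisons.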
Spectrum Mobile became a major provider of mobile services in the US after being acquired by Charter Communications. Rapid growth increases the complexity of a company's systems, which in turn creates new challenges; Spectrum Mobile faced several, including inconsistent releases and delays in testing changes. To overcome these challenges, the QA/DevOps team focused on four key elements of its overall testing procedure: compliance, resiliency, automation, and modernization. Several key initiatives followed, including the implementation of cloud-based CI/CD pipelines to automate customer-experience testing and the creation of a "hotfix" lane for addressing critical issues in real time. Technologies such as Blue/Green deployments and service virtualization were used to address middleware instability and scalability issues within the company. By leveraging these technologies, Spectrum Mobile was able to accelerate the release of new products and services, improve system availability, grow its subscriber base by 50%, and protect a substantial amount of revenue. This case study illustrates how the modernization of legacy systems will continue to be necessary to maintain competitive advantage in the telecommunications industry. It also indicates how cloud-based and AI-based technologies can enable Spectrum Mobile to develop automated, scalable architectures in the future. Spectrum Mobile will continue to explore the potential of artificial intelligence (AI), predictive analytics, and sustainable networks to support the evolving needs of society through technology.
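The Blue/Green deployment pattern mentioned above can be sketched minimally: two identical environments run side by side, and traffic is cut over to the newly deployed one only after it passes a health check, leaving the old environment available for instant rollback. All class and environment names here are hypothetical; the source does not describe Spectrum Mobile's actual tooling.

```python
# Minimal sketch of the Blue/Green traffic-switch idea (hypothetical names).
class Environment:
    def __init__(self, name: str, version: str, healthy: bool = True):
        self.name, self.version, self.healthy = name, version, healthy

class BlueGreenRouter:
    def __init__(self, blue: Environment, green: Environment):
        self.blue, self.green = blue, green
        self.live = blue                      # blue serves traffic initially

    def deploy_and_switch(self, new_version: str) -> bool:
        idle = self.green if self.live is self.blue else self.blue
        idle.version = new_version            # deploy to the idle environment
        if idle.healthy:                      # smoke/health check gate
            self.live = idle                  # cut-over; old env kept for rollback
            return True
        return False                          # failed check: traffic never moved

router = BlueGreenRouter(Environment("blue", "v1"), Environment("green", "v1"))
router.deploy_and_switch("v2")
print(router.live.name, router.live.version)  # green v2
```

The key property is that a failed health check leaves users on the old version, which is how the pattern addresses the release-consistency problems the abstract describes.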
REVIEW ARTICLE | Feb. 6, 2026
Enhancing Data Center Management and Deployment Through Microsoft Bootstrap Lite and Advanced Automation Technologies
Srikant Sudha Panda
Page no 88-95 |
https://doi.org/10.36348/sjet.2026.v11i02.003
Data centers serve as specialized operations centers that give organizations the ability to manage and analyze the massive quantities of data needed to support their operations. They have been developed to meet the growing demand for IT services, driven largely by security concerns around transferring large volumes of data over the Internet and by the increased use of remote devices for business purposes. Data centers centralize IT infrastructure to improve security and control, increase productivity, and provide scalable resources that meet organizational needs. However, expanding a data center presents a number of potential problems, including incorrect installation of hardware and improper wiring. In response to these challenges, Microsoft provides a tool called Bootstrap Lite (BSL) designed to provide assurance that hardware and rack configurations meet established design specifications, improving both reliability and efficiency while reducing error rates and the time required to build out a data center. Planning a data center expansion requires weighing many considerations: scale; power and cooling needs; security requirements; network connectivity; environmental sustainability of operations; regulatory compliance; disaster recovery options; redundancy; and extensive hardware testing.
In addition, BSL supports organizations in maintaining high levels of performance and reliability for their data centers, providing assurance that data centers will remain secure and adaptable to the changing needs of the organization in the digital economy.
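The kind of check described above, verifying that as-built hardware and rack configurations match the design specification, can be sketched as a simple spec-versus-inventory diff. BSL's real data model and interfaces are not described in this abstract; the field names and rack layout below are purely illustrative.

```python
# Hypothetical sketch: compare an as-built rack inventory against the
# design specification and report discrepancies before build-out continues.
def validate_rack(design: dict, as_built: dict) -> list:
    """Return a list of human-readable discrepancies (empty list = pass)."""
    issues = []
    for slot, expected in design.items():
        actual = as_built.get(slot)
        if actual is None:
            issues.append(f"slot {slot}: expected {expected}, found nothing")
        elif actual != expected:
            issues.append(f"slot {slot}: expected {expected}, found {actual}")
    for slot in as_built:
        if slot not in design:
            issues.append(f"slot {slot}: unexpected hardware {as_built[slot]}")
    return issues

design = {"U1": "switch-48p", "U2": "server-2U"}
as_built = {"U1": "switch-48p", "U2": "server-1U"}
print(validate_rack(design, as_built))  # one mismatch reported at slot U2
```

Catching a mis-installed server or mis-wired switch at this stage is precisely the class of error, incorrect installation and improper wiring, that the abstract identifies as a risk of data center expansion.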
The goal of this project is to modernize BP's OPSD2 Downstream Logistics ETL solution by migrating from mainframe DB2 database systems to Oracle-based databases and integrating multiple data sources through Informatica mappings, governed by an Agile framework for ETL solutions. The study analyzes the performance characteristics of Informatica PowerCenter's Repository-Integration Service architecture and finds that read operations scale efficiently under high concurrency, while write operations degrade rapidly in performance because of exclusive locks on certain repository tables; high-volume deployments using a write-heavy approach therefore risk failure. Performance analysis with multiple tools identified write latency as the primary performance constraint. To alleviate this, cache tuning, repository clustering, and asynchronous bulk logging are recommended to meet the operational demands of fuel distribution. Looking forward, OPSD2 is intended to evolve into BP's real-time Net-Zero Analytics Platform by leveraging cloud-native solutions based on Informatica IDMC and advanced streaming capabilities.
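The scaling behaviour described above, reads that stay flat under concurrency versus writes that serialize behind exclusive locks, can be captured in a back-of-the-envelope model. The 50 ms per-operation cost is an illustrative assumption, not a measured PowerCenter figure.

```python
# Idealized model: concurrent reads overlap (wall time ~ one operation),
# while writes holding an exclusive lock on shared repository tables queue
# up (wall time ~ sum of all operations).
OP_TIME_MS = 50  # assumed cost of one repository operation

def wall_time_ms(n_clients: int, exclusive: bool) -> int:
    """Idealized wall-clock time for n concurrent repository clients."""
    if exclusive:                 # writers queue behind one exclusive lock
        return n_clients * OP_TIME_MS
    return OP_TIME_MS             # readers proceed fully in parallel

for n in (1, 8, 64):
    print(n, wall_time_ms(n, exclusive=False), wall_time_ms(n, exclusive=True))
# read latency stays flat; write latency grows linearly with concurrency
```

The linear growth on the write path is why the abstract's recommendations, cache tuning, repository clustering, and asynchronous bulk logging, all aim either to shorten the locked critical section or to move work off it.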