Search results for: Software as a Service (SaaS)
6169 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran
Authors: Hamed Zolfaghari, Mojtaba Kord
Abstract:
Project planners and controllers frequently face the challenge of inadequate software for preparing automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which is stored locally and cannot be used online. Another shortcoming is that such dashboards are not linked to planning software such as Microsoft Project, which itself lacks the database required for data storage. This study aimed to propose a model for preparing automatic online project progress reports based on actual project information updates by integrating Primavera P6, an SQL database, and Power BI for a construction project. The designed model can help project planners and controllers by enabling them to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6 and the information was stored in the SQL database. The proposed model can prepare a wide range of reports automatically in Power BI, such as earned value management, HR, financial, physical, and risk reports. Furthermore, the reports can be published and shared online. Keywords: primavera P6, SQL, Power BI, EVM, integration management
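As a rough illustration of the kind of earned value roll-up such a P6-to-SQL-to-Power BI pipeline automates, the minimal Python sketch below computes the standard EVM indicators from activity-level data; the column names and figures are hypothetical and will differ from a real Primavera schema.

```python
import pandas as pd

# Hypothetical activity-level export from the P6/SQL database; column names are
# illustrative only and will differ in a real Primavera schema.
activities = pd.DataFrame({
    "activity_id": ["A100", "A110", "A120"],
    "budget_at_completion": [120_000.0, 80_000.0, 50_000.0],  # BAC per activity
    "planned_pct_complete": [1.00, 0.60, 0.25],               # per the baseline schedule
    "actual_pct_complete": [1.00, 0.45, 0.30],                # per the latest update
    "actual_cost": [115_000.0, 42_000.0, 18_000.0],           # AC per activity
})

def earned_value_summary(df: pd.DataFrame) -> pd.Series:
    """Roll activity-level figures up to the classic EVM indicators."""
    pv = (df["budget_at_completion"] * df["planned_pct_complete"]).sum()
    ev = (df["budget_at_completion"] * df["actual_pct_complete"]).sum()
    ac = df["actual_cost"].sum()
    return pd.Series({
        "PV": pv, "EV": ev, "AC": ac,
        "SV": ev - pv, "CV": ev - ac,
        "SPI": ev / pv, "CPI": ev / ac,
    })

print(earned_value_summary(activities).round(3))
```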
Procedia PDF Downloads 110
6168 Training the Competences for the 'Expert Teacher': A Framework of Skills for Teachers
Authors: Sofia Cramerotti, Angela Cattoni, Laura Biancato, Dario Ianes
Abstract:
The recognition of specific standards for new professionals within the teaching profile is a necessary process to foster an innovative school vision in line with the change that schools are experiencing. In line with the reform of the national education and training system and with the National Training Plan for teachers, our Research and Development department developed a training project based on a framework (Syllabus) of skills that each 'Expert Teacher' should master in order to fulfil what the different specific profiles require. The syllabus is a fundamental tool for a training process consistent with the teaching profiles, both to guide prospective teachers entering service and to provide in-service teachers with a system for evaluating and improving their skills. Following the national and international literature on professional standards for teachers, we grouped the skills of the syllabus into three macro areas: (1) professional skills related to the teacher profile and continuous training; (2) teaching skills related to school innovation; (3) organizational skills related to participation in school improvement. The syllabus is a framework that identifies and describes the skills of the expert teacher in all of their roles. However, the various skills take on different importance in the different profiles involved in the school: some skills are decisive for a role, while others are secondary. Therefore, each profile is characterized by a suitably weighted set of skills, so that the same skill can characterize each profile differently. In the future, we hope that skills development and training for the teacher will evolve into skills development and training for the whole school staff ('Expert Team'). In this perspective, the school will benefit from a solid team in which the skills of the various profiles are all properly developed and well represented. Keywords: framework, skills, teachers, training
Procedia PDF Downloads 181
6167 A Simple Device for Characterizing High Power Electron Beams for Welding
Authors: Aman Kaur, Colin Ribton, Wamadeva Balachandaran
Abstract:
Electron beam welding, due to its inherent advantages, is extensively used for material processing where high precision is required. In the aerospace and nuclear industries in particular, quality requirements and the cost of materials and processes are very high, so it is important to ensure that beam quality is maintained and checked prior to carrying out the welds. Although the processes in these industries are highly controlled, even minor changes in the operating parameters of the electron gun can cause variations in beam quality large enough to result in poor welding. To measure the beam quality, a simple device has been designed that can be used at high powers. The device consists of two slits, in the x and y axes, which collect a small portion of the beam current when the beam is deflected over the slits. The signals received from the device are processed in data acquisition hardware and dedicated software developed for the device. The device has been used in controlled laboratory environments to analyse the relationship between the signals and weld quality by varying the focus current. The results showed matching trends in the weld dimensions and the beam characteristics. Further experimental work is being carried out to determine the ability of the device and the signal processing software to detect subtle changes in beam quality and to relate these to physical weld quality indicators. Keywords: electron beam welding, beam quality, high power, weld quality indicators
Procedia PDF Downloads 324
6166 Investigation of the Impact of Family Status and Blood Group on Individuals’ Addiction
Authors: Masoud Abbasalipour
Abstract:
This study investigated the impact of family status on individuals' susceptibility to addiction, considering factors such as parents' literacy level, family size, and blood group. Statistical tests were employed to examine the relationships among these factors. The statistical population consisted of 338 samples divided into two groups, individuals with addiction and those without addiction, in the city of Amol. The addicted group was selected from individuals visiting the substance abuse treatment center in Amol, and the non-addicted group was selected randomly from individuals in urban and rural areas. The chi-square test was used to examine the presence or absence of relationships among the variables, and Cramér's V test was employed to determine the strength of the relationships between them. Excel software facilitated the initial entry of data, and SPSS software was utilized for the statistical tests. The results indicated a significant relationship between parents' education level and individuals' addiction: the analysis showed that the education level of addicted individuals' parents was significantly lower than that of non-addicted individuals' parents. However, the variables of family size and blood group did not significantly affect individuals' susceptibility to addiction. Keywords: addiction, blood group, parents' literacy level, family status
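To make the analysis step concrete, the short Python sketch below runs a chi-square test of independence and derives Cramér's V from it; the contingency table is invented for illustration and does not reproduce the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x3 contingency table: addiction status vs. parental education level.
# Counts are made up for demonstration; the study's own data are not reproduced here.
table = np.array([
    [60, 70, 39],   # addicted: low / medium / high parental education
    [30, 65, 74],   # non-addicted
])

chi2, p_value, dof, expected = chi2_contingency(table)

# Cramér's V measures the strength of the association revealed by the chi-square test.
n = table.sum()
min_dim = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * min_dim))

print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
print(f"Cramér's V = {cramers_v:.3f}")
```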
Procedia PDF Downloads 70
6165 Official Secrecy and Confidentiality in Tax Administration and Its Impact on Right to Access Information: Nigerian Perspectives
Authors: Kareem Adedokun
Abstract:
Official secrecy is one of the colonial vestiges which upholds the non-disclosure of essential information for public consumption. Information, though an indispensable tool in tax administration, is not to be divulged by any person in the official duty of the revenue agency. As a matter of fact, the Federal Inland Revenue Service (Establishment) Act, 2007 emphasizes secrecy and confidentiality in dealing with taxpayers' documents, information, returns, and assessments in a manner reminiscent of protecting taxpayers' privacy in all situations. It is so serious that any violation attracts criminal sanction. However, Nigeria, being a democratic and egalitarian state, recently enacted the Freedom of Information Act, which ushered in openness in governance and removed the confidentiality associated with official secrets laws. Official secrecy no doubt contradicts the philosophy of freedom of information, but maintaining a proper balance between the protected rights of taxpayers and the public interest which the revenue agency upholds is an uphill task. Adopting the doctrinal method, the author of this paper therefore probes into the real nature of the relationship between taxpayers and revenue agencies. It also interfaces official secrecy with the doctrine of freedom of information and consequently queries the retention of the non-disclosure clause under the Federal Inland Revenue Service (Establishment) Act (FIRSEA) 2007. The paper finds, among others, that the non-disclosure provision in tax statutes, particularly as provided for in FIRSEA, is not absolute; so also are the constitutional rights and freedom of information; and unless the non-disclosure clause finds justification under any recognized exemption provided under the Freedom of Information Act, its retention is antithetical to democratic ethos and beliefs, as it may hinder public interest and public order. Keywords: confidentiality, information, official secrecy, tax administration
Procedia PDF Downloads 342
6164 THRAP2 Gene Identified as a Candidate Susceptibility Gene of Thyroid Autoimmune Diseases Pedigree in Tunisian Population
Authors: Ghazi Chabchoub, Mouna Feki, Mohamed Abid, Hammadi Ayadi
Abstract:
Autoimmune thyroid diseases (AITDs), including Graves' disease (GD) and Hashimoto's thyroiditis (HT), are inherited as complex traits. Genetic factors associated with AITDs have been tentatively identified by candidate gene and genome scanning approaches. We analysed three intragenic microsatellite markers in the thyroid hormone receptor associated protein 2 gene (THRAP2), mapped near the D12S79 marker, which has a potential role in immune function and inflammation [THRAP2-1 (TG)n, THRAP2-2 (AC)n and THRAP2-3 (AC)n]. Our study population comprised 12 patients affected with AITDs belonging to a multiplex Tunisian family with a high prevalence of AITDs. Fluorescent genotyping was carried out on ABI 3100 sequencers (Applied Biosystems, USA) with the use of GENESCAN for semi-automated fragment sizing and GENOTYPER peak-calling software. Statistical analysis was performed using the non-parametric LOD score (NPL) in the Merlin software. Merlin outputs non-parametric NPLall (Z) and LOD scores and their corresponding asymptotic P values. The analysis of the three intragenic markers in the THRAP2 gene revealed strong evidence for linkage (NPL=3.68, P=0.00012). Our results suggest a possible role of the THRAP2 gene in AITD susceptibility in this family. Keywords: autoimmunity, autoimmune disease, genetic, linkage analysis
Procedia PDF Downloads 127
6163 Review of Currently Adopted Intelligent Programming Tutors
Authors: Rita Garcia
Abstract:
Intelligent Programming Tutors (IPTs) are supplemental educational tools that assist in teaching software development. These systems provide customized learning, allowing the user to select the presentation pace and pedagogical strategy and to recall previous and additional teaching materials that reinforce learning objectives. In addition, IPTs automatically record each individual's progress, providing feedback to the instructor and student. These systems have an advantage over conventional tutoring systems because Intelligent Programming Tutors are not limited to one teaching strategy and can adjust when they detect the user struggling with a concept. The Intelligent Programming Tutor is a category of Intelligent Tutoring Systems (ITS). ITS are available for many fields in education, support different learning objectives, and integrate into other learning tools, improving the student's learning experience. This study provides a comparison of the IPTs currently adopted by the educational community and focuses on their different teaching methodologies and programming languages. The study also examines the ability to integrate the IPT into other educational technologies, such as massive open online courses (MOOCs). The intention of this evaluation is to determine the one system that would best serve a larger ongoing research project and to provide findings for other institutions looking to adopt an Intelligent Programming Tutor. Keywords: computer education tools, integrated software development assistance, intelligent programming tutors, tutoring systems
Procedia PDF Downloads 318
6162 Investigating the Effective Physical Factors in the Development of Coastal Ecotourism in Southern Islands of Iran: A Case Study of Hendurabi Island, Iran
Authors: Zahra Khodaee
Abstract:
Background and Objective: The southern islands of Iran with tourism potential, Kish and Qeshm and, more recently, Hendurabi, are becoming increasingly popular and are the object of growing attention from investors. The Iranian coral reef islands, with the exception of Kish and Qeshm, have not undergone sufficient development. The southern islands of Iran face two issues: climate change and the growing demand from tourists. The lack of proper planning, inefficient management, and inadequate knowledge of the ecosystems of offshore regions have severely damaged this world natural heritage. This study was conducted to consider the relationship between tourism, development, and the ecosystem, because ecotourism in coral islands needs further attention. Method: Through qualitative research, this paper used library studies, field studies, and a survey to examine the physical (objective-subjective) factors of ecotourism development on Hendurabi Island. The results were analysed with SPSS software using a descriptive-analytical method. The survey was conducted with the participation of 150 tourists on Kish Island, who were chosen at random and who expressed their desire to travel to Hendurabi Island. Information was gathered using SPSS software and statistical t-tests. The questionnaire was validated using AMOS software to ensure that the questions asked were sufficiently relevant. Findings: The results showed that the physical factors affecting the development of ecotourism fall into two categories, objective and subjective factors (IFI = 0.911 and CFI = 0.907 for the target community). Discussion and conclusion: The results were satisfactory in that they showed that eco-tourists attached importance to views, quiet and secluded areas, tranquility and security, the quality of the area being visited, and easy access to services; these were the top criteria for those visiting the area, provided that environmental requirements are respected. Development management of these regions should maintain appropriate utilization along with sustainable and ecological responsibility. Keywords: ecotourism, coral reef island, development management, Hendurabi Island
Procedia PDF Downloads 143
6161 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis
Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha
Abstract:
Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the ones most commonly used in the rocket engines of the Brazilian Armed Forces. This work investigated the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5, 10, and 15 °C/min) on a rocket motor with an 11-year shelf life, using the Arrhenius equation to obtain its activation energy by the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the internal pressure of the combustion chamber were acquired for eight rocket engines with 11 years of shelf life, for comparison with the engine start-up data. Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust
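For readers unfamiliar with these model-free kinetic methods, the short Python sketch below shows how the activation energy is extracted from DSC peak temperatures measured at several heating rates; the peak temperatures used here are invented placeholders, not the study's measurements.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol·K)

# Hypothetical DSC exothermic peak temperatures (K) at the three heating rates;
# real values come from the triplicate DSC runs described in the abstract.
beta = np.array([5.0, 10.0, 15.0])          # heating rates, °C/min
t_peak = np.array([601.0, 612.0, 619.0])    # peak temperatures, K

# Kissinger: ln(beta / Tp^2) = -Ea/(R*Tp) + const, so the slope vs 1/Tp gives -Ea/R.
slope_k, _ = np.polyfit(1.0 / t_peak, np.log(beta / t_peak**2), 1)
ea_kissinger = -slope_k * R

# Ozawa (Flynn-Wall-Ozawa approximation): log10(beta) = const - 0.4567*Ea/(R*Tp).
slope_o, _ = np.polyfit(1.0 / t_peak, np.log10(beta), 1)
ea_ozawa = -slope_o * R / 0.4567

print(f"Ea (Kissinger) ≈ {ea_kissinger / 1000:.1f} kJ/mol")
print(f"Ea (Ozawa)     ≈ {ea_ozawa / 1000:.1f} kJ/mol")
```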
Procedia PDF Downloads 128
6160 Universal Health Coverage 2019 in Indonesia: The Integration of Family Planning Services in Current Functioning Health System
Authors: Fathonah Siti, Ardiana Irma
Abstract:
Indonesia is currently on track to achieve Universal Health Coverage (UHC) by 2019. The program aims to address disintegration in the implementation and coverage of various health insurance schemes and fragmented fund pooling. Family planning services are covered as one of the benefit packages under preventive care. However, little has been done to examine how the family planning program is managed across levels of government and how family planning services are delivered to the end user. The study was performed through focus group discussions with related policy makers and selected programmers at the central and district levels. The study also benefited from relevant studies on family planning in the UHC scheme and other supporting data. The study carefully investigates the programmatic implications of integrating family planning into the UHC program, encompassing the need to recalculate contraceptive logistics for beneficiaries (eligible couples); policy reformulation for contraceptive service provision, including supply chain management; establishment of a family planning standard of procedure; and a call to update the Management Information System. The study confirms that there is a significant increase in the quantity of contraceptive commodities that needs to be procured by the government. Assuming that the contraceptive prevalence rate and commodity costs increase at 0.5% annually as expected, the government needs to allocate almost IDR 5 billion by 2019, excluding fees for service. The government is shifting its focus to maintaining eligible health facilities under National Population and Family Planning Board networks. By 2019, the government has set strategies to anticipate the provision of family planning services to 45,340 health facilities distributed across 514 districts and 7,000 subdistricts. A clear division of authority has been established among levels of government. Three models of contraceptive supply planning have been developed and are currently being institutionalized. Pre-service training for family planning services has been piloted in 10 prominent universities. The position of private midwives has been recognized as part of the system. To ensure quality implementation and control of health expenditure, a family planning standard has been established as a reference to determine the set of services that must be delivered properly to clients and the types of health facilities able to conduct particular family planning services. Recognition of individual program participation status has been acknowledged in the Family Enumeration since 2015. The data are precisely recorded by name and address for each family and its members. They supply valuable information to 15,131 Family Planning Field Workers (FPFWs) who provide information and education related to family planning in an attempt to generate demand and maintain the participation of family planning acceptors who are program beneficiaries. Despite the overwhelming efforts described above, some obstacles remain. The program suffers from poor socialization and has yet to remove geographical barriers for those living in remote areas; family planning services provided for this subpopulation are conducted outside the scheme as a complementary strategy. However, the UHC program has brought remarkable improvement in access to and quality of family planning services. Keywords: beneficiary, family planning services, national population and family planning board, universal health coverage
Procedia PDF Downloads 190
6159 Service Flow in Multilayer Networks: A Method for Evaluating the Layout of Urban Medical Resources
Authors: Guanglin Song
Abstract:
(Objective) Situated within the context of China's tiered medical treatment system, this study aims to analyze the spatial causes of urban healthcare access difficulties from the perspective of the configuration of healthcare facilities. (Methods) A social network analysis approach is employed to construct a healthcare demand and supply flow network between major residential clusters and various tiers of hospitals in the city. (Conclusion) The findings reveal that: 1. there exists an overall maldistribution and over-concentration of healthcare resources in the study area, characterized by structural imbalance; 2. the low rate of primary care utilization in the study area is a key factor contributing to congestion at higher-tier hospitals, as excessive reliance on these institutions by neighboring communities exacerbates the problem; 3. gradual optimization of the healthcare facility layout in the study area, encompassing holistic, local, and individual institutional levels, can enhance systemic efficiency and resource balance. (Prospects) This research proposes a method for evaluating urban healthcare resource distribution structures based on service flows within hierarchical networks. It offers spatially targeted optimization suggestions for promoting the implementation of the tiered healthcare system and alleviating challenges related to accessibility and congestion in seeking medical care, and it provides new ideas for researchers and healthcare managers in countries, cities, and healthcare organizations around the world facing similar challenges. Keywords: flow of public services, urban networks, healthcare facilities, spatial planning
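As a minimal illustration of such a demand-supply flow network, the Python sketch below builds a directed cluster-to-hospital graph with networkx and uses weighted in-degree as a crude proxy for each facility's service load; all node names and flow volumes are invented.

```python
import networkx as nx

# Toy demand-supply flow network: residential clusters send patient flows to
# hospitals of different tiers. Nodes and weights are illustrative only.
flows = [
    ("cluster_A", "tertiary_H1", 500),
    ("cluster_A", "primary_C1", 80),
    ("cluster_B", "tertiary_H1", 420),
    ("cluster_B", "secondary_H2", 150),
    ("cluster_C", "tertiary_H1", 300),
    ("cluster_C", "primary_C2", 60),
]

G = nx.DiGraph()
G.add_weighted_edges_from(flows)

# Weighted in-degree approximates the service load each facility absorbs;
# a few facilities taking most of the flow signals over-concentration.
load = dict(G.in_degree(weight="weight"))
hospitals = {n: w for n, w in load.items() if not n.startswith("cluster")}
total = sum(hospitals.values())

for name, w in sorted(hospitals.items(), key=lambda kv: -kv[1]):
    print(f"{name:14s} load={w:5d}  share={w / total:.1%}")
```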
Procedia PDF Downloads 71
6158 An Empirical Study of Determinants Influencing Telemedicine Services Acceptance by Healthcare Professionals: Case of Selected Hospitals in Ghana
Authors: Jonathan Kissi, Baozhen Dai, Wisdom W. K. Pomegbe, Abdul-Basit Kassim
Abstract:
Protecting patient’s digital information is a growing concern for healthcare institutions as people nowadays perpetually live their lives through telemedicine services. These telemedicine services have been confronted with several determinants that hinder their successful implementations, especially in developing countries. Identifying such determinants that influence the acceptance of telemedicine services is also a problem for healthcare professionals. Despite the tremendous increase in telemedicine services, its adoption, and use has been quite slow in some healthcare settings. Generally, it is accepted in today’s globalizing world that the success of telemedicine services relies on users’ satisfaction. Satisfying health professionals and patients are one of the crucial objectives of telemedicine success. This study seeks to investigate the determinants that influence health professionals’ intention to utilize telemedicine services in clinical activities in a sub-Saharan African country in West Africa (Ghana). A hybridized model comprising of health adoption models, including technology acceptance theory, diffusion of innovation theory, and protection of motivation theory, were used to investigate these quandaries. The study was carried out in four government health institutions that apply and regulate telemedicine services in their clinical activities. A structured questionnaire was developed and used for data collection. Purposive and convenience sampling methods were used in the selection of healthcare professionals from different medical fields for the study. The collected data were analyzed based on structural equation modeling (SEM) approach. All selected constructs showed a significant relationship with health professional’s behavioral intention in the direction expected from prior literature including perceived usefulness, perceived ease of use, management strategies, financial sustainability, communication channels, patients security threat, patients privacy risk, self efficacy, actual service use, user satisfaction, and telemedicine services systems securities threat. Surprisingly, user characteristics and response efficacy of health professionals were not significant in the hybridized model. The findings and insights from this research show that health professionals are pragmatic when making choices for technology applications and also their willingness to use telemedicine services. They are, however, anxious about its threats and coping appraisals. The identified significant constructs in the study may help to increase efficiency, quality of services, quality patient care delivery, and satisfactory user satisfaction among healthcare professionals. The implantation and effective utilization of telemedicine services in the selected hospitals will aid as a strategy to eradicate hardships in healthcare services delivery. The service will help attain universal health access coverage to all populace. This study contributes to empirical knowledge by identifying the vital factors influencing health professionals’ behavioral intentions to adopt telemedicine services. The study will also help stakeholders of healthcare to formulate better policies towards telemedicine service usage.Keywords: telemedicine service, perceived usefulness, perceived ease of use, management strategies, security threats
Procedia PDF Downloads 142
6157 The Reality of E-Commerce in Egypt and Its Role in Enhancing Companies' Competitiveness
Authors: Esam El Gohary
Abstract:
A company's ability to survive and compete in fierce competition is determined by its level of competitiveness. With the spread of information technology use and the appearance of online shopping, it has become crucial for companies to adopt e-commerce systems to increase their competitiveness. This paper was conducted with the purpose of determining how increasing service value through e-commerce factors (competitive strategy, ICT infrastructure, logistics, security, human resources, and innovation) can enhance companies' competitiveness. The problem addressed by this paper is summarized in the absence of thorough awareness of e-commerce benefits among business owners and customers, as well as how to reduce the intangibility attributes of e-commerce. For this purpose, the paper describes e-commerce in Egypt and its success factors (infrastructure, legal and regulatory environment, human resources, and innovation), as well as the barriers to such factors, to investigate the significance of these factors for increasing service value and enhancing companies' competitiveness. This paper revealed that e-commerce companies have many opportunities to enhance their competitiveness in Egypt, supported by several factors. The most important factors are strong ICT infrastructure, qualified and skilled human resources, the distinctive logistics that distinguish Egypt due to its location, a strong legal and regulatory environment, innovation, and competitive strategy. At the same time, companies encounter several threats, such as the lack of infrastructure and logistics in rural areas, the absence of an inclusive understanding and awareness of e-commerce, fear of e-payment transactions and fraud, and the ambiguity and burden of customs procedures. Based on the research findings, several recommendations were introduced to both government and companies to overcome threats and exploit opportunities to improve performance and enhance companies' competitiveness. Keywords: e-commerce competitiveness, e-commerce factors, e-commerce in Egypt, information technology
Procedia PDF Downloads 105
6156 Fabrication of Antimicrobial Dental Model Using Digital Light Processing (DLP) Integrated with 3D-Bioprinting Technology
Authors: Rana Mohamed, Ahmed E. Gomaa, Gehan Safwat, Ayman Diab
Abstract:
Background: Bio-fabrication is a multidisciplinary research field that combines several principles, fabrication techniques, and protocols from different fields. The open-source software movement supports the use of open-source licenses for some or all software as part of the broader notion of open collaboration. Additive manufacturing, the concept behind 3D printing, is a manufacturing method that builds objects layer by layer from computer-aided designs (CAD). Several types of AM systems are in use, and they can be categorized by the type of process used. One of these AM technologies is digital light processing (DLP), a 3D printing technology used to rapidly cure a photopolymer resin to create hard scaffolds. DLP uses a projected light source to cure (harden or crosslink) an entire layer at once. Current applications of DLP are focused on dental and medical applications. Further developments in this field have led to the revolutionary field of 3D bioprinting. The open-source movement was started to spread the concept of open-source software and to provide software or hardware that is cheaper, more reliable, and of better quality. Objective: Modification of a desktop 3D printer into a 3D bioprinter and the integration of DLP technology and bio-fabrication to produce an antibacterial dental model. Method: A desktop 3D printer was modified into a 3D bioprinter. Gelatin hydrogel and sodium alginate hydrogel were prepared at different concentrations. Rhizomes of Zingiber officinale, flower buds of Syzygium aromaticum, and bulbs of Allium sativum were extracted, and the extracts were prepared at different levels (powder, aqueous extracts, total oils, and essential oils) for antibacterial testing. The agar well diffusion method with E. coli was used to perform the sensitivity test for the antibacterial activity of the extracts obtained from Zingiber officinale, Syzygium aromaticum, and Allium sativum. Lastly, DLP printing was performed to produce several dental models with the natural extracts combined with hydrogel to represent and simulate the hard and soft tissues. Result: The desktop 3D printer was modified into a 3D bioprinter using the open-source Marlin firmware and modified custom-made 3D printed parts. Sodium alginate hydrogel and gelatin hydrogel were prepared at 5% (w/v), 10% (w/v), and 15% (w/v). Resin was integrated with the natural extracts of Zingiber officinale rhizome, Syzygium aromaticum flower buds, and Allium sativum bulbs at 1-3% for each extract. Finally, the antimicrobial dental model was printed, exhibited antimicrobial activity, and was then merged with sodium alginate hydrogel. Conclusion: The open-source movement succeeded in modifying and producing a low-cost desktop 3D bioprinter, showing the potential for further enhancement in this scope. Additionally, the potential of integrating DLP technology with bioprinting is a promising step toward the use of the antimicrobial activity of natural products. Keywords: 3D printing, 3D bio-printing, DLP, hydrogel, antibacterial activity, zingiber officinale, syzygium aromaticum, allium sativum, panax ginseng, dental applications
Procedia PDF Downloads 97
6155 Modeling Route Selection Using Real-Time Information and GPS Data
Authors: William Albeiro Alvarez, Gloria Patricia Jaramillo, Ivan Reinaldo Sarmiento
Abstract:
Understanding the behavior of individuals and the different human factors that influence choice when people face a complex system such as transportation is one of the most complicated aspects of route choice modeling, because various behaviors and driving modes directly or indirectly affect the choice. During the last two decades, with the development of information and communications technologies, new data collection techniques have emerged, such as GPS, geolocation with mobile phones, apps for choosing the route between origin and destination, and individual service transport applications, among others, generating interest in improving discrete choice models by incorporating these developments as well as the psychological factors that affect decision making. This paper implements a discrete choice model that proposes and estimates a hybrid model integrating route choice models and latent variables, based on observation of the routes of a sample of public taxi drivers from the city of Medellín, Colombia, in relation to their behavior, personality, socioeconomic characteristics, and driving mode. The set of choice options includes the routes generated by individual service transport applications versus the driver's own choice. The hybrid model consists of measurement equations that relate latent variables to measurement indicators and utilities to choice indicators, along with structural equations that link the observable characteristics of drivers with latent variables and explanatory variables with utilities. Keywords: behavior choice model, human factors, hybrid model, real time data
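A generic formulation of this kind of hybrid choice model, written out for reference, might take the following form; the notation is illustrative and not the authors' exact specification.

```latex
% Generic hybrid choice formulation (illustrative, not the authors' exact specification)
\begin{align}
  U_{in}  &= \beta^{\top} X_{in} + \lambda\, \eta_{n} + \varepsilon_{in}
            && \text{(utility of route } i \text{ for driver } n\text{)} \\
  \eta_{n} &= \gamma^{\top} S_{n} + \omega_{n}
            && \text{(structural equation: latent attitude from socio-economic traits)} \\
  I_{kn}   &= \alpha_{k} + \zeta_{k}\, \eta_{n} + \nu_{kn}
            && \text{(measurement equation for indicator } k\text{)} \\
  P_{n}(i) &= \frac{\exp(V_{in})}{\sum_{j \in C_{n}} \exp(V_{jn})}
            && \text{(logit choice probability over choice set } C_{n}\text{)}
\end{align}
```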
Procedia PDF Downloads 155
6154 Assessing the Spatial Distribution of Urban Parks Using Remote Sensing and Geographic Information Systems Techniques
Authors: Hira Jabbar, Tanzeel-Ur Rehman
Abstract:
Urban parks and open spaces play a significant role in improving the physical and mental health of citizens, strengthen societies, and make cities more attractive places to live and work. As the world's cities continue to grow, continuing to value green space in cities is vital but also a challenge, particularly in developing countries where there is pressure on space, resources, and development. Offering equal opportunity of access to parks is one of the important issues of park distribution; the distribution of parks should allow all inhabitants to live in close proximity to them. Remote sensing and geographic information systems (GIS) can provide decision makers with enormous opportunities to improve the planning and management of park facilities. This study exhibits the capability of GIS and RS techniques to provide baseline knowledge about the distribution of parks and their level of accessibility and to help identify potential areas for such facilities. For this purpose, Landsat OLI imagery for the year 2016 was acquired from the USGS Earth Explorer. Preprocessing models were applied using Erdas Imagine 2014 for atmospheric correction, and an NDVI model was developed and applied to quantify the land use/land cover classes, including built-up land, barren land, water, and vegetation. The parks among total public green spaces were selected based on their signature in the remote sensing image and their distribution. Percentages of total green space and park green space were calculated for each town of Lahore City, and the results were then compared with the recommended standards. The ANGSt model was applied to calculate accessibility to parks. Service area analysis was performed using the Network Analyst tool. The serviceability of these parks was evaluated by employing statistical indices such as service area, service population, and park area per capita. The findings of the study may help town planners understand the distribution of parks, the demand for new parks, and potential areas that are deprived of parks. The purpose of the present study is to provide the necessary information to planners, policy makers, and scientific researchers in the process of decision making for the management and improvement of urban parks. Keywords: accessible natural green space standards (ANGSt), geographic information systems (GIS), remote sensing (RS), United States geological survey (USGS)
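As a small illustration of the NDVI step described above, the Python sketch below computes NDVI from near-infrared and red reflectance arrays and applies simple thresholds; the band values and cut-offs are placeholders rather than the study's calibrated parameters.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.

    For Landsat 8 OLI surface reflectance, NIR is band 5 and Red is band 4;
    the small epsilon avoids division by zero over masked pixels.
    """
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / (nir + red + 1e-10)

# Toy 2x2 reflectance patches standing in for real Landsat OLI bands.
nir_band = np.array([[0.45, 0.50], [0.20, 0.05]])
red_band = np.array([[0.10, 0.08], [0.18, 0.04]])

index = ndvi(nir_band, red_band)

# A simple thresholding of the kind used to separate vegetation, built-up/barren
# land, and water; the cut-off values are illustrative and scene-dependent.
classes = np.select(
    [index > 0.3, index > 0.0],
    ["vegetation", "built-up/barren"],
    default="water",
)
print(index.round(2))
print(classes)
```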
Procedia PDF Downloads 343
6153 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector
Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh
Abstract:
A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years, and it has become more important to understand customers' needs in this competitive market, especially the needs of those who are looking to switch service providers. Churn prediction is therefore now a mandatory requirement for retaining those customers, and machine learning can be utilized to accomplish it. Churn prediction has become a very important topic for machine learning classification in the telecommunications industry, and understanding the factors of customer churn and how customers behave is very important for building an effective churn prediction model. This paper aims to predict churn and identify the factors of customers' churn based on their past service usage history. Aiming at this objective, the study makes use of feature selection, normalization, and feature engineering. The study then compared the performance of four different machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1 score and ROC-AUC. Comparing the results of this study with existing models shows that it produces better results. The results showed that Gradient Boosting with the feature selection technique outperformed the other approaches in this study by achieving a 99% F1-score and 99% AUC, and all other experiments achieved good results as well. Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score
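A minimal scikit-learn pipeline of the kind described (normalization, feature selection, then Gradient Boosting, evaluated with F1 and ROC-AUC) might look like the sketch below; it runs on synthetic data standing in for the Orange dataset, so the scores it prints are not the study's results.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic, imbalanced stand-in for the Orange telecom dataset (not the actual data).
X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           weights=[0.8, 0.2], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

# Normalization -> feature selection -> gradient boosting, mirroring the
# preprocessing steps named in the abstract.
model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif, k=15)),
    ("clf", GradientBoostingClassifier(random_state=42)),
])
model.fit(X_train, y_train)

pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print(f"F1 = {f1_score(y_test, pred):.3f}, ROC-AUC = {roc_auc_score(y_test, proba):.3f}")
```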
Procedia PDF Downloads 134
6152 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture
Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán
Abstract:
Time-sensitive services are the base of the cloud services industry. Keeping low service saturation is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling. However, reactive auto-scaling has few in-depth studies. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-k architecture. Queuing theory can compute different properties of static services but lacks some parameters related to the transition between models. Our model uses queuing theory parameters to relate the transition between models. It associates MAPE-k related times, the sampling frequency, the cooldown period, the number of requests that an instance can handle per unit of time, the number of incoming requests at a time instant, and a function that describes the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, reevaluating the model’s parameters periodically to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests to keep a constrained response time. Business benefits determine such limits. The solution can add a dynamic number of instances and remains valid under different system sizes. The study includes performance recommendations to improve results according to the incoming load shape and business benefits. The exposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds. Both microservices contain a load balancer that assigns requests to the less loaded instance and preemptively discards requests if they are not finished in time to prevent resource saturation. When load decreases, instances with lower load are kept in the backlog where no more requests are assigned. If the load grows and an instance in the backlog is required, it returns to the running state, but if it finishes the computation of all requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenario for reactive systems: the following scenarios test response times, resource consumption and business costs. The first scenario is a burst-load scenario. All methodologies will discard requests if the rapidness of the burst is high enough. This scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts to observe how the methodology behaves when releasing resources that are lately required. The third scenario contains diverse growth accelerations in the number of incoming requests to observe how approaches that add a different number of instances can handle the load with less business cost. The exposed methodology is compared against a multiple threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing
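A stripped-down sketch of the reactive sizing rule implied by these parameters (per-instance capacity, incoming request rate, sampling period, and cooldown) is shown below; it is not the authors' full MAPE-K implementation or simulator, just an illustration of a monitor-analyze-plan-execute loop with a cooldown guard.

```python
import math

def desired_instances(incoming_rate: float,
                      capacity_per_instance: float,
                      headroom: float = 0.7) -> int:
    """Instances needed so each one stays below `headroom` of its capacity."""
    return max(1, math.ceil(incoming_rate / (capacity_per_instance * headroom)))

class ReactiveScaler:
    """Minimal MAPE-K-style reactive loop: monitor -> analyze -> plan -> execute.

    `cooldown` monitoring samples must elapse between scaling actions, mirroring
    the cooldown period the model relates to the other queuing parameters.
    """
    def __init__(self, capacity_per_instance: float, cooldown: int = 3):
        self.capacity = capacity_per_instance
        self.cooldown = cooldown
        self.instances = 1
        self.since_last_action = cooldown

    def step(self, incoming_rate: float) -> int:
        self.since_last_action += 1
        target = desired_instances(incoming_rate, self.capacity)
        if target != self.instances and self.since_last_action >= self.cooldown:
            self.instances = target
            self.since_last_action = 0
        return self.instances

# One sample per monitoring period; request rates are illustrative only.
scaler = ReactiveScaler(capacity_per_instance=25.0, cooldown=2)
for rate in [40, 80, 160, 320, 320, 150, 60]:
    print(f"rate={rate:4d} req/s -> instances={scaler.step(rate)}")
```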
Procedia PDF Downloads 96
6151 Foundation Phase Teachers' Experiences of School Based Support Teams: A Case of Selected Schools in Johannesburg
Authors: Ambeck Celyne Tebid, Harry S. Rampa
Abstract:
The South African Education system recognises the need for all learners including those experiencing learning difficulties, to have access to a single unified system of education. For teachers to be pedagogically responsive to an increasingly diverse learner population without appropriate support has been proven to be unrealistic. As such, this has considerably hampered interest amongst teachers, especially those at the foundation phase to work within an Inclusive Education (IE) and training system. This qualitative study aimed at investigating foundation phase teachers’ experiences of school-based support teams (SBSTs) in two Full-Service (inclusive schools) and one Mainstream public primary school in the Gauteng province of South Africa; with particular emphasis on finding ways to supporting them, since teachers claimed they were not empowered in their initial training to teach learners experiencing learning difficulties. Hence, SBSTs were created at school levels to fill this gap thereby, supporting teaching and learning by identifying and addressing learners’, teachers’ and schools’ needs. With the notion that IE may be failing because of systemic reasons, this study uses Bronfenbrenner’s (1979) ecosystemic as well as Piaget’s (1980) maturational theory to examine the nature of support and experiences amongst teachers taking individual and systemic factors into consideration. Data was collected using in-depth, face-to-face interviews, document analysis and observation with 6 foundation phase teachers drawn from 3 different schools, 3 SBST coordinators, and 3 school principals. Data was analysed using the phenomenological data analysis method. Amongst the findings of the study is that South African full- service and mainstream schools have functional SBSTs which render formal and informal support to the teachers; this support varies in quality depending on the socio-economic status of the relevant community where the schools are situated. This paper, however, argues that what foundation phase teachers settled for as ‘support’ is flawed; as well as how they perceive the SBST and its role is problematic. The paper conclude by recommending that, the SBST should consider other approaches at foundation phase teacher support such as, empowering teachers with continuous practical experiences on how to deal with real classroom scenarios, as well as ensuring that all support, be it on academic or non-academic issues should be provided within a learning community framework where the teacher, family, SBST and where necessary, community organisations should harness their skills towards a common goal.Keywords: foundation phase, full- service schools, inclusive education, learning difficulties, school-based support teams, teacher support
Procedia PDF Downloads 238
6150 Net Regularity and Its Ethical Implications on Internet Stakeholders
Authors: Nourhan Elshenawi
Abstract:
Net Neutrality (NN) is the principle of treating all online data the same without any prioritization of some over others. A research gap in current scholarship about “violations of NN” and the subsequent ethical concerns paves the way for the following research question: To what extent violations of NN entail ethical concerns and implications for Internet stakeholders? To answer this question, NR is examined using the two major action-based ethical theories, Kantian and Utilitarian, across the relevant Internet stakeholders. First some necessary IT background is provided that shapes how the Internet works and who the key stakeholders are. Following the IT background, the relationship between the stakeholders, users, Internet Service Providers (ISPs) and content providers is discussed and illustrated. Then some violations of NN that are currently occurring is covered, without attracting any attention from the general public from an ethical perspective, as a new term Net Regularity (NR). Afterwards, the current scholarship on NN and its violations are discussed, that are mainly from an economic and sociopolitical perspectives to highlight the lack of ethical discussions on the issue. Before moving on to the ethical analysis however, websites are presented as digital entities that are affected by NR and their happiness is measured using functionalism. The analysis concludes that NR is prone to an unethical treatment of Internet stakeholders in the perspective of both theories. Finally, the current Digital Divide in the world is presented to be able to better illustrate the implications of NR. The implications present the new Internet divide that will take place between individuals within society. Through answering the research question using ethical analysis, it attempts to shed some light on the issue of NR and what kind of society it would lead to. NR would not just lead to a divided society, but divided individuals that are separated by something greater than distance, the Internet.Keywords: digital divide, digital entities, digital ontology, internet ethics, internet law, net neutrality, internet service providers, websites as beings
Procedia PDF Downloads 276
6149 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software
Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah
Abstract:
With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3-D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and soil structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse the rigid ballast-less and flexible ballast-based high speed rail track embankments for various critical conditions subjected to them, viz. static condition, moving train condition, sudden brake application, and derailment case, using software. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young’s modulus, Poisson’s ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other anticipated, relevant data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, California high-speed rail technical specifications, and the NHSRCL feasibility report will be followed in this study.Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D
Procedia PDF Downloads 110
6148 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. The dissertation research resulted in a proposal for an erudite design process that combines digital and practical aspects within a strong methodological frame. The digital aspects are the progressive advancements in algorithm design and simulation software; these aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to make design processes successful. The erudite design process also involves the ongoing improvements in applying the new method of 3D printing in construction. This is achieved through the 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes. Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 277
6147 Accurate Position Electromagnetic Sensor Using Data Acquisition System
Authors: Z. Ezzouine, A. Nakheli
Abstract:
This paper presents a high-precision position electromagnetic sensor system (HPESS) that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to students' laboratories. The challenge was to obtain a highly accurate, real-time sensor that is able to calculate position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy with direct contact, and the output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed using a user interface written in National Instruments LabVIEW software. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, and inexpensive transducer for highly sophisticated control systems. Keywords: electromagnetic sensor, accurately, data acquisition, position measurement
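The conversion from captured voltage to displacement is typically a simple calibration fit applied after acquisition; the Python sketch below illustrates that post-processing step with invented calibration points (the acquisition itself is handled in LabVIEW, as described above).

```python
import numpy as np

# Hypothetical calibration run: known displacements (mm) of the moving object
# and the corresponding sensor output voltages captured by the DAQ (V).
calib_displacement = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
calib_voltage = np.array([0.02, 0.51, 1.03, 1.49, 2.02, 2.48])

# Fit a linear voltage-to-displacement characteristic (least squares).
slope, intercept = np.polyfit(calib_voltage, calib_displacement, 1)

def voltage_to_displacement(v) -> np.ndarray:
    """Convert captured sensor voltages to displacement using the calibration fit."""
    return slope * np.asarray(v) + intercept

# Voltages streamed from the acquisition system (illustrative values).
samples = [0.25, 0.74, 1.26, 1.77]
print(np.round(voltage_to_displacement(samples), 2), "mm")
```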
Procedia PDF Downloads 286
6146 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in the intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of utilization. In all these fields, the amount of the collected data is increasing quickly, but with the increase of the data, the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in the rough sets can be achieved with the reduct. A lot of algorithms of generating the reduct were developed, but most of them are only software implementations, therefore have many limitations. Microprocessor uses the fixed word length, consumes a lot of time for either fetching as well as processing of the instruction and data; consequently, the software based implementations are relatively slow. Hardware systems don’t have these limitations and can process the data faster than a software. Reduct is the subset of the decision attributes that provides the discernibility of the objects. For the given decision table there can be more than one reduct. Core is the set of all indispensable condition attributes. None of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct consists of all the attributes from the core. In this paper, the hardware implementation of the two-stage greedy algorithm to find the one reduct is presented. The decision table is used as an input. Output of the algorithm is the superreduct which is the reduct with some additional removable attributes. First stage of the algorithm is calculating the core using the discernibility matrix. Second stage is generating the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. Described above algorithm has two disadvantages: i) generating the superreduct instead of reduct, ii) additional first stage may be unnecessary if the core is empty. But for the systems focused on the fast computation of the reduct the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block, and thus add respectively little time to the whole process. Algorithm presented in this paper was implemented in Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by the comparators connected to the block called 'singleton detector', which detects if the input word contains only single 'one'. Calculating the number of occurrences of the attribute is performed in the combinational block made up of the cascade of the adders. The superreduct generation process is iterative and thus needs the sequential circuit for controlling the calculations. For the research purpose, the algorithm was also implemented in C language and run on a PC. The times of execution of the reduct calculation in a hardware and software were considered. Results show increase in the speed of data processing.Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
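A software sketch of the two-stage idea (the core from the singleton entries of the discernibility matrix, then greedy enrichment until all entries are covered) is given below; it mirrors the algorithm's structure rather than the FPGA blocks, and the greedy criterion here counts attribute occurrences in the still-uncovered entries, a variant of the frequency heuristic described above. The decision table is a toy example.

```python
from itertools import combinations

# Toy decision table: each row is (a, b, c, decision); a, b, c are condition
# attributes. Values are illustrative only.
CONDITION = ["a", "b", "c"]
TABLE = [
    (1, 0, 0, "yes"),
    (0, 0, 0, "no"),
    (1, 1, 1, "no"),
    (0, 1, 1, "yes"),
]

def discernibility_entries(table):
    """Non-empty discernibility-matrix entries: for every pair of objects with
    different decisions, the set of condition attributes on which they differ."""
    entries = []
    for r1, r2 in combinations(table, 2):
        if r1[-1] != r2[-1]:
            diff = {CONDITION[i] for i in range(len(CONDITION)) if r1[i] != r2[i]}
            if diff:
                entries.append(diff)
    return entries

def core_and_superreduct(table):
    entries = discernibility_entries(table)
    # Stage 1: the core is the union of all singleton entries (indispensable attributes).
    singletons = [e for e in entries if len(e) == 1]
    core = set().union(*singletons) if singletons else set()
    # Stage 2: greedily enrich the core with the attribute appearing in the most
    # still-uncovered entries until every entry is covered, yielding a superreduct.
    chosen = set(core)
    uncovered = [e for e in entries if not (e & chosen)]
    while uncovered:
        best = max(CONDITION, key=lambda a: sum(a in e for e in uncovered))
        chosen.add(best)
        uncovered = [e for e in uncovered if best not in e]
    return core, chosen

core, superreduct = core_and_superreduct(TABLE)
print("core:", core, "superreduct:", superreduct)
```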
Procedia PDF Downloads 220
6145 Sustainability of Heritage Management in Aksum: Focus on Heritage Conservation and Interpretation
Authors: Gebrekiros Welegebriel Asfaw
Abstract:
The management of fragile, unique, and irreplaceable cultural heritage from different perspectives is becoming a major challenge as important elements of culture vanish throughout the globe. The major purpose of this study is to assess how the cultural heritage of Aksum is managed for future sustainability from the perspectives of heritage conservation and interpretation. A descriptive research design incorporating both quantitative and qualitative research methods was employed. Primary quantitative data were collected from 189 respondents (19 professionals, 88 tourism service providers, and 82 tourists), and interviews were conducted with 33 targeted informants from heritage and related professions, security employees, the local community, service providers, and church representatives, selected by applying probability and non-probability sampling methods. The findings of the study reveal that the overall sustainable management status of the cultural heritage of Aksum is below average. The sustainability of cultural heritage management in Aksum faces many unfavorable factors, such as a lack of long-term planning, an incompatible system of heritage administration, a limited capacity and number of professionals, scant attention to community-based heritage and tourism development, dirtiness and drainage problems, problems with stakeholder involvement and cooperation, and the lack of an organized interpretation and presentation system, among others. Therefore, reorganization of the management system, creating a platform for coordination among stakeholders, and developing an appropriate interpretation system can be good remedies. Introducing the concept of community-based heritage and tourism development is also recommended for long-term, win-win success in Aksum. Keywords: Aksum, conservation, interpretation, Sustainable Cultural Heritage Management
Procedia PDF Downloads 325
6144 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water
Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer
Abstract:
The extensive use of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination through diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globule size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. The morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was studied to confirm the absence of the tested drug using ICP-OES (inductively coupled plasma optical emission spectroscopy) and HPLC (high-performance liquid chromatography). The results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the predictions. Various green nanoemulsions were fabricated and evaluated in vitro. Globule size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in an acceptable range for deciding the input factors and their levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized nanoemulsion composition under the set constraints. The dispersion time was varied from 5 to 30 min. Finally, ICP-OES and HPLC corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient.Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, icp-oes, hansen program (HSPiP software)
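As background to the excipient-screening step, a minimal sketch of the standard Hansen solubility distance calculation is shown below. This is the general HSP method, not the authors' HSPiP workflow, and all parameter values are illustrative placeholders rather than measured data.

```python
import math

def hansen_distance(solute, excipient):
    """Hansen solubility distance Ra between two materials, each given as
    (dD, dP, dH) dispersion / polar / hydrogen-bonding parameters in MPa^0.5."""
    dD1, dP1, dH1 = solute
    dD2, dP2, dH2 = excipient
    return math.sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

def red(solute, excipient, r0):
    """Relative energy difference; RED < 1 suggests the excipient is a good match."""
    return hansen_distance(solute, excipient) / r0

# Illustrative (not measured) parameters for the drug and two candidate excipients.
drug = (19.0, 10.0, 11.0)
candidates = {'excipient_A': (16.5, 8.0, 9.0), 'excipient_B': (15.0, 3.0, 4.0)}
for name, hsp in candidates.items():
    print(name, round(red(drug, hsp, r0=8.0), 2))
```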
Procedia PDF Downloads 846143 The Relationship between Land Use Factors and Feeling of Happiness at the Neighbourhood Level
Authors: M. Moeinaddini, Z. Asadi-Shekari, Z. Sultan, M. Zaly Shah
Abstract:
Happiness can be related to everything that provides a feeling of satisfaction or pleasure. This study considers the relationship between land use factors and the feeling of happiness at the neighbourhood level. Land use variables (beautiful and attractive neighbourhood design, availability and quality of shopping centres, sufficient recreational spaces and facilities, and sufficient daily service centres) are used as independent variables, and the happiness score is used as the dependent variable. In addition to the land use variables, socio-economic factors (gender, race, marital status, employment status, education, and income) are also considered as independent variables. The study uses the Oxford happiness questionnaire to estimate the happiness scores of more than 300 people living in six neighbourhoods. The neighbourhoods were selected randomly from the Skudai neighbourhoods in Johor, Malaysia. The land use data were obtained by adding related questions to the Oxford happiness questionnaire. The strength of the relationships is estimated using generalised linear modelling (GLM). The findings indicate that an increase in the feeling of happiness is associated with higher income, a more beautiful and attractive neighbourhood design, and sufficient shopping centres, recreational spaces, and daily service centres. The results show that all land use factors in this study have a significant relationship with happiness, but only income, among the socio-economic factors, affects happiness significantly. Therefore, land use factors affect happiness in Skudai more than socio-economic factors.Keywords: neighbourhood land use, neighbourhood design, happiness, socio-economic factors, generalised linear modelling
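A minimal sketch of the kind of GLM fit described above, using statsmodels in Python; the column names, the input file, and the Gaussian family are assumptions for illustration, not taken from the study.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical survey data: one row per respondent.
# Column names are illustrative, not the study's actual variable names.
df = pd.read_csv("survey.csv")

# Happiness score as the dependent variable; land use and socio-economic
# variables as predictors. A Gaussian family with the default identity link
# reduces to ordinary linear regression, a common choice for continuous scores.
model = smf.glm(
    "happiness ~ income + C(gender) + C(marital_status) + education "
    "+ design + shopping + recreation + services",
    data=df,
    family=sm.families.Gaussian(),
).fit()

print(model.summary())  # coefficients, standard errors, p-values
```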
Procedia PDF Downloads 1496142 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face challenges in the analysis of very large data sets, which are growing at a noticeable rate in today’s and tomorrow’s technologies. Hadoop and Spark are software frameworks developed for this purpose; the Hadoop framework is suitable for many different hardware platforms. In this research, a scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high performance cluster infrastructure. SLBD is composed of a single cluster of identical, commodity-grade computers interconnected via a small LAN. It consists of a fast switch and Gigabit-Ethernet cards connecting four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. MapReduce divides a job into smaller tasks that are assigned to the network nodes; it then collects the partial results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability and cluster scalability.Keywords: big data platforms, cloudera manager, Hadoop, MapReduce
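A minimal illustration of the MapReduce pattern described above, written as a Hadoop Streaming style mapper and reducer in Python. This is a generic word-count sketch, not code from the SLBD cluster; the file name and invocation are assumptions.

```python
import sys

def mapper():
    """Map step: emit (word, 1) for every word read from stdin."""
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    """Reduce step: sum the counts for each word.
    Hadoop Streaming delivers the mapper output sorted by key."""
    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            count += int(value)
        else:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

if __name__ == "__main__":
    # Local simulation: cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
    mapper() if sys.argv[1] == "map" else reducer()
```

On a cluster, the same mapper and reducer would typically be submitted through the Hadoop Streaming jar, which handles task distribution across the nodes and the collection of partial results.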
Procedia PDF Downloads 3616141 Relevance of Copyright and Trademark in the Gaming Industry
Authors: Deeksha Karunakar
Abstract:
The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Therefore, video games are not made as singular, simple works but rather as collections of elements, each of which can be copyrighted on its own if it reaches a certain level of originality and creativity. A video game is made up of a wide variety of parts, all of which combine to form the overall experience that we, the players, have while playing. The entirety of these components is implemented in software code, which is then translated into the game's user interface. While copyright protection is already in place for software code, the work produced through that code can also be protected by copyright, including the game's storyline or narrative, its characters, and even elements of the code on their own. Every sector requires an appropriate legal framework, and the gaming industry is no exception; this underlines the importance of intellectual property laws in each sector. This paper will explore the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, with examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. Most of today's video games are both pieces of software and works of audio-visual art. Even though the existing legal framework does not have a clause specifically addressing video games, there are a great many alternative means by which such protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and its regulations, with the help of relevant case law, using a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries. Furthermore, it will provide in-depth knowledge of the relationship between these laws.Keywords: copyright, DMCA, gaming industry, trademark, WIPO
Procedia PDF Downloads 706140 Technical Non-Destructive Evaluation of Burnt Bridge at CH. 57+450 Along Abuja-Abaji-Lokoja Road, Nigeria
Authors: Abraham O. Olaniyi, Oluyemi Oke, Atilade Otunla
Abstract:
The structural performance of bridges decreases progressively throughout their service life due to many contributing factors (fatigue, carbonation, fire incidents, etc.). Around the world, numerous bridges have reached their estimated service life, and many are approaching this limit. The structural integrity assessment of the burnt composite bridge located at CH 57+450, Koita village, along the Abuja-Abaji-Lokoja road, Nigeria, is presented as a case study; it is hereafter referred to as the 'Koita bridge'. From the technical evaluation, the residual compressive strength of the concrete piers was found to be below 16.0 N/mm², which is very low compared to the design value of 30.0 N/mm². The pier capping beam at pier location 1 has a very low residual compressive strength. Certain capping beams show the outline of the reinforcement at the surface, which signifies inadequate concrete cover, and their mean compressive strength is also less than 20.0 N/mm². The steel girder showed black colouration as a result of the fire incident but no significant structural defect such as buckling or warping of the steel section. This paper reviews the structural integrity assessment and repair methodology of the Koita bridge, a composite bridge damaged by fire, highlighting the challenges posed by the limited guidance documents available for the bridge. The objectives are to increase the understanding of the processes and the versatile equipment required to test and assess a fire-damaged bridge in order to improve the quality of structural appraisal and rehabilitation, thus eliminating the bias associated with current visual inspection techniques.Keywords: assessment, bridge, rehabilitation, sustainability
Procedia PDF Downloads 366