Search results for: underestimation errors
291 Causes and Impacts of Rework Costs in Construction Projects
Authors: Muhammad Ejaz
Abstract:
Rework has been defined as "the unnecessary effort of re-doing a process or activity that was incorrectly implemented the first time." Rework is a great threat to the construction industry. By and large, due attention has not been given to avoiding the causes of rework in civil engineering projects, resulting in time and cost overruns. Besides these direct consequences, there may also be indirect consequences, such as stress, de-motivation, or the loss of future clients. When delivered products do not meet requirements or expectations, work often has to be redone. Rework occurs in various phases of the construction process and in various divisions of a company. It can occur on the construction site or in a management department due to, for example, poor materials management, and it can have internal or external origins; changes in clients' expectations are an example of an external factor that might lead to rework. Rework can cause many costs to be higher than calculated at the start of the project. Rework events can have many different origins, and for this research they have been categorized into four categories: changes, errors, omissions, and damages. The research showed that the major sources of rework were the unprofessional attitude of technical staff and the ignorance of total quality management principles by stakeholders. It also revealed that the sources of rework do not differ greatly among project categories. The causes were further analyzed by interviewing employees. Based on the existing literature, an extensive list of rework causes was compiled, and during the interviews the interviewees were asked to confirm or deny statements regarding rework causes. The causes that were most frequently confirmed can be grouped into the aforementioned categories: 56% (max) of the causes are change-related, 30% (max) are error-related, and 18% (max) fall into another category. Therefore, by recognizing the above-mentioned factors, rework can be reduced to a great extent.
Keywords: total quality management, construction industry, cost overruns, rework, material management, client's expectations
Procedia PDF Downloads 293
290 A Two-Week and Six-Month Stability of Cancer Health Literacy Classification Using the CHLT-6
Authors: Levent Dumenci, Laura A. Siminoff
Abstract:
Health literacy has been shown to predict a variety of health outcomes. Reliable identification of persons with limited cancer health literacy (LCHL) has proved questionable with existing instruments, which use an arbitrary cut point along a continuum. The CHLT-6, however, uses a latent mixture modeling approach to identify persons with LCHL. The purpose of this study was to estimate the two-week and six-month stability of identifying persons with LCHL using the CHLT-6, with a discrete latent variable approach as the underlying measurement structure. Using a test-retest design, the CHLT-6 was administered to cancer patients at two-week (N=98) and six-month (N=51) intervals. The two-week and six-month latent test-retest agreements were 89% and 88%, respectively. The chance-corrected latent agreements estimated from Dumenci's latent kappa were 0.62 (95% CI: 0.41 – 0.82) and 0.47 (95% CI: 0.14 – 0.80) for the two-week and six-month intervals, respectively. High levels of latent test-retest agreement between the limited and adequate categories of the cancer health literacy construct, coupled with moderate to good levels of chance-corrected latent agreement, indicate that the CHLT-6 classification of limited versus adequate cancer health literacy is relatively stable over time. In conclusion, the measurement structure underlying the instrument allows for estimating classification errors, circumventing the limitations of the arbitrary approaches adopted by all other instruments. The CHLT-6 can be used to identify persons with LCHL in oncology clinics and intervention studies to accurately estimate treatment effectiveness.
Keywords: limited cancer health literacy, the CHLT-6, discrete latent variable modeling, latent agreement
Procedia PDF Downloads 178
289 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering
Authors: Sara Hasani
Abstract:
This research focuses on natural sudden onset disasters, characterised as 'occurring with little or no warning and often causing excessive injuries far surpassing the national response capacities'. Based on a panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where data were available and cross-examined from various humanitarian sources. The records were then filtered to 4,252 disasters in which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed as a combination of pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis used to further validate the results. The results indicate that there is a relationship between the human impact of a disaster and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward which can predict a disaster's human impact based on its severity rank in the early hours after the disaster strikes. The predictions in this model are outlined as worst- and best-case scenarios, which inform the lower and upper range of the prediction, respectively. The necessity of developing such a predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating the needs at the time of a disaster has yet to be developed. It can further be used to allocate resources in the response phase of the disaster, when data are scarce.
Keywords: disaster management, natural disaster, pattern recognition, prediction
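The two-stage idea described in the abstract (a rule-based severity rank followed by a lookup over similar historical records to give a best-case and worst-case range) can be sketched roughly as below. The thresholds, feature weighting, and record structure are illustrative assumptions, not the study's calibrated model.

```python
# A minimal sketch, assuming a toy severity rule and a nearest-pattern lookup;
# thresholds and fields are invented for illustration only.
from dataclasses import dataclass
import math

@dataclass
class DisasterRecord:
    disaster_type: str
    hdi: float          # Human Development Index
    dri: float          # Disaster Risk Index
    population: float
    density: float
    fatalities: int
    injured: int
    homeless: int

def severity_rank(rec):
    """Toy rule-based clustering into severity ranks (1 = mild .. 3 = severe)."""
    score = rec.dri * (1.0 - rec.hdi) * math.log10(rec.population + 1)
    if score < 1.0:
        return 1
    elif score < 3.0:
        return 2
    return 3

def predict_impact(new_event, history):
    """Return (best_case, worst_case) total impact taken from the most similar
    historical events within the same severity rank."""
    rank = severity_rank(new_event)
    peers = [r for r in history if severity_rank(r) == rank
             and r.disaster_type == new_event.disaster_type]
    if not peers:
        peers = [r for r in history if severity_rank(r) == rank]
    impacts = sorted(r.fatalities + r.injured + r.homeless for r in peers)
    return impacts[0], impacts[-1]   # lower and upper range of the prediction
```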
Procedia PDF Downloads 153
288 Comparing Phonological Processes in Persian-Arabic Bilingual Children and Monolingual Children
Authors: Vafa Delphi, Maryam Delphi, Talieh Zarifian, Enayatolah Bakhshi
Abstract:
Background and Aim: Bilingualism is a common phenomenon in many countries of the world, and consistent consonant errors may be present in the speech of bilingual children. The aim of this study was to evaluate phonological skills, including the occurrence proportion, frequency, and type of phonological processes, in Persian-Arabic speaking children in Ahvaz city, the center of Khuzestan. Method: This study is descriptive-analytical and cross-sectional. Twenty-eight children aged 36-48 months were divided into two groups, Persian monolingual and Persian-Arabic bilingual (14 participants in each group). Participants were recruited randomly, based on the inclusion criteria, from kindergartens of Ahvaz city in Iran. The tool of this study was the Persian Phonological Test (PPT), a subtest of the Persian Diagnostic Evaluation Articulation and Phonological test. In this test, phonological processes were investigated in two groups: syllable structure and substitution processes. Data were analyzed using SPSS software and the Mann-Whitney U test. Results: The results showed that the occurrence proportion of substitution processes was significantly different between the monolingual and bilingual groups (P=0.001), but the type of phonological processes did not show a significant difference between the Persian monolingual and Persian-Arabic bilingual children. The frequency of phonological processes was greater in bilingual children than in monolingual children. Conclusion: The study showed that bilingualism has no effect on the type of phonological processes, but it can affect the frequency of processes. Since the type of phonological processes in bilingual children is similar to that in monolingual children, we can conclude that the phonological system of Persian-Arabic bilingual children is similar to that of monolingual children.
Keywords: Persian-Arabic bilingual child, phonological processes, the proportion occurrence of syllable structure, the proportion occurrence of substitution
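The group comparison reported above relies on the Mann-Whitney U test. A minimal sketch of that test with SciPy follows; the occurrence-proportion values are invented placeholders, not the study's data.

```python
# Minimal sketch of the Mann-Whitney U comparison; the values below are
# assumed placeholders for the two groups' occurrence proportions.
from scipy.stats import mannwhitneyu

monolingual = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.14, 0.07, 0.13, 0.10, 0.12, 0.09, 0.11, 0.08]
bilingual   = [0.21, 0.18, 0.25, 0.19, 0.22, 0.17, 0.24, 0.20, 0.23, 0.16, 0.19, 0.22, 0.18, 0.21]

u_stat, p_value = mannwhitneyu(monolingual, bilingual, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Occurrence proportion differs significantly between the groups.")
```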
Procedia PDF Downloads 316
287 Simulating the Dynamics of E-waste Production from Mobile Phone: Model Development and Case Study of Rwanda
Authors: Rutebuka Evariste, Zhang Lixiao
Abstract:
Mobile phone sales and stocks have shown exponential growth in the past years globally, and the number of mobile phones produced each year surpassed one billion in 2007. This soaring growth of related e-waste deserves sufficient attention regionally and globally, given that 40% of its total weight is made of metals, of which 12 elements are identified as highly hazardous and 12 as less harmful. Different research and methods have been used to estimate obsolete mobile phones, but none has developed a dynamic model or handled the discrepancies resulting from improper approaches and errors in the input data. The aim of this study was to develop a comprehensive dynamic system model for simulating the dynamics of e-waste production from mobile phones, regardless of the country or region, and to overcome the previous errors. The logistic model method combined with the STELLA program has been used to carry out this study. A simulation for Rwanda was then conducted and compared with other countries' results for model testing and validation. Rwanda had about 1.5 million obsolete mobile phones, with 125 tons of waste, in 2014, and its e-waste production is expected to peak in 2017. By 2020, 4.17 million obsolete phones with 351.97 tons of waste are expected, along with an environmental impact intensity 21 times that of 2005. Thus, it is concluded through model testing and validation that the present dynamic model is competent and able to deal with mobile phone e-waste production, as it has addressed the questions raised by previous studies from the Czech Republic, Iran, and China.
Keywords: carrying capacity, dematerialization, logistic model, mobile phone, obsolescence, similarity, Stella, system dynamics
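A rough sketch of the logistic (carrying-capacity) stock model that underlies such a system-dynamics simulation is given below. The carrying capacity, growth rate, handset lifespan, and unit weight are illustrative assumptions, not the calibrated Rwandan parameters, and the delayed-inflow treatment of obsolescence is a simplification of the STELLA model.

```python
# A minimal sketch, assuming a logistic stock curve plus a fixed-lifespan delay;
# all parameter values below are placeholders, not the study's calibration.
import numpy as np

K = 12e6        # carrying capacity: maximum mobile-phone stock (assumed)
r = 0.45        # intrinsic growth rate per year (assumed)
N0 = 0.1e6      # initial stock in the base year (assumed)
lifespan = 3    # average service life in years (assumed)
unit_weight_kg = 0.085  # average handset weight (assumed)

years = np.arange(2005, 2021)
t = years - years[0]
stock = K / (1 + ((K - N0) / N0) * np.exp(-r * t))   # logistic stock curve

# Obsolete phones in year i approximated as the phones that entered the stock
# `lifespan` years earlier (simple delayed-inflow assumption).
inflow = np.diff(stock, prepend=N0)
obsolete = np.roll(inflow, lifespan)
obsolete[:lifespan] = 0.0
ewaste_tons = obsolete * unit_weight_kg / 1000.0

for y, n, w in zip(years, obsolete, ewaste_tons):
    print(f"{y}: {n/1e6:5.2f} M obsolete phones, {w:7.1f} t e-waste")
```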
Procedia PDF Downloads 344
286 Object-Based Image Analysis for Gully-Affected Area Detection in the Hilly Loess Plateau Region of China Using Unmanned Aerial Vehicle
Authors: Hu Ding, Kai Liu, Guoan Tang
Abstract:
The Chinese Loess Plateau suffers from serious gully erosion induced by natural and human causes. The detection of gully features, including the gully-affected area and its two-dimensional parameters (length, width, area, etc.), is a significant task not only for researchers but also for policy-makers. This study aims at gully-affected area detection in three catchments of the Chinese Loess Plateau, selected in Changwu, Ansai, and Suide, using an unmanned aerial vehicle (UAV). The methodology includes a sequence of UAV data generation, image segmentation, feature calculation and selection, and random forest classification. Two experiments were conducted to investigate the influences of the segmentation strategy and feature selection. Results showed that the vertical and horizontal root-mean-square errors were below 0.5 and 0.2 m, respectively, which is ideal for the Loess Plateau region. The segmentation strategy adopted in this paper, which considers topographic information, and the optimal parameter combination can improve the segmentation results. Besides, the overall extraction accuracies achieved in Changwu, Ansai, and Suide were 84.62%, 86.46%, and 93.06%, respectively, which indicates that the proposed method for detecting gully-affected areas is more objective and effective than traditional methods. This study demonstrates that UAVs can bridge the gap between field measurement and satellite-based remote sensing, obtaining a balance between resolution and efficiency for catchment-scale gully erosion research.
Keywords: unmanned aerial vehicle (UAV), object-based image analysis, gully erosion, gully-affected area, Loess Plateau, random forest
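A minimal sketch of the classification stage (segment-level features fed to a random forest) follows; the feature names and synthetic data are assumptions for illustration only.

```python
# A sketch of random forest classification of image segments into
# gully-affected / not affected; features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_segments = 1000
X = np.column_stack([
    rng.normal(0.3, 0.1, n_segments),    # mean brightness of the segment
    rng.normal(0.5, 0.2, n_segments),    # texture measure (e.g., GLCM homogeneity)
    rng.normal(25.0, 10.0, n_segments),  # mean slope from the UAV-derived DEM
    rng.normal(0.0, 1.0, n_segments),    # curvature
])
# Toy labelling rule: steep, rough segments are more likely gully-affected.
y = ((X[:, 2] > 30) & (X[:, 1] < 0.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print("Overall extraction accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```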
Procedia PDF Downloads 218
285 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation
Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch
Abstract:
Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook's mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English, and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model's output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in the translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.
Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication
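A hedged sketch of Arabic-to-English seq2seq inference with a publicly available mBART-50 checkpoint is shown below; it uses the stock facebook/mbart-large-50-many-to-many-mmt model rather than the paper's fine-tuned ALEF model, and the example sentence is a placeholder. The ALEF fine-tuning and the GPT-based refinement step are not reproduced here.

```python
# Translation inference with the public mBART-50 checkpoint (assumption: this
# only illustrates the generation step, not the paper's fine-tuned model or
# its GPT-3.5/GPT-4 refinement stage).
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

arabic_text = "مرحبا بالعالم"          # placeholder source sentence
tokenizer.src_lang = "ar_AR"
encoded = tokenizer(arabic_text, return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],  # target = English
    max_length=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```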
Procedia PDF Downloads 10
284 Feasibility Study of MongoDB and Radio Frequency Identification Technology in Asset Tracking System
Authors: Mohd Noah A. Rahman, Afzaal H. Seyal, Sharul T. Tajuddin, Hartiny Md Azmi
Abstract:
Considering the real-world situation, higher academic institutions, small, medium, and large companies, and both public and private sectors experience inventory or asset shrinkage due to theft, loss, or even inventory tracking errors. This happens because of absent or poor security systems and measures being taken and implemented in these organizations. Henceforth, implementing Radio Frequency Identification (RFID) technology in any manual or existing web-based system or web application can deter and eventually solve certain major issues, serving better data retrieval and data access. Such a manual or existing system can also be enhanced into a mobile-based system or application. In addition, the availability of internet connections can support better services of the system. The involvement of these various technologies results in various benefits to individuals and organizations in terms of accessibility, availability, mobility, efficiency, effectiveness, real-time information, and security. This paper looks deeper into the integration of mobile devices with RFID technologies for the purpose of asset tracking and control. This is followed by the development and utilization of MongoDB as the main database to store data and its association with the RFID technology. Finally, a web-based system is developed which can be viewed in a mobile-based format with the aid of Hypertext Preprocessor (PHP), MongoDB, Hyper-Text Markup Language 5 (HTML5), Android, JavaScript, and AJAX.
Keywords: RFID, asset tracking system, MongoDB, NoSQL
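A small sketch of how an RFID tag read could be stored in MongoDB by such a system's backend is given below (written in Python with pymongo for brevity, although the stack named above is PHP/JavaScript). Collection names, fields, and the connection string are assumptions for illustration.

```python
# A minimal sketch, assuming a local MongoDB instance and an invented schema
# (rfid_tag, last_seen, history); not the paper's actual data model.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
db = client["asset_tracking"]
assets = db["assets"]

def record_tag_read(tag_id, reader_id, location):
    """Upsert the asset identified by its RFID tag and append a movement event."""
    event = {
        "reader_id": reader_id,
        "location": location,
        "timestamp": datetime.now(timezone.utc),
    }
    assets.update_one(
        {"rfid_tag": tag_id},
        {"$set": {"last_seen": event}, "$push": {"history": event}},
        upsert=True,
    )

record_tag_read("E2000017221101441890B0F4", reader_id="GATE-01", location="Block A / Lab 2")
print(assets.find_one({"rfid_tag": "E2000017221101441890B0F4"})["last_seen"]["location"])
```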
Procedia PDF Downloads 306
283 Mapping of Geological Structures Using Aerial Photography
Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash
Abstract:
Rapid growth in data acquisition technologies, such as drones, has led to advances and interest in collecting high-resolution images of geological fields. While these technologies are advantageous in capturing a high volume of data in short flights, a number of challenges have to be overcome for efficient analysis of the data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of problems with resolution, currency (the available image may have been captured a year earlier), availability, the difficulty of capturing the exact scene, and night-time imaging. The method combines advanced automated image interpretation technology with human data interaction to model structures. Geological structures are first detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model; dip and dip direction can be calculated from this information. The structural map is generated by adopting a specified methodology: choosing the appropriate camera, the camera mounting system, and the UAV design (based on the area and application); addressing challenges in airborne systems such as errors in image orientation and payload constraints; and proceeding through mosaicking, georeferencing, and registering of the different images to applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, particularly for remote, inaccessible, and hazardous sites.
Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures
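The dip and dip-direction computation mentioned above can be sketched as follows: fit a plane to bed-surface points extracted from the DEM and convert the plane gradient into a dip angle and a dip azimuth. The sample points are invented, and the coordinate convention (x = east, y = north, z = elevation) is an assumption.

```python
# A minimal sketch: least-squares plane fit z = ax + by + c, then dip angle and
# dip direction from the plane gradient; the points are placeholder values.
import numpy as np

# (x, y, z) points sampled on one exposed bedding surface (assumed values)
pts = np.array([
    [10.0, 20.0, 105.2],
    [35.0, 22.0, 103.9],
    [12.0, 55.0, 101.1],
    [40.0, 60.0,  99.5],
    [25.0, 40.0, 102.4],
])

A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
(a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)

dip = np.degrees(np.arctan(np.hypot(a, b)))              # dip angle from horizontal
dip_direction = np.degrees(np.arctan2(-a, -b)) % 360     # azimuth of steepest descent

print(f"dip = {dip:.1f} deg, dip direction = {dip_direction:.1f} deg")
```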
Procedia PDF Downloads 686
282 Origin Variability of Superior Vesical Artery
Authors: Waseem Al-Talalwah
Abstract:
The superior vesical artery usually arises directly from the anterior division of the internal iliac artery. It may arise from the umbilical artery as three or four branches to supply the upper and middle parts of the bladder. The current study focuses on the different origins of the superior vesical artery to provide sufficient data for surgeons to decrease iatrogenic faults. The superior vesical artery arises directly from the anterior division of the internal iliac artery in 24.5%, whereas it arises indirectly, from the umbilical artery, in 83.7%. Further, it may arise from any branch of the anterior division, such as the uterine and obturator arteries, in 6.1% and 6.3%, respectively. It also shares the origin of the internal pudendal and inferior gluteal arteries, as it arises from the gluteopudendal trunk in 4.1%. The superior vesical artery arises as a single, double, triple, and quadruple vessel in 69.4%, 20.4%, 8.2%, and 2%, respectively. In cases of cystectomy for bladder cancer, surgeons have to be aware of the origin variability of the superior vesical artery to prevent post-surgical complications such as intra-pelvic bleeding. Intra-pelvic bleeding also has to be expected in cases of hysterectomy; therefore, great caution regarding the vesical branches arising from the uterine artery has to be exercised. In cases of aneurysm resection of the inferior gluteal artery arising from the gluteopudendal trunk, surgeons have to be careful of the vascular supply of the urinary bladder coming from above and below this common trunk, from the superior and inferior vesical arteries, respectively. Therefore, the present study increases awareness of the clinical significance of the superior vesical artery origin for surgeons, in order to minimise iatrogenic errors.
Keywords: superior vesical artery, anterior division, internal iliac, internal pudendal, inferior gluteal, intra-pelvic bleeding, hysterectomy, cystectomy
Procedia PDF Downloads 394
281 Near Optimal Closed-Loop Guidance Gains Determination for Vector Guidance Law, from Impact Angle Errors and Miss Distance Considerations
Authors: Karthikeyan Kalirajan, Ashok Joshi
Abstract:
An optimization problem is set up to maximize the terminal kinetic energy of a maneuverable reentry vehicle (MaRV). The target location and the impact angle are given as constraints. The MaRV uses an explicit guidance law called Vector guidance. This law has two gains, which are taken as decision variables. The problem is to find the optimal values of these gains that result in minimum miss distance and impact angle error. Using a simple 3DOF non-rotating flat earth model and the Lockheed Martin HP-MARV as the reentry vehicle, the nature of the solutions of the optimization problem is studied. This is achieved by carrying out a parametric study over a range of closed-loop gain values and generating the corresponding impact angle error and miss distance values. The results show that there are well-defined lower and upper bounds on the gains that result in a near-optimal terminal guidance solution. It is found from this study that there exist common permissible regions (values of gains) where all constraints are met. Moreover, the permissible region lies between flat regions, and hence the optimization algorithm has to be chosen carefully. It is also found that only one of the gain values is independent and that the other, dependent gain value is related to it through a simple straight-line expression. Moreover, to reduce the computational burden of finding the optimal values of two gains, a guidance law called Diveline guidance, which uses a single gain, is discussed. The derivation of the Diveline guidance law from the Vector guidance law is presented in this paper.
Keywords: MaRV guidance, reentry trajectory, trajectory optimization, guidance gain selection
Procedia PDF Downloads 427
280 Apparent Temperature Distribution on Scaffoldings during Construction Works
Authors: I. Szer, J. Szer, K. Czarnocki, E. Błazik-Borowa
Abstract:
People on construction scaffoldings work in a dynamically changing, often unfavourable climate. Additionally, this kind of work is performed on low-stiffness structures at height, which increases the risk of accidents. It is therefore desirable to define the parameters of the work environment that contribute to increasing the occupational safety level of construction workers. The aim of this article is to present how changes in microclimate parameters on scaffolding can contribute to the development of dangerous situations and accidents. For this purpose, indicators based on the human thermal balance were used. However, the use of this model under construction conditions is often burdened by significant errors or is even impossible to implement due to the lack of precise data. Thus, in the target model, a modified parameter was used: the apparent environmental temperature. The apparent temperature in the proposed Scaffold Use Risk Assessment Model is the perceived outdoor temperature, caused by the combined effects of air temperature, radiative temperature, relative humidity, and wind speed (wind chill index, heat index). In the paper, correlations between the component factors and the apparent temperature are presented for a facade scaffolding with a width of 24.5 m and a height of 42.3 m, located on the south-west side of a building. The distribution of factors on the scaffolding has been used to evaluate the fit of the microclimate model. The results of the studies indicate that the observed ranges of apparent temperature on the scaffolds frequently result in a worker's inability to adapt. This leads to reduced concentration and increased fatigue, adversely affects health, and consequently increases the risk of dangerous situations and accidental injuries.
Keywords: apparent temperature, health, safety work, scaffoldings
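The wind chill and heat index components named above can be sketched with the commonly published regressions (the metric Environment Canada/NWS wind chill and the Rothfusz heat index). These are standard approximations, not necessarily the exact formulation used in the Scaffold Use Risk Assessment Model, and the example inputs are invented.

```python
# A minimal sketch of the two apparent-temperature components, using published
# regression constants; validity ranges are simplified.
def wind_chill_c(temp_c, wind_kmh):
    """Wind chill in deg C; defined for temp <= 10 C and wind > 4.8 km/h."""
    if temp_c > 10 or wind_kmh <= 4.8:
        return temp_c
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

def heat_index_c(temp_c, rh_percent):
    """Heat index in deg C (Rothfusz regression, applied above ~27 C)."""
    if temp_c < 27:
        return temp_c
    t = temp_c * 9 / 5 + 32            # the regression is defined in Fahrenheit
    r = rh_percent
    hi_f = (-42.379 + 2.04901523 * t + 10.14333127 * r - 0.22475541 * t * r
            - 6.83783e-3 * t * t - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)
    return (hi_f - 32) * 5 / 9

# Example: a cold, windy morning and a hot, humid afternoon on the scaffold
print(f"{wind_chill_c(2.0, 25.0):.1f} C apparent (wind chill)")
print(f"{heat_index_c(32.0, 70.0):.1f} C apparent (heat index)")
```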
Procedia PDF Downloads 182
279 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements
Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck
Abstract:
This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from a previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The estimated results of the numerical model showed maximum average percent errors of approximately 53% and 37% for wind incidents from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow
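The statistical measures cited above (FB, MG, NMSE) have compact definitions that can be sketched as follows; the sample arrays are placeholders, not the MUST dataset.

```python
# A minimal sketch of the standard air-quality validation metrics; the
# observed/predicted concentration arrays below are invented examples.
import numpy as np

def fractional_bias(obs, pred):
    return 2.0 * (np.mean(obs) - np.mean(pred)) / (np.mean(obs) + np.mean(pred))

def geometric_mean_bias(obs, pred):
    return np.exp(np.mean(np.log(obs)) - np.mean(np.log(pred)))

def nmse(obs, pred):
    return np.mean((obs - pred) ** 2) / (np.mean(obs) * np.mean(pred))

obs = np.array([1.2, 0.8, 2.5, 3.1, 0.6, 1.9])    # observed concentrations
pred = np.array([1.0, 0.9, 2.2, 3.5, 0.7, 1.6])   # CFD-predicted concentrations

print(f"FB = {fractional_bias(obs, pred):+.3f}")
print(f"MG = {geometric_mean_bias(obs, pred):.3f}")
print(f"NMSE = {nmse(obs, pred):.3f}")
```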
Procedia PDF Downloads 135
278 A Preparatory Method for Building Construction Implemented in a Case Study in Brazil
Authors: Aline Valverde Arroteia, Tatiana Gondim do Amaral, Silvio Burrattino Melhado
Abstract:
During the last twenty years, the construction field in Brazil has evolved significantly in response to market growth and competitiveness. However, this evolution has faced many obstacles, such as cultural barriers and the lack of effort to achieve quality at the construction site. At the same time, a great amount of the information generated in the design and construction phases is lost due to the lack of effective coordination of these activities. Facing this problem, the aim of this research was to implement a French method named PEO, which stands for preparation for building construction (in Portuguese), seeking to understand the design management process and its interface with the building construction phase. The research method applied was qualitative, and it was carried out through two case studies in the city of Goiania, in Goias, Brazil. The research was divided into two stages: a pilot study at Company A and the implementation of PEO at Company B. After the implementation, the results demonstrated the PEO method's effectiveness and feasibility, as well as its role as a booster of quality improvement in design management. The analysis showed that the method aims to improve the design and allows the reduction of the failures, errors, and rework commonly found in the production of buildings. Therefore, it can be concluded that PEO is feasible to apply in real estate and building companies. However, companies need to believe in the contribution they can make to the discovery of design failures in conjunction with the other stakeholders forming a construction team. The results of PEO can be maximized by adopting the principles of simultaneous engineering and inserting new computer technologies, which use a three-dimensional model of the building within the BIM process.
Keywords: communication, design and construction interface management, preparation for building construction (PEO), proactive coordination (CPA)
Procedia PDF Downloads 162
277 Comparison of Different Machine Learning Algorithms for Solubility Prediction
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms, namely linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks, for predicting molecular solubility using the AqSolDB dataset. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) were extracted as features for every SMILES representation in the dataset; a total of 189 features were used for training and testing for every molecule. Each algorithm was trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing was recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in terms of predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
Keywords: random forest, machine learning, comparison, feature extraction
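A compact sketch of such a model-comparison loop is given below. Real MACCS/RDKit descriptors are replaced by a synthetic 189-feature matrix so the snippet stays self-contained, and the R² returned by scikit-learn's score() is used in place of the paper's accuracy metric; both substitutions are assumptions.

```python
# A minimal sketch of comparing the five model families on the same features;
# data and the scoring choice are placeholders, not the AqSolDB setup.
import time
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=2000, n_features=189, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "SVM": SVR(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    "neural network": MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0),
}

for name, model in models.items():
    start = time.time()
    model.fit(X_train, y_train)
    score = model.score(X_test, y_test)     # R^2 here; the paper reports accuracy
    print(f"{name:18s}  score={score:.3f}  time={time.time() - start:.1f}s")
```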
Procedia PDF Downloads 40
276 Formulation and Optimization of Topical 5-Fluorouracil Microemulsions Using Central Composite Design
Authors: Sudhir Kumar, V. R. Sinha
Abstract:
Water-in-oil topical microemulsions of 5-FU were developed and optimized using a face-centered central composite design. Topical w/o microemulsions of 5-FU were prepared using sorbitan monooleate (Span 80) and polysorbate 80 (Tween 80) with different oils such as oleic acid (OA), triacetin (TA), and isopropyl myristate (IPM). The ternary phase diagrams designated the microemulsion region, and the face-centered central composite design helped in determining the effects of the selected variables, viz. type of oil, Smix ratio, and water concentration, on responses such as drug content, globule size, and viscosity of the microemulsions. The CCD showed that the factors have statistically significant effects (p<0.01) on the selected responses. The actual responses showed excellent agreement with the predicted values suggested by the CCD, with low residual standard errors. Similarly, the optimized values were found within the range predicted by the model. Furthermore, other characteristics of the microemulsions, such as pH and conductivity, were investigated. For the optimized microemulsion batch, ex-vivo skin flux, skin irritation, and retention studies were performed and compared with a marketed 5-FU formulation. In the ex-vivo skin permeation studies, higher skin retention of the drug and minimal flux were achieved for the optimized microemulsion batch compared with the marketed cream. The results confirmed the actual responses to be in agreement with the predicted ones, with the least residual standard errors. Controlled release of the drug was achieved for the optimized batch with higher skin retention of 5-FU, which can further be utilized for the treatment of many dermatological disorders.
Keywords: 5-FU, central composite design, microemulsion, ternary phase diagram
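A small sketch of a face-centred central composite design (alpha = 1) in coded units for three factors is given below; mapping the coded levels back to the actual oil type, Smix ratio, and water concentration settings is left as an assumption, since the paper's level settings are not reproduced here.

```python
# A minimal sketch of a coded face-centred CCD for three factors: 8 factorial
# corner runs, 6 face-centred axial runs, and replicated centre runs.
# The factor names and number of centre replicates are assumptions.
import itertools
import numpy as np

factorial = np.array(list(itertools.product([-1, 1], repeat=3)))       # 8 corner runs
axial = np.array([[s if i == j else 0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])                  # 6 face-centred runs
center = np.zeros((3, 3))                                               # 3 centre runs (assumed)
design = np.vstack([factorial, axial, center])

factor_names = ["oil (coded)", "Smix ratio (coded)", "water % (coded)"]
print(f"{len(design)} runs")
for run in design:
    print(dict(zip(factor_names, run)))
```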
Procedia PDF Downloads 479
275 Acceleration-Based Motion Model for Visual Simultaneous Localization and Mapping
Authors: Daohong Yang, Xiang Zhang, Lei Li, Wanting Zhou
Abstract:
Visual Simultaneous Localization and Mapping (VSLAM) is a technology that obtains information from the environment for self-positioning and mapping. It is widely used in computer vision, robotics, and other fields. Many visual SLAM systems, such as ORBSLAM3, employ a constant-velocity motion model that provides the initial pose of the current frame to improve the speed and accuracy of feature matching. However, in actual situations, the constant-velocity motion model is often not satisfied, which may lead to a large deviation between the obtained initial pose and the real value and may introduce errors into the nonlinear optimization results. Therefore, this paper proposes a motion model based on acceleration, which can be applied to most SLAM systems. In order to better describe the acceleration of the camera pose, we decouple the pose transformation matrix and calculate the rotation matrix and the translation vector separately, where the rotation matrix is represented by a rotation vector. We assume that, over a short period of time, the changes in the rotational angular velocity and in the translation vector remain the same. Based on this assumption, the initial pose of the current frame is estimated. In addition, the error of the constant-velocity model is analyzed theoretically. Finally, we applied the proposed approach to the ORBSLAM3 system and evaluated two sets of sequences from the TUM dataset. The results show that our proposed method gives a more accurate initial pose estimate, and the accuracy of the ORBSLAM3 system is improved by 6.61% and 6.46% on the two test sequences, respectively.
Keywords: error estimation, constant acceleration motion model, pose estimation, visual SLAM
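One way to read the abstract's assumption (the frame-to-frame change of the motion increment stays constant over a short interval) is sketched below, with the rotational part handled as a rotation vector; this is an interpretation for illustration, not the authors' implementation.

```python
# A minimal sketch of constant-acceleration pose prediction from the three most
# recent poses; the decoupling into rotation vector + translation follows the
# abstract, the composition convention is an assumption.
import numpy as np
from scipy.spatial.transform import Rotation as R

def predict_next_pose(poses):
    """poses: list of the three most recent (R, t) camera poses, oldest first.
    Returns a predicted initial pose for the next frame."""
    (R0, t0), (R1, t1), (R2, t2) = poses

    def increment(Ra, ta, Rb, tb):
        # Relative motion between consecutive frames, rotation as rotation vector
        R_rel = Rb @ Ra.T
        return R.from_matrix(R_rel).as_rotvec(), tb - R_rel @ ta

    w1, v1 = increment(R0, t0, R1, t1)
    w2, v2 = increment(R1, t1, R2, t2)
    # Constant-acceleration extrapolation of the increments
    w3 = w2 + (w2 - w1)
    v3 = v2 + (v2 - v1)
    R_rel_pred = R.from_rotvec(w3).as_matrix()
    return R_rel_pred @ R2, R_rel_pred @ t2 + v3

# Tiny demo with three poses moving by nearly constant increments
Rs = [R.from_rotvec([0.0, 0.0, 0.01 * k]).as_matrix() for k in range(3)]
ts = [np.array([0.1 * k, 0.0, 0.0]) for k in range(3)]
R_pred, t_pred = predict_next_pose(list(zip(Rs, ts)))
print(np.round(t_pred, 3))
```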
Procedia PDF Downloads 94
274 Quantification of Soft Tissue Artefacts Using Motion Capture Data and Ultrasound Depth Measurements
Authors: Azadeh Rouhandeh, Chris Joslin, Zhen Qu, Yuu Ono
Abstract:
The centre of rotation of the hip joint is needed for an accurate simulation of joint performance in many applications, such as pre-operative planning simulation, human gait analysis, and the study of hip joint disorders. In human movement analysis, the hip joint centre can be estimated using a functional method based on the relative motion of the femur to the pelvis, measured using reflective markers attached to the skin surface. The principal source of error in estimating the hip joint centre location using functional methods is soft tissue artefact, due to the relative motion between the markers and the bone. One of the main objectives in human movement analysis is the assessment of soft tissue artefact, as the accuracy of functional methods depends upon it. Various studies have described the movement of soft tissue artefact invasively, using, for example, intra-cortical pins, external fixators, percutaneous skeletal trackers, and Roentgen photogrammetry. The goal of this study is to present a non-invasive method to assess the displacements of the markers relative to the underlying bone using optical motion capture data and tissue thickness from ultrasound measurements during flexion, extension, and abduction (all with the knee extended) of the hip joint. The results show that the artefact skin marker displacements are non-linear and larger in areas closer to the hip joint. Marker displacements also depend on the movement type and are relatively larger in the abduction movement. The quantification of soft tissue artefacts can be used as a basis for a correction procedure for hip joint kinematics.
Keywords: hip joint center, motion capture, soft tissue artefact, ultrasound depth measurement
Procedia PDF Downloads 281
273 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation
Authors: Parthasarathy J., Ramshankar C. S.
Abstract:
Engineering drawings these days play an important role in every part of an industry. By and large, engineering drawings are influential in every phase of the product development process. Traditionally, drawings are used for communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents. The need for engineering drawings has seemed inevitable; still, engineering drawings are disadvantageous in that they require the re-entry of data throughout the manufacturing life cycle. This document-based approach is prone to errors and requires costly re-entry of data at every stage of the manufacturing life cycle. So there is a requirement to eliminate engineering drawings throughout the product development process and to implement 3D Model Based Engineering (3D MBE or 3D MBD). Adopting MBD appears to be the next logical step to continue reducing time-to-market and improve product quality. Ideally, by fully applying the MBD concept, the product definition will no longer rely on engineering drawings throughout the product lifecycle. This project addresses the need for engineering drawings and their influence in various parts of an industry, the need to implement 3D Model Based Engineering with its advantages, and the technical barriers that must be overcome in order to implement it. This project also addresses the requirements for neutral formats and their realisation in order to implement the digital product definition principles in a light format. In order to prove the concepts of 3D Model Based Engineering, the screw jack body part is also demonstrated. At ZF Windpower Coimbatore Limited, 3D Model Based Definition has been implemented for the torque arm (machining and casting), steel tube, pinion shaft, cover, and energy tube.
Keywords: engineering drawing, model based engineering MBE, MBD, CAD
Procedia PDF Downloads 435
272 A Pilot Study to Investigate the Use of Machine Translation Post-Editing Training for Foreign Language Learning
Authors: Hong Zhang
Abstract:
The main purpose of this study is to show that machine translation (MT) post-editing (PE) training can help our Chinese students learn Spanish as a second language. Our hypothesis is that they might make better use of MT by learning PE skills specific to foreign language learning. We have developed PE training materials based on the data collected in a previous study. The training material included the particular error types found in MT output and the error types that our Chinese students studying Spanish could not detect in the previous year's experiment. This year we performed a pilot study in order to evaluate the effectiveness of the PE training materials and the extent to which PE training helps Chinese students who study the Spanish language. We used screen recording to capture the sessions and made a note of every action taken by the students. Participants were speakers of Chinese with intermediate knowledge of Spanish. They were divided into two groups: Group A performed PE training and Group B did not. We prepared a Chinese text for both groups; participants translated it by themselves (human translation), then used Google Translate to translate the text and were asked to post-edit the raw MT output. Comparing the results of the PE test, Group A could identify and correct the errors faster than Group B; Group A did especially better on omission, word order, part of speech, terminology, mistranslation, official names, and formal register. From the results of this study, we can see that PE training can help Chinese students learn Spanish as a second language. In the future, we will focus on the students' struggles during their Spanish studies and complete the PE training materials for teaching Chinese students who are learning Spanish with machine translation.
Keywords: machine translation, post-editing, post-editing training, Chinese, Spanish, foreign language learning
Procedia PDF Downloads 144
271 Quality Control of 99mTc-Labeled Radiopharmaceuticals Using the Chromatography Strips
Authors: Yasuyuki Takahashi, Akemi Yoshida, Hirotaka Shimada
Abstract:
99mTc-2-methoxy-isobutyl-isonitrile (MIBI) and 99mTc-mercaptoacetylglycylglycyl-glycine (MAG3) are heated to 368-372 K and labeled with 99mTc-pertechnetate. Quality control (QC) of 99mTc-labeled radiopharmaceuticals is performed at hospitals using liquid chromatography, which is difficult to carry out in general hospitals. We used chromatography strips to simplify QC and investigated the effects of the test procedures on quality control. The radiopharmaceutical examined in this study is 99mTc-MAG3. The solvent was chloroform + acetone + tetrahydrofuran, and the gamma counter was an ARC-380CL. The varied conditions were the heating temperature, the resting time after labeling, and the expiration year for use: the tested values were 293, 313, 333, 353, and 372 K; 15 min (at 293 K and 372 K) and 1 hour (at 293 K); and 2011, 2012, 2013, 2014, and 2015, respectively. The measurement time on the gamma counter was one minute. A nuclear medicine clinician judged the quality of the preparation when deciding whether the agent needed to be retested before use. Two people conducted the test procedure twice in order to compare reproducibility. The percentage radiochemical purity (%RCP) was approximately 50% under insufficient heat treatment and improved as the temperature and heating time increased. Moreover, the %RCP improved with resting time even at low temperatures. Furthermore, there was no deterioration with time after the expiration date. The objective of these tests was to determine soluble 99mTc impurities, including 99mTc-pertechnetate and hydrolyzed-reduced 99mTc. Therefore, we attributed the low purity to insufficient heating and to operational errors in the labeling. It is concluded that quality control is a necessary procedure in nuclear medicine to ensure safe scanning, and that labeling should be performed according to the stated specifications.
Keywords: quality control, Tc-99m labeled radiopharmaceutical, chromatography strip, nuclear medicine
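The percent radiochemical purity (%RCP) computed from a cut-and-count chromatography strip can be sketched as below; the segment counts and the segments assigned to the labelled compound are illustrative assumptions.

```python
# A minimal sketch of the %RCP calculation: counts in the segments where the
# labelled agent migrates divided by total strip counts. Values are invented.
def percent_rcp(segment_counts_cpm, labelled_segments):
    """segment_counts_cpm: counts per minute for each cut strip segment;
    labelled_segments: indices of segments where the 99mTc-MAG3 migrates."""
    total = sum(segment_counts_cpm)
    labelled = sum(segment_counts_cpm[i] for i in labelled_segments)
    return 100.0 * labelled / total

# Example: ten segments; impurities (free pertechnetate, hydrolysed-reduced Tc)
# remain near the origin, while the labelled agent migrates to segments 6-9.
counts = [950, 420, 180, 90, 60, 55, 2100, 5400, 3800, 700]
print(f"%RCP = {percent_rcp(counts, labelled_segments=range(6, 10)):.1f}%")
```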
Procedia PDF Downloads 322
270 Improving Efficiency and Effectiveness of FMEA Studies
Authors: Joshua Loiselle
Abstract:
This paper discusses the challenges engineering teams face in conducting Failure Modes and Effects Analysis (FMEA) studies, focusing on the specific topic of improving the efficiency and effectiveness of FMEA studies. Modern economic needs and increased business competition require engineers to constantly develop newer and better solutions within shorter timeframes and tighter margins. In addition, documentation requirements for meeting standards/regulatory compliance and customer needs are becoming increasingly complex and verbose. Managing open actions and continuous improvement activities across all projects, product variations, and processes, in addition to daily engineering tasks, is cumbersome, time consuming, and susceptible to errors, omissions, and non-conformances. FMEA studies are proven methods for improving products and processes while subsequently reducing engineering workload and improving machine and resource availability through a pre-emptive, systematic approach of identifying, analyzing, and improving high-risk components. If implemented correctly, FMEA studies significantly reduce costs and improve productivity. However, the value of an effective FMEA is often shrouded by a lack of clarity and structure, misconceptions, and previous experiences; as such, FMEA studies are frequently grouped with the other required information and documented retrospectively in preparation for customer requirements or audits. Performing studies in this way only adds cost to a project and perpetuates the misconception that FMEA studies are not value-added activities. This paper discusses the benefits of effective FMEA studies, the challenges related to conducting FMEA studies, best practices for efficiently overcoming these challenges via structure and automation, and the benefits of implementing those practices.
Keywords: FMEA, quality, APQP, PPAP
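The risk-ranking step at the core of an FMEA can be sketched as below: each failure mode is scored for severity, occurrence, and detection (typically 1-10) and ranked by the risk priority number RPN = S x O x D. The example failure modes and scores are invented.

```python
# A minimal sketch of RPN calculation and ranking for a small FMEA worksheet;
# the failure modes and their S/O/D scores are placeholder examples.
failure_modes = [
    {"mode": "Seal leaks under thermal cycling", "S": 8, "O": 4, "D": 5},
    {"mode": "Connector mis-seated at assembly", "S": 6, "O": 7, "D": 3},
    {"mode": "Firmware watchdog not re-armed",   "S": 9, "O": 2, "D": 8},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]      # risk priority number

# Highest-risk items first, to drive pre-emptive improvement actions
for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    print(f"RPN {fm['RPN']:3d}  {fm['mode']}")
```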
Procedia PDF Downloads 304
269 The Relationship between Renewable Energy, Real Income, Tourism and Air Pollution
Authors: Eyup Dogan
Abstract:
One criticism of the energy-growth-environment literature, to the best of our knowledge, is that only a few studies analyze the influence of tourism on CO₂ emissions, even though the tourism sector is closely related to the environment. The other criticism concerns the selection of methodology: panel estimation techniques that fail to consider both heterogeneity and cross-sectional dependence across countries can cause forecasting errors. To fill these gaps in the literature, this study analyzes the impacts of real GDP, renewable energy, and tourism on the levels of carbon dioxide (CO₂) emissions for the top 10 most-visited countries around the world. This study focuses on the top 10 touristic (most-visited) countries because they have received about half of worldwide tourist arrivals in recent years and are among the top countries in the 'Renewables Energy Country Attractiveness Index (RECAI)'. By looking at Pesaran's CD test and the average growth rates of the variables for each country, we detect the presence of cross-sectional dependence and heterogeneity. Hence, this study uses second-generation econometric techniques (the cross-sectionally augmented Dickey-Fuller (CADF) and cross-sectionally augmented IPS (CIPS) unit root tests, the LM bootstrap cointegration test, and the DOLS and FMOLS estimators), which are robust to the mentioned issues. The reported results therefore become accurate and reliable. It is found that renewable energy mitigates pollution, whereas real GDP and tourism contribute to carbon emissions. Thus, regulatory policies are necessary to increase awareness of sustainable tourism. In addition, the use of renewable energy and the adoption of clean technologies in the tourism sector, as well as in producing goods and services, play significant roles in reducing the levels of emissions.
Keywords: air pollution, tourism, renewable energy, income, panel data
Procedia PDF Downloads 184
268 Performance Evaluation of the CareSTART S1 Analyzer for Quantitative Point-Of-Care Measurement of Glucose-6-Phosphate Dehydrogenase Activity
Authors: Haiyoung Jung, Mi Joung Leem, Sun Hwa Lee
Abstract:
Background & Objective: Glucose-6-phosphate dehydrogenase (G6PD) deficiency is a genetic abnormality that results in an inadequate amount of G6PD, leading to increased susceptibility of red blood cells to reactive oxygen species and hemolysis. The present study aimed to evaluate the careSTART S1 analyzer for measuring the ratio of G6PD activity to hemoglobin (Hb). Methods: Precision for G6PD activity and hemoglobin measurement was evaluated using control materials at two levels on five repeated runs per day for five days. The analytical performance of the careSTART S1 analyzer was compared with spectrophotometry in 40 patient samples. The reference ranges suggested by the manufacturer were validated in 20 healthy males and 20 healthy females. Results: The careSTART S1 analyzer demonstrated precision of 6.0% for the low-level control (14~45 U/dL) and 2.7% for the high-level control (60~90 U/dL) in G6PD activity, and 1.4% in hemoglobin (7.9~16.3 u/g Hb). A comparison study of the G6PD to Hb ratio between the careSTART S1 analyzer and spectrophotometry showed an average difference of 29.1%, with a positive bias for the careSTART S1 analyzer. All normal samples from the healthy population validated the suggested reference ranges for males (≥2.19 U/g Hb) and females (≥5.83 U/g Hb). Conclusion: The careSTART S1 analyzer demonstrated good analytical performance and can replace the current spectrophotometric measurement of G6PD enzyme activity. From the perspective of clinical laboratory management, it can be a reasonable option as a point-of-care analyzer, with minimal handling of samples and reagents and automatic calculation of the ratio of measured G6PD activity to Hb concentration, minimizing any clerical errors involved in manual calculation.
Keywords: POCT, G6PD, performance evaluation, careSTART
Procedia PDF Downloads 64
267 The Architectural Conservation and Restoration Problems of Istanbul’s “Yalı” Waterfront Mansions
Authors: Zeynep Tanrıverdi
Abstract:
The Bosphorus is an international waterway in the city of Istanbul, Turkey, connecting the Sea of Marmara and the Black Sea. The Bosphorus, which has formed an important part of the silhouette of Istanbul throughout history, has also influenced the design of the coastal structures built around it. The waterfront mansions located on both sides of the Bosphorus by the sea, generally of two or three storeys, are called "yalı". The yalı buildings, with their architectural characteristics of the traditional Turkish house, are the most grandiose examples of Ottoman residential architecture. However, the classical Ottoman yalı architecture of the 18th century can only be seen in engravings, and today only the more modest and smaller yalı examples from the 19th century survive, because of the disappearance of the others over time. The study aims to reveal the architectural conservation and restoration problems of the waterfront mansions and propose solutions for them. Firstly, the development of waterfront mansion architecture on the Bosphorus was evaluated in its historical process. Secondly, the waterfront mansions and their architectural features were explained. Thirdly, the architectural conservation and restoration problems that have caused the disappearance of waterfront mansions were discussed. These problems include disruptions in the legal regulations and practices concerning the Bosphorus, dramatic changes in Turkey's socio-cultural life from the Ottoman Empire to the present, inadequate economic resources, negative environmental effects, and errors in restoration works. Finally, solutions were proposed for the problems that threaten the protection of the waterfront mansions. In the study, the literature on waterfront mansions was reviewed using historical reports, photographs, maps, and drawings from archival documents. It is hoped that this study will contribute to the conservation of the yalı waterfront mansions, which occupy a particular place in the cultural heritage of Turkey, and to their transmission, with their authentic values, to the next generation.
Keywords: Bosphorus architecture, conservation, heritage, Istanbul, waterfront mansions (yalı)
Procedia PDF Downloads 77
266 Role of Vision Centers in Eliminating Avoidable Blindness Caused Due to Uncorrected Refractive Error in Rural South India
Authors: Ranitha Guna Selvi D, Ramakrishnan R, Mohideen Abdul Kader
Abstract:
Purpose: To study the role of vision centers in managing preventable blindness through refractive error correction in rural South India. Methods: A retrospective analysis of patients attending 15 vision centers in rural South India from January 2021 to December 2021 was done. Medical records of 1,08,581 patients, both new and reviewed (79,562 newly registered patients and 29,019 review patients), from the 15 vision centers were included for data analysis. All the patients registered at the vision centers underwent a basic eye examination, including visual acuity, IOP measurement, slit-lamp examination, retinoscopy, fundus examination, etc. Results: A total of 1,08,581 patients were included in the study. Of these, 79,562 were newly registered patients at the vision centers and 29,019 were review patients; 52,201 (48.1%) were males and 56,308 (51.9%) were females. The mean age of all examined patients was 41.03 ± 20.9 years (standard deviation) and ranged from 1 to 113 years. Presenting mean visual acuity was 0.31 ± 0.5 in the right eye and 0.31 ± 0.4 in the left eye. Of the 1,08,581 patients, 22,770 had uncorrected refractive error in the right eye and 22,721 in the left eye. A glass prescription was given to 17,178 (15.8%) patients, and 8,109 (7.5%) patients were referred to the base hospital for a specialty clinic expert opinion or for cataract surgery. Conclusion: A vision center utilizing teleconsultation as a comprehensive eye screening unit is a very effective tool in reducing the avoidable visual impairment caused by uncorrected refractive error. The vision centre model is believed to be efficient as it facilitates early detection and management of uncorrected refractive errors.
Keywords: refractive error, uncorrected refractive error, vision center, vision technician, teleconsultation
Procedia PDF Downloads 142
265 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids
Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone
Abstract:
Load forecasting plays a key role in making today's and tomorrow's Smart Energy Grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, which enables several benefits such as higher sustainability, better quality of service, and affordable electricity tariffs. It is easy yet effective to apply load forecasting at the scale of Smart Micro Grids, wherein the lower available grid flexibility makes accurate prediction more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of Gain is introduced in this paper, which innovatively serves as an indicator of the consistency of short-term prediction capabilities over different time spans. Two models, for 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare these three techniques.
Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain
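A condensed sketch of such a comparison is given below: the same lagged-load and weather features are fed to a linear regression, a feed-forward neural network, and an RBF-kernel model (SVR standing in for a radial basis function network), scored by absolute forecast error and training time. The synthetic load profile and the feature choice are assumptions, not the GreenCom data.

```python
# A minimal sketch of the 24-hour-ahead comparison on synthetic household load;
# data, features and the RBF stand-in are assumptions for illustration.
import time
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR                      # RBF kernel as the RBF-network stand-in
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
hours = np.arange(24 * 90)                                   # ~3 months, hourly
load = 2.0 + np.sin(2 * np.pi * hours / 24) + 0.3 * rng.normal(size=hours.size)
temp = 15 + 8 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(size=hours.size)

lag = 24                                                      # 24-hour-ahead model
X = np.column_stack([load[:-lag], temp[:-lag], hours[:-lag] % 24])
y = load[lag:]
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

models = {
    "linear regression": LinearRegression(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "RBF (SVR)": SVR(kernel="rbf"),
}
for name, model in models.items():
    start = time.time()
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name:18s}  MAE={mae:.3f}  training={time.time() - start:.2f}s")
```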
Procedia PDF Downloads 470
264 The Application of a Neural Network in the Reworking of Accu-Chek to Wrist Bands to Monitor Blood Glucose in the Human Body
Authors: J. K Adedeji, O. H Olowomofe, C. O Alo, S.T Ijatuyi
Abstract:
The issue of a high blood sugar level, which might eventually result in diabetes mellitus, is now becoming a rampant cardiovascular disorder in our community. In recent times, a lack of awareness among most people makes this disease a silent killer. The situation calls for urgency, hence the need to design a device that serves as a monitoring tool, such as a wrist watch, to give an alert of the danger ahead of time to those living with high blood glucose, as well as to introduce a mechanism for checks and balances. The neural network architecture assumed an 8-15-10 configuration, with eight neurons at the input stage (including a bias), 15 neurons in the hidden layer at the processing stage, and 10 neurons at the output stage indicating likely symptom cases. The inputs are formed using the exclusive OR (XOR), with the expectation of getting an XOR output as the threshold value for diabetic symptom cases. The neural algorithm is coded in the Java language with 1000 epoch runs to bring the errors to the barest minimum. The internal circuitry of the device comprises the compatible hardware that matches the nature of each of the input neurons. Light-emitting diodes (LEDs) of red, green, and yellow colors are used as the output of the neural network to show pattern recognition for severe cases, pre-hypertensive cases, and normal cases without traces of diabetes mellitus. The research concluded that the neural network is a more efficient Accu-Chek design tool for the proper monitoring of high glucose levels than the conventional methods of carrying out blood tests.
Keywords: Accu-Chek, diabetes, neural network, pattern recognition
Procedia PDF Downloads 147
263 Cyber Security and Risk Assessment of the e-Banking Services
Authors: Aisha F. Bushager
Abstract:
Today we are more exposed than ever to cyber threats and attacks at the personal, community, organizational, national, and international levels. More aspects of our lives are operating on computer networks simply because we are living in the fifth domain, which is called cyberspace. One of the most sensitive areas that is vulnerable to cyber threats and attacks is Electronic Banking (e-Banking), where the banking sector provides online banking services to its clients. To obtain clients' trust and encourage them to practice e-Banking, and also to maintain the services provided by the banks and ensure safety, cyber security and risk control should be given high priority in the e-Banking area. The aim of the study is to carry out a risk assessment of e-Banking services and determine the cyber threats, cyber attacks, and vulnerabilities facing the e-Banking area, specifically in the Kingdom of Bahrain. To collect relevant data, structured interviews were conducted with e-Banking experts in different banks. The collected data were then used as input to the risk management framework provided by the National Institute of Standards and Technology (NIST), which was the model used in the study to assess the risks associated with e-Banking services. The findings of the study showed that the most common cyber threats are human errors, technical software or hardware failures, and hackers; on the other hand, the most common attacks facing the e-Banking sector were phishing, malware attacks, and denial-of-service. The risks associated with e-Banking services were around the moderate level; however, more controls and countermeasures must be applied to maintain this moderate level of risk. The results of the study will help banks discover their vulnerabilities and maintain their online services; in addition, they will enhance cyber security and contribute to the management and control of the risks facing the e-Banking sector.
Keywords: cyber security, e-banking, risk assessment, threats identification
Procedia PDF Downloads 350
262 Explaining the Role of Iran Health System in Polypharmacy among the Elderly
Authors: Mohsen Shati, Seyede Salehe Mortazavi, Seyed Kazem Malakouti, Hamidreza Khanke, Fazlollah Ahmadi
Abstract:
Taking unnecessary or excessive medication, or using drugs with no indication (polypharmacy), by people of all ages, especially the elderly, is associated with increased adverse drug reactions (ADR), medical errors, hospitalization, and escalating costs. It may be facilitated or impeded by the healthcare system. In this study, we describe the role of the health system in the practice of polypharmacy among the Iranian elderly. In this inductive qualitative content analysis using the Graneheim and Lundman method, purposeful sample selection was carried out until saturation. Participants were selected from doctors, pharmacists, policy-makers, and the elderly; a total of 25 persons (9 men and 16 women) participated in this study. Data analysis, after incorporating codes with similar characteristics, revealed 14 subcategories and six main categories: the referral system, physicians' accessibility, health data management, the drug market, law enforcement, and social protection. Some of the conditions of the healthcare system have given rise to polypharmacy in the elderly. In the absence of a comprehensive specialty and subspecialty referral system, patients may go to any physician's office and so may well be confused by numerous doctors' prescriptions. Electronic records not being prepared for patients, failure to comply with laws, and the lack of robust enforcement of the existing laws and of close surveillance are among the contributing factors. Inadequate insurance and supportive services are also evident. Age-specific care provision has not yet been institutionalized, while an inadequate specialist workforce plays a major role. Therefore, the health system cannot be ignored as a contributing factor when designing effective interventions to fix the problem.
Keywords: elderly, polypharmacy, health system, qualitative study
Procedia PDF Downloads 151