Search results for: measurement and empirical software engineering
9901 Reliability Prediction of Tires Using Linear Mixed-Effects Model
Authors: Myung Hwan Na, Ho-Chun Song, EunHee Hong
Abstract:
The normal linear mixed-effects model is widely used to analyse repeated-measurement data. When heteroscedasticity and non-normality of the population distribution are detected at the same time, the normal linear mixed-effects model can give improper analysis results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.
Keywords: reliability, tires, field data, linear mixed-effects model
Procedia PDF Downloads 564
9900 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data
Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora
Abstract:
Optimizing the drilling process for cost and efficiency requires optimization of the rate of penetration (ROP). ROP is the speed at which the wellbore is created, in units of feet per hour, and is the primary indicator of drilling efficiency. Maximizing the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events that lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model prior to drilling, and (2) real-time adjustment of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase, geological and historical drilling data are aggregated. Next, the top-rated wells, ranked by consistently high ROP, are identified. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase concludes by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value; this phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology. These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data are then consolidated into a heat-map as a function of ROP. A more optimal ROP performance is identified through the heat-map and amended in the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments resulted in improved ROP efficiency by over 20%, translating to at least 10% savings in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate against geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
Keywords: drilling optimization, geological formations, machine learning, rate of penetration
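As an illustration of the conditioned-mean step described in the abstract, the following is a minimal sketch of Inverse Distance Weighting applied to offset-well drilling parameters. The well data, the weighting exponent, and the function name are invented for the example; the abstract does not specify them.

```python
import numpy as np

def idw_parameters(offset_rop, offset_params, target_rop, power=2.0, eps=1e-9):
    """Estimate drilling parameters for a target ROP as an inverse-distance-
    weighted mean of offset-well parameters (WOB, RPM, GPM)."""
    offset_rop = np.asarray(offset_rop, dtype=float)        # ROP of offset wells (ft/h)
    offset_params = np.asarray(offset_params, dtype=float)  # rows: wells, cols: [WOB, RPM, GPM]
    # Distance in ROP space; eps avoids division by zero at exact matches.
    dist = np.abs(offset_rop - target_rop) + eps
    weights = 1.0 / dist ** power
    return weights @ offset_params / weights.sum()

# Hypothetical offset-well data: ROP (ft/h) and [WOB (klb), RPM, GPM].
rop = [85.0, 92.0, 110.0, 120.0]
params = [[22.0, 140.0, 650.0],
          [24.0, 150.0, 660.0],
          [28.0, 165.0, 700.0],
          [30.0, 170.0, 720.0]]
print(idw_parameters(rop, params, target_rop=100.0))
```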
Procedia PDF Downloads 131
9899 The Use of Computer-Aided Design in Small Contractors in a Local Area of Korea
Authors: Myunghoun Jang
Abstract:
A survey of small-size contractors in Jeju was conducted to investigate college graduates' computer-aided design (CAD) competence. Most small-size contractors use CAD software to review and update drawings submitted by an architect. This research analyzed the architectural engineering curricula of several national universities. The CAD classes run 4 or 6 hours per week and primarily use AutoCAD. This paper proposes that a CAD class needs 6 hours per week, that 2D drawing should be the main theme of the curriculum, and that exercises to make 3D models should also be included in the CAD class. An improved method of evaluating the reports and exercise results, for example an Internet cafe and real-time feedback using smartphones, is also necessary.
Keywords: CAD (Computer Aided Design), CAD education, education improvement, small-size contractor
Procedia PDF Downloads 267
9898 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements
Authors: Agnieszka A. Malinowska, R. Hejmanowski
Abstract:
A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis, mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large and small-scale ground displacements and ground ruptures. Ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructures. The complexity of the ground deformation phenomenon, in relation to the vulnerability of structures and infrastructures, places considerable constraints on assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods based on fuzzy logic and expert methods for assessing damage risk to buildings and infrastructure can be integrated into OS GIS. Those methods were verified based on back analysis, proving their accuracy, and can be further supported by ground displacement observation. Based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches are reinforced by an analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and enables the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in one of the urban areas threatened by underground mining activity. Vulnerability maps supported by satellite ground movement observation would mitigate the hazards of land displacement in urban areas close to mines.
Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)
Procedia PDF Downloads 159
9897 Rounded-off Measurements and Their Implication on Control Charts
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart: the traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
Keywords: inaccurate measurement, SPC, statistical process control, rounded-off, control chart
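For context, here is a minimal sketch of the classical Shewhart X̄-chart limits that the abstract starts from, applied once to exact data and once to rounded-off data. The sample data and rounding step are invented, and the asymmetric correction proposed in the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
true = rng.normal(loc=10.0, scale=0.05, size=(200, 5))  # 200 subgroups of n=5
rounded = np.round(true, 1)  # round-off to one decimal place

def shewhart_limits(subgroups):
    """Classical X-bar chart limits: grand mean +/- 3 sigma / sqrt(n)."""
    n = subgroups.shape[1]
    grand_mean = subgroups.mean()
    # Average within-subgroup standard deviation (c4 bias correction omitted).
    sigma = subgroups.std(axis=1, ddof=1).mean()
    half_width = 3.0 * sigma / np.sqrt(n)
    return grand_mean - half_width, grand_mean + half_width

print("limits from exact data:  ", shewhart_limits(true))
print("limits from rounded data:", shewhart_limits(rounded))
```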
Procedia PDF Downloads 40
9896 Clinical Impact of Ultra-Deep Versus Sanger Sequencing Detection of Minority Mutations on the HIV-1 Drug Resistance Genotype Interpretations after Virological Failure
Authors: S. Mohamed, D. Gonzalez, C. Sayada, P. Halfon
Abstract:
Drug resistance mutations are routinely detected using standard Sanger sequencing, which does not detect minor variants with a frequency below 20%. The impact of detecting minor variants generated by ultra-deep sequencing (UDS) on HIV drug-resistance (DR) interpretations has not yet been studied. Fifty HIV-1 patients who experienced virological failure were included in this retrospective study. The HIV-1 UDS protocol allowed the detection and quantification of HIV-1 protease and reverse transcriptase variants related to genotypes A, B, C, E, F, and G. DeepChek®-HIV simplified DR interpretation software was used to compare Sanger sequencing and UDS. The total time required for the UDS protocol was found to be approximately three times longer than Sanger sequencing with equivalent reagent costs. UDS detected all of the mutations found by population sequencing and identified additional resistance variants in all patients. An analysis of DR revealed a total of 643 and 224 clinically relevant mutations by UDS and Sanger sequencing, respectively. Three resistance mutations with > 20% prevalence were detected solely by UDS: A98S (23%), E138A (21%) and V179I (25%). A significant difference in the DR interpretations for 19 antiretroviral drugs was observed between the UDS and Sanger sequencing methods. Y181C and T215Y were the most frequent mutations associated with interpretation differences. A combination of UDS and DeepChek® software for the interpretation of DR results would help clinicians provide suitable treatments. A cut-off of 1% allowed a better characterisation of the viral population by identifying additional resistance mutations and improving the DR interpretation.
Keywords: HIV-1, ultra-deep sequencing, Sanger sequencing, drug resistance
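As a simple illustration of the detection-threshold difference between the two methods, the sketch below filters variant calls at a 20% (Sanger-like) versus 1% (UDS) frequency cut-off. The three high-prevalence mutations reuse the frequencies reported in the abstract; the minor-variant entries are invented.

```python
# Variant calls as (mutation, frequency in the viral population).
# A98S, E138A and V179I use frequencies from the abstract; the rest are invented.
variants = [("A98S", 0.23), ("E138A", 0.21), ("V179I", 0.25),
            ("Y181C", 0.04), ("T215Y", 0.015), ("K103N", 0.55)]

def detected(calls, cutoff):
    """Return mutations whose population frequency meets the cut-off."""
    return [m for m, f in calls if f >= cutoff]

print("Sanger-like (>= 20%):", detected(variants, 0.20))
print("UDS (>= 1%):         ", detected(variants, 0.01))
```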
Procedia PDF Downloads 335
9895 Modeling of International Financial Integration: A Multicriteria Decision
Authors: Zouari Ezzeddine, Tarchoun Monaem
Abstract:
Despite the multiplicity of advanced approaches, the concept of financial integration has not yet received an explicit analysis. Indeed, empirical studies show that existing measures of international financial integration are one-dimensional analyses. Given the ambivalence of the concept and its multiple determinants, it must be analyzed at a multidimensional level. The interest of this research is to propose decision support through a multicriteria approach for determining the positions of countries according to their international financial dependency links and the behavior of financial actors (governance decisions or international portfolio diversification strategies, ...).
Keywords: financial integration, decision support, behavior, multicriteria approach, governance and diversification
Procedia PDF Downloads 527
9894 Social Media and the Future of Veganism Influence on Gender Norms
Authors: Athena Johnson
Abstract:
Veganism has seen a rapid increase in adherents over recent years. Understanding the mechanisms of social change associated with these dietary practices in relation to gender is significant: these groups may seem small, but they have a large impact, as they influence many people and change the food market. The methodology of this research article is primarily a literature review combined with empirical research. The findings show that the popularity of veganism is growing, in large part due to the extensive use of social media, which dispels longstanding gendered connotations of food, such as the correlation between meat and masculinity.
Keywords: diversity, gender roles, social media, veganism
Procedia PDF Downloads 113
9893 Finite Element Modeling of Two-Phase Microstructure during Metal Cutting
Authors: Junior Nomani
Abstract:
This paper presents a novel approach to modelling the metal cutting of duplex stainless steels, two-phase alloys regarded as difficult-to-machine materials. Calculation and control of shear strains and stresses during cutting are essential to achieving ideal cutting conditions: values that are too low or too high lead to higher required cutting forces or excessive heat generation, causing premature tool wear and failure. A 2D finite element cutting model was created based on electron backscatter diffraction (EBSD) imagery of the duplex microstructure. A mesh was generated using the ‘object-oriented’ software OOF2 version V2.1.11, converting microstructural images to quadrilateral elements. A virtual workpiece was created in the ABAQUS modelling software, where a rigid tool advanced towards the workpiece, simulating chip formation and generating serrated-edge chips. The calculated stress and strain contour plots correlated well with similar finite element models of austenitic stainless steel alloys, and the virtual chip profile is similar to experimental frozen-machining chip samples. The model output provides a new description of the strain behaviour of the two-phase material as it transitions from the workpiece into the chip.
Keywords: duplex stainless steel, ABAQUS, OOF2, chip formation
Procedia PDF Downloads 100
9892 Stress Concentration Trend for Combined Loading Conditions
Authors: Aderet M. Pantierer, Shmuel Pantierer, Raphael Cordina, Yougashwar Budhoo
Abstract:
Stress concentration occurs where there is an abrupt change in the geometry of a mechanical part under loading. These changes in geometry can include holes, notches, or cracks within the component, and they create larger localized stresses within the part. This maximum stress is difficult to determine, as it occurs directly at the point of minimum area, and strain gauges capable of analyzing stresses over such minute areas have yet to be developed. Therefore, a stress concentration factor must be utilized. The stress concentration factor is a dimensionless parameter calculated solely from the geometry of a part. The factor is multiplied by the nominal, or average, stress of the component, which can be found analytically or experimentally. Stress concentration graphs exist for common loading conditions and geometrical configurations to aid in determining the maximum stress a part can withstand; these graphs were developed from historical experimental data. This project seeks to verify a stress concentration graph for combined loading conditions. The aforementioned graph was developed using CATIA finite element analysis software. The results of this analysis will be validated through further testing: the 3D modeled parts will be subjected to further finite element analysis using Patran-Nastran software, and the finite element models will then be verified by testing physical specimens on a tensile testing machine. Once the data are validated, the unique stress concentration graph will be submitted for publication so it can aid engineers in future projects.
Keywords: stress concentration, finite element analysis, finite element models, combined loading
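A minimal sketch of how a stress concentration factor is applied in practice; the Kt value and the loading numbers are placeholders, not data from the study.

```python
def max_stress(k_t, nominal_stress):
    """Maximum local stress = stress concentration factor x nominal stress."""
    return k_t * nominal_stress

# Hypothetical example: axial load over the net section of a plate with a hole.
force_n = 10_000.0     # applied axial force (N)
net_area_m2 = 2.0e-4   # net cross-sectional area at the hole (m^2)
sigma_nom = force_n / net_area_m2
print(max_stress(k_t=2.5, nominal_stress=sigma_nom), "Pa")
```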
Procedia PDF Downloads 444
9891 An Approach to Physical Performance Analysis for Judo
Authors: Stefano Frassinelli, Alessandro Niccolai, Riccardo E. Zich
Abstract:
Sport performance analysis is a technique that is becoming more important every year for athletes of every level. Many techniques have been developed to measure and analyse the performance of athletes efficiently in some sports, but in combat sports these techniques often reach their limits, due to the high interaction between the two opponents during the competition. In this paper the problem is framed, the physical performance measurement problem is analysed, and three different techniques to manage it are presented. All the techniques have been used to analyse the performance of 22 high-level judo athletes.
Keywords: sport performance, physical performance, judo, performance coefficients
Procedia PDF Downloads 413
9890 Evaluating Daylight Performance in an Office Environment in Malaysia, Using Venetian Blind System: Case Study
Authors: Fatemeh Deldarabdolmaleki, Mohamad Fakri Zaky Bin Ja'afar
Abstract:
Having a daylit space together with an outside view results in a pleasant and productive environment for office employees. A daylit space is a space that utilizes daylight as a basic source of illumination to fulfill users' visual demands and minimize electric energy consumption. Malaysian weather is hot and humid all year round because of the country's location in the equatorial belt. However, because most commercial buildings in Malaysia are air-conditioned, huge glass windows are normally installed in order to keep the physical and visual relation between inside and outside. As a result of the climatic situation and this trend, an ordinary office suffers large heat gains, glare, and discomfort for occupants. Balancing occupants' comfort and energy conservation in a tropical climate is a real challenge. This study concentrates on evaluating a venetian blind system using per-pixel analysis tools based on the cut-out metrics suggested in the literature. The workplace area in a private office room was selected as a case study. An eight-day measurement experiment was conducted to investigate the effect of different venetian blind angles in an office area under daylight conditions in Serdang, Malaysia. The study goal was to explore the daylight comfort of a commercially available venetian blind system, its daylight sufficiency and excess (8:00 AM to 5:00 PM), as well as glare. Recently developed software for analyzing High Dynamic Range Images (HDRI captured by a CCD camera), such as the Radiance-based Evalglare and hdrscope, helps to investigate luminance-based metrics. The main key factors are illuminance and luminance levels, mean and maximum luminance, daylight glare probability (DGP), and the luminance ratio of the selected mask regions. The findings show that in most cases the morning session needs artificial lighting in order to achieve daylight comfort. However, in some conditions (e.g. 10° and 40° slat angles) in the second half of the day the workplane illuminance level exceeds the maximum of 2000 lx. Generally, a rising trend is discovered in mean window luminance, with the most unpleasant cases occurring after 2 P.M.; considering the luminance criteria rating, the uncomfortable conditions occur in the afternoon session. Surprisingly, in the no-blind condition, extreme window/task ratios are not common. Studying the daylight glare probability, no DGP value higher than 0.35 was found in this experiment.
Keywords: daylighting, energy simulation, office environment, Venetian blind
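A small sketch of the kind of threshold screening the abstract describes, checking per-timestep metrics against the two limits it names (workplane illuminance 2000 lx, DGP 0.35). The hourly records are invented; real values would come from HDRI analysis with tools such as Evalglare.

```python
# Hypothetical hourly records: (hour, workplane illuminance in lx, DGP).
records = [(9, 350.0, 0.18), (11, 900.0, 0.24),
           (13, 1800.0, 0.31), (15, 2300.0, 0.34)]

WPI_MAX_LX = 2000.0  # workplane illuminance ceiling cited in the abstract
DGP_MAX = 0.35       # daylight glare probability threshold

for hour, wpi, dgp in records:
    flags = []
    if wpi > WPI_MAX_LX:
        flags.append("excess daylight")
    if dgp >= DGP_MAX:
        flags.append("glare risk")
    print(f"{hour:02d}:00  {wpi:7.1f} lx  DGP={dgp:.2f}  {', '.join(flags) or 'ok'}")
```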
Procedia PDF Downloads 258
9889 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game
Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin
Abstract:
Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it is the most frequently used method in our schools. Although such assessments provide acceptable measurement, they are not capable of measuring all the aspects and the richness of learning and knowledge. Also, many assessments used in schools decontextualize the assessment from the learning, and they focus on learners' standing on a particular topic but do not capture how student learning changes over time. For these reasons, many scholars advocate that using simulations and games (S&G) as assessment tools has significant potential to overcome the problems of traditionally used methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method to test students' learning at a particular time point. To investigate the potential of educational games as an assessment and teaching tool, this study presents the implementation and validation of an automated embedded assessment (AEA), which can constantly monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate-level engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, this study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment can measure the targeted latent constructs. Finally, the scores of the assessment were compared with an external measure (a validated test measuring student learning on digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly onto the latent factors in the latent factor model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students' performance in the game. The regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .001). Students' posttest scores significantly predicted game performance (β = .60, p < .001). The statistical results of the analyses show that the AEA can distinctly measure three major components of the digital circuit design course. It is hoped that this study can help researchers understand how to design an AEA and showcase an implementation by providing an example methodology to validate this type of assessment.
Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design
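A minimal sketch of the form of the validation regression described above: ordinary least squares with an external test score predicting in-game performance. The data are synthetic and the second predictor is assumed for illustration, so the coefficients and R² will not match the study's numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 99                                 # number of participants in the study
posttest = rng.normal(70.0, 10.0, n)   # external measure (validated test)
pretest = rng.normal(60.0, 10.0, n)    # second predictor, assumed here
game = 0.6 * posttest + 0.1 * pretest + rng.normal(0.0, 8.0, n)

X = np.column_stack([np.ones(n), posttest, pretest])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, game, rcond=None)       # OLS estimates
pred = X @ beta
r2 = 1.0 - ((game - pred) ** 2).sum() / ((game - game.mean()) ** 2).sum()
print("coefficients:", beta, " R^2:", round(r2, 3))
```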
Procedia PDF Downloads 421
9888 Non-Invasive Assessment of Peripheral Arterial Disease: Automated Ankle Brachial Index Measurement and Pulse Volume Analysis Compared to Ultrasound Duplex Scan
Authors: Jane E. A. Lewis, Paul Williams, Jane H. Davies
Abstract:
Introduction: There is, at present, a clear and recognized need to optimize the diagnosis of peripheral arterial disease (PAD), particularly in non-specialist settings such as primary care, and this arises from several key facts. Firstly, PAD is a highly prevalent condition: in 2010, it was estimated that PAD affected more than 202 million people globally, and this prevalence is predicted to escalate further. The disease itself, although frequently asymptomatic, can cause considerable patient suffering, with symptoms such as lower limb pain, ulceration, and gangrene which, in worst-case scenarios, can necessitate limb amputation. A further, and perhaps the most significant, consequence of PAD arises from the fact that it is a manifestation of systemic atherosclerosis and is therefore a powerful predictor of coronary heart disease and cerebrovascular disease. Objective: This cross-sectional study aimed to individually and cumulatively compare the sensitivity and specificity of (i) the ankle brachial index (ABI) and (ii) the pulse volume waveform (PVW), recorded by the same automated device, with the presence or absence of PAD verified by ultrasound duplex scan (UDS). Methods: Patients (n = 205) referred for lower limb arterial assessment underwent ABI and PVW measurement using volume plethysmography, followed by UDS. Presence of PAD was recorded if the ABI was < 0.9 (and noted if > 1.30), if the PVW was graded as 2, 3 or 4, or if UDS showed a hemodynamically significant stenosis > 50%. The outcome measure was agreement between the measured ABI and the interpretation of the PVW for PAD diagnosis, using UDS as the reference standard. Results: The sensitivity of ABI was 80%, specificity 91%, and overall accuracy 88%. Cohen's kappa revealed good agreement between ABI and UDS (k = 0.7, p < .001). PVW sensitivity was 97%, specificity 81%, and overall accuracy 84%, with a good level of agreement between PVW and UDS (k = 0.67, p < .001). The combined sensitivity of ABI and PVW was 100%, specificity 76%, and overall accuracy 85% (k = 0.67, p < .001). Conclusions: Combining these two diagnostic modalities within one device provided a highly accurate method of ruling out PAD. Such a device could be utilized within the primary care environment to reduce the number of unnecessary referrals to secondary care, with concomitant cost savings, reduced patient inconvenience, and prioritization of urgent PAD cases.
Keywords: ankle brachial index, peripheral arterial disease, pulse volume waveform, ultrasound duplex scan
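A small sketch of the ABI computation and the classification cut-offs used in the study (< 0.9 indicates PAD; > 1.30 is flagged). The pressure readings are invented, and the note about calcified vessels is general background, not a claim from the abstract.

```python
def ankle_brachial_index(ankle_systolic, brachial_systolic):
    """ABI = ankle systolic pressure / brachial systolic pressure."""
    return ankle_systolic / brachial_systolic

def classify(abi):
    # Cut-offs used in the study: < 0.9 indicates PAD; > 1.30 is flagged
    # (an ABI that high typically suggests incompressible, calcified vessels).
    if abi < 0.9:
        return "PAD"
    if abi > 1.30:
        return "flagged (> 1.30)"
    return "normal range"

for ankle, arm in [(95.0, 130.0), (150.0, 110.0), (125.0, 120.0)]:  # mmHg, invented
    abi = ankle_brachial_index(ankle, arm)
    print(f"ABI = {abi:.2f} -> {classify(abi)}")
```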
Procedia PDF Downloads 166
9887 Harnessing the Power of Loss: On the Discriminatory Dynamic of Non-Emancipatory Organization Identity
Authors: Rickard Grassman
Abstract:
In this paper, Lacanian theory will be used to illustrate the way discourses interact with the material by way of reifying antagonisms to shape our sense of identities in and around organizations. The ability to ‘sustain the loss’ is, in this view, the common structure here discerned in the very texture of a discourse, which reifies ‘lack’ as an ontological condition into something contingently absent (loss) that the subject hopes to overcome (desire). These fundamental human tendencies of identification are illustrated in the paper by examples drawn from history, cinema, and literature. Turning to a select sample of empirical accounts from a management consultancy firm, it is argued that this ‘sustaining the loss’ operates in discourse to enact identification in an organizational context.
Keywords: Lacan, identification, discourse, desire, loss
Procedia PDF Downloads 96
9886 Developing a Driving Simulator with a Navigation System to Measure Driver Distraction, Workload, Driving Safety and Performance
Authors: Tamer E. Yared
Abstract:
The use of driving simulators has made laboratory testing easier, and many researchers have shown it to be valid for testing driving ability. One benefit of using driving simulators is keeping human subjects away from the traffic hazards that drivers usually face in a real driving environment while performing a driving experiment. In this study, a driving simulator with a navigation system was developed using game development software (Unity 3D) and C# code to measure and evaluate driving performance, safety, and workload for different driving tasks. The driving simulator hardware included a gaming steering wheel and pedals as well as a monitor to view the driving tasks. Moreover, driver distraction was evaluated by utilizing an eye-tracking system working in conjunction with the driving simulator. Twenty subjects were recruited to evaluate driver distraction, workload, driving safety, and performance, as well as to provide feedback about the driving simulator. The subjects' feedback was obtained through a survey completed after conducting several driving tasks. The main question of that survey asked the subjects to compare driving on the simulator with real driving; other aspects of the driving simulator were also evaluated in the survey. The survey revealed that the recruited subjects gave the driving simulator an average score of 7.5 out of 10 when compared to real driving, with scores ranging between 6 and 8.5. This study is a preliminary effort that opens the door for further improvements to the driving simulator in terms of hardware and software development, which will contribute significantly to driving ability testing.
Keywords: driver distraction, driving performance, driving safety, driving simulator, driving workload, navigation system
Procedia PDF Downloads 178
9885 Relationship between Right Brain and Left Brain Dominance and Intonation Learning
Authors: Mohammad Hadi Mahmoodi, Soroor Zekrati
Abstract:
The aim of this study was to investigate the relationship between hemispheric dominance and intonation learning among Iranian EFL students. To this end, 52 female students from three levels (beginner, elementary, and intermediate) at the Paradise Institute, and 18 male university students at Bu-Ali Sina University, constituted the sample. To help students learn the correct way of applying intonation to their everyday speech, the study proposed an interactive approach and provided students with a visual aid through which they were able to see the intonation pattern on a computer screen using the 'Speech Analyzer' software. This software was also used to record the subjects' voices and compare them with the original intonation pattern. The Edinburgh Handedness Questionnaire (EHD), which ranges from -100 for strong left-handedness to +100 for strong right-handedness, was used to indicate the hemispheric dominance of each student. The result of an independent-samples t-test indicated that girls learned intonation patterns better than boys, and that right-brained students significantly outperformed left-brained ones. Using one-way ANOVA, a significant difference between the three proficiency levels was also found. The post-hoc Scheffé test showed that the difference lay between the intermediate and elementary levels and between the intermediate and beginner levels, while no significant difference was observed between the elementary and beginner levels. The findings of the study might provide researchers with some helpful implications and useful directions for future investigation into the relationship between mind and second language learning.
Keywords: intonation, hemispheric dominance, visual aid, language learning, second language learning
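For reference, the Edinburgh Handedness score the abstract mentions is conventionally computed as a laterality quotient, LQ = 100·(R − L)/(R + L), which maps responses onto the −100 to +100 range. A minimal sketch follows; the item responses are invented, and the scoring convention is the standard published one, not a detail from this study.

```python
def laterality_quotient(right_counts, left_counts):
    """Edinburgh Handedness score: 100 * (R - L) / (R + L), from -100 to +100."""
    r, l = sum(right_counts), sum(left_counts)
    return 100.0 * (r - l) / (r + l)

# Hypothetical responses over the 10 questionnaire items
# (right-hand vs left-hand preference ticks per item).
right = [2, 2, 1, 2, 2, 1, 2, 2, 2, 1]
left = [0, 0, 1, 0, 0, 1, 0, 0, 0, 1]
print(laterality_quotient(right, left))  # positive -> right-hand dominant
```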
Procedia PDF Downloads 519
9884 Measuring the Cavitation Cloud by Electrical Impedance Tomography
Authors: Michal Malik, Jiri Primas, Darina Jasikova, Michal Kotek, Vaclav Kopecky
Abstract:
This paper is a case study dealing with the viability of using electrical impedance tomography for measuring cavitation clouds in a pipe setup. The authors used a simple passive cavitation generator to create a cavitation cloud, which was then recorded for multiple flow rates using electrodes in two measuring planes. The paper presents the results of the experiment, showing that the industrial-grade tomography system ITS p2+ used is able to measure the cavitation cloud and may be particularly useful for identifying the inception of cavitation in setups where other measuring tools may not be viable.
Keywords: cavitation cloud, conductivity measurement, electrical impedance tomography, mechanically induced cavitation
Procedia PDF Downloads 248
9883 Energy and Exergy Performance Optimization on a Real Gas Turbine Power Plant
Authors: Farhat Hajer, Khir Tahar, Cherni Rafik, Dakhli Radhouen, Ammar Ben Brahim
Abstract:
This paper presents the energy and exergy performance optimization of a real 100 MW gas turbine power plant installed in the south-east of Tunisia. A simulation code is established using the EES (Engineering Equation Solver) software. The parameters considered are those of the actual operating conditions of the gas turbine thermal power station under study. The results show that thermal and exergetic efficiency decrease with increasing ambient temperature, and that excess air has an important effect on the thermal efficiency. The emission of NOx rises in the summer and decreases in the winter. The obtained NOx rates are compared with measurement results.
Keywords: efficiency, exergy, gas turbine, temperature
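A minimal sketch of the first-law and second-law efficiency definitions such a model evaluates. The operating point is invented, and the fuel exergy-to-LHV factor for natural gas is an assumed typical value, not a figure from the paper.

```python
def thermal_efficiency(net_power_mw, fuel_flow_kg_s, lhv_mj_kg):
    """First-law (thermal) efficiency: net power / fuel energy input."""
    return net_power_mw / (fuel_flow_kg_s * lhv_mj_kg)

def exergetic_efficiency(net_power_mw, fuel_flow_kg_s, lhv_mj_kg, phi=1.04):
    """Second-law efficiency; phi ~ 1.04 relates fuel chemical exergy to LHV
    for natural gas (an assumed, typical value)."""
    return net_power_mw / (fuel_flow_kg_s * lhv_mj_kg * phi)

# Hypothetical operating point for a 100 MW unit: 6 kg/s of fuel at 50 MJ/kg.
print(thermal_efficiency(100.0, 6.0, 50.0))    # ~0.33
print(exergetic_efficiency(100.0, 6.0, 50.0))  # slightly lower
```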
Procedia PDF Downloads 284
9882 Knowledge Based Software Model for the Management and Treatment of Malaria Patients: A Case of Kalisizo General Hospital
Authors: Mbonigaba Swale
Abstract:
Malaria is an infection or disease caused by parasites (Plasmodium falciparum, which causes severe malaria, Plasmodium vivax, Plasmodium ovale, and Plasmodium malariae) transmitted to humans by the bites of infected female Anopheles mosquitoes. In Africa, and particularly in Uganda, these vectors comprise two main types, Anopheles funestus and Anopheles gambiae (for example, Anopheles arabiensis); they feed on humans inside the house mainly at dusk, midnight, and dawn, and rest indoors, which makes them effective transmitters (vectors) of the disease. People in both urban and rural areas have consistently become prone to repetitive attacks of malaria, causing many deaths and significantly increasing the poverty levels of the rural poor. Malaria is a national problem; it causes many maternal pre-natal and antenatal disorders, anemia in pregnant mothers, low birth weights for the newly born, and convulsions and epilepsy among infants. Cumulatively, it kills about one million children every year in sub-Saharan Africa. It has been estimated to account for 25-35% of all outpatient visits, 20-45% of acute hospital admissions, and 15-35% of hospital deaths. Uganda is the leading victim country, within which the Rakai and Masaka districts are the most affected. It is not clear whether these abhorrent situations, with episodes of recurrence and failure to cure the disease, result from poor diagnosis, prescription and dosing, the treatment habits and compliance of the patients, or the ethical domain of the stakeholders in relation to the mainstream methodology of malaria management. The research is aimed at offering an alternative approach to manage and deal decisively with the problem by using a knowledge-based software model of Artificial Intelligence (AI) that is capable of performing common-sense and cognitive reasoning, taking decisions as the human brain would, and providing instantaneous expert solutions, thereby avoiding speculative simulation of the problem during differential diagnosis. This system will assist physicians in many kinds of medical diagnosis, in prescribing treatments and doses, and in monitoring patient responses; based on the body weight and age group of the patient, it will be able to provide instantaneous and timely information, options, and alternative approaches to influence decision-making during case analysis. The computerized system approach, a new model in Uganda termed 'Software Aided Treatment' (SAT), will try to change the moral and ethical approach and influence conduct so as to improve the skills, experience, and values (social and ethical) in the administration and management of the disease and drugs (combination therapy and generics) by both the patient and the health worker.
Keywords: knowledge based software, management, treatment, diagnosis
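A toy sketch of the kind of rule-based inference such a knowledge-based system applies, selecting a dose band from body weight and age group as the abstract describes. The rules, weight bands, and labels here are entirely hypothetical placeholders for illustration; they are not clinical guidance and not the SAT system's actual knowledge base.

```python
# Entirely hypothetical rule base: (min_weight_kg, max_weight_kg, dose_label).
# Illustrative placeholders only -- NOT clinical dosing guidance.
DOSE_RULES = [(0.0, 15.0, "band A"),
              (15.0, 25.0, "band B"),
              (25.0, 35.0, "band C"),
              (35.0, float("inf"), "band D")]

def recommend_dose(weight_kg, age_years):
    """Pick a dose band by weight; flag infants for specialist review."""
    if age_years < 1:
        return "refer to specialist (infant)"
    for lo, hi, label in DOSE_RULES:
        if lo <= weight_kg < hi:
            return label
    return "no rule matched"

print(recommend_dose(weight_kg=22.0, age_years=7))   # band B
print(recommend_dose(weight_kg=60.0, age_years=30))  # band D
```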
Procedia PDF Downloads 57
9881 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements
Authors: Shagufta Tabassum
Abstract:
The study of dielectric properties in a binary mixture of liquids is very useful for understanding the liquid structure, molecular interactions, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperative and molecular dynamics of H-bonded systems. In this paper, we discuss the basic calibration and normalization procedures for time-domain reflectometry measurements. Our approach is to explain the different types of errors that occur during TDR measurements and how these errors can be eliminated or minimized.
Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique
Procedia PDF Downloads 206
9880 A Method and System for Secure Authentication Using One Time QR Code
Authors: Divyans Mahansaria
Abstract:
User authentication is an important security measure for protecting confidential data and systems. However, vulnerabilities in the authentication process have significantly increased, so appropriate mechanisms must be deployed while authenticating a user to safeguard him/her from attack. The proposed solution implements a novel authentication mechanism to counter various forms of security breach, including phishing, Trojan horses, replay, key logging, Asterisk logging, shoulder surfing, brute force search, and others. A QR code (Quick Response Code) is a type of matrix barcode, or two-dimensional barcode, that can be used for storing URLs, text, images, and other information. In the proposed solution, during each new authentication request, a QR code is dynamically generated and presented to the user. A piece of generic information is mapped to a plurality of elements and stored within the QR code. The mapping of the generic information to the plurality of elements is randomized at each new login, and thus the QR code generated for each new authentication request is for one-time use only. In order to authenticate into the system, the user needs to decode the QR code using any QR code decoding software, which needs to be installed on a handheld mobile device such as a smartphone or personal digital assistant (PDA). On decoding the QR code, the user is presented with a mapping between the generic piece of information and the plurality of elements, using which the user derives cipher secret information corresponding to his/her actual password. In place of the actual password, the user then uses this cipher secret information to authenticate into the system. The authentication terminal receives the cipher secret information and uses a validation engine to decipher it; if the entered secret information is correct, the user is granted access to the system. A usability study was carried out on the proposed solution, and the new authentication mechanism was found to be easy to learn and adapt to. The time needed to carry out a brute force attack on the proposed solution was analyzed mathematically, and the result showed that the solution is almost completely resistant to brute force attack. Today's standard methods for authentication are subject to a wide variety of software, hardware, and human attacks. The proposed scheme can be very useful in controlling the various types of authentication-related attacks, especially in a networked computer environment where the use of username and password for authentication is common.
Keywords: authentication, QR code, cipher / decipher text, one time password, secret information
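A minimal sketch of the one-time-mapping idea described above: a fresh random mapping between a generic alphabet and display elements is generated per login and could then be rendered as a QR code. The payload format is invented, and the rendering step assumes the third-party `qrcode` package is available; neither detail comes from the paper.

```python
import json
import secrets
import string

def one_time_mapping(alphabet=string.digits):
    """Randomly map each generic symbol to a display element; new per login."""
    elements = list(alphabet)
    shuffled = elements[:]
    # secrets-based Fisher-Yates shuffle for cryptographic-quality randomness.
    for i in range(len(shuffled) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]
    return dict(zip(elements, shuffled))

mapping = one_time_mapping()
payload = json.dumps(mapping)  # this string would be embedded in the QR code
print(payload)

# Rendering (assumes the third-party 'qrcode' package is installed):
# import qrcode
# qrcode.make(payload).save("login_challenge.png")
```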
Procedia PDF Downloads 268
9879 Algae Biofertilizers Promote Sustainable Food Production and Nutrient Efficiency: An Integrated Empirical-Modeling Study
Authors: Zeenat Rupawalla, Nicole Robinson, Susanne Schmidt, Sijie Li, Selina Carruthers, Elodie Buisset, John Roles, Ben Hankamer, Juliane Wolf
Abstract:
Agriculture has radically changed the global biogeochemical cycle of nitrogen (N). Fossil fuel-enabled synthetic N-fertiliser is a foundation of modern agriculture, but, applied to soil, crops only use about half of it. To address N-pollution from cropping and the large carbon and energy footprint of N-fertiliser synthesis, new technologies delivering enhanced energy efficiency, decarbonisation, and a circular nutrient economy are needed. We characterised algae fertiliser (AF) as an alternative to synthetic N-fertiliser (SF) using empirical and modelling approaches. We cultivated microalgae in nutrient solution and modelled up-scaled production in nutrient-rich wastewater. Over four weeks, AF released 63.5% of N as ammonium and nitrate, and 25% of phosphorous (P) as phosphate to the growth substrate, while SF released 100% N and 20% P. To maximise crop N-use and minimise N-leaching, we explored AF and SF dose-response curves with spinach in glasshouse conditions. AF-grown spinach produced 36% less biomass than SF-grown plants due to AF's slower and linear N-release, while SF resulted in 5-times higher N-leaching loss than AF. Optimised blends of AF and SF boosted crop yield and minimised N-loss due to greater synchrony of N-release and crop uptake. Additional benefits of AF included greener leaves, lower leaf nitrate concentration, and higher microbial diversity and water holding capacity in the growth substrate. Life-cycle-analysis showed that replacing the most effective SF dosage with AF lowered the carbon footprint of fertiliser production from 2.02 g CO₂ (C-producing) to -4.62 g CO₂ (C-sequestering), with a further 12% reduction when AF is produced on wastewater. Embodied energy was lowest for AF-SF blends and could be reduced by 32% when cultivating algae on wastewater. We conclude that (i) microalgae offer a sustainable alternative to synthetic N-fertiliser in spinach production and potentially other crop systems, and (ii) microalgae biofertilisers support the circular nutrient economy and several sustainable development goals.
Keywords: bioeconomy, decarbonisation, energy footprint, microalgae
Procedia PDF Downloads 137
9878 Umkhonto Wesizwe as the Foundation of Post-Apartheid South Africa’s Foreign Policy and International Relations.
Authors: Bheki R. Mngomezulu
Abstract:
The present paper cogently and systematically traces the history of Umkhonto Wesizwe (MK) and identifies its important role in shaping South Africa's post-apartheid foreign policy and international relations under black leadership. It provides the political and historical contexts within which we can interpret and better understand South Africa's controversial 'Quiet Diplomacy' approach to Zimbabwe's endemic political and economic crises, which have dragged on for too long. On 16 December 1961, the African National Congress (ANC) officially launched MK as its military wing, with the main aim of training liberation fighters outside South Africa who would return to the country to topple the apartheid regime. Subsequently, the ANC established links with various countries across Africa and the globe in order to solicit arms, financial resources, and military training for its MK recruits. Drawing from archival research and empirical data obtained through oral interviews conducted with some former MK cadres, this paper demonstrates how the ANC forged relations with a number of like-minded countries in order to ensure that its dream of removing the apartheid government became a reality. The findings reveal that South Africa's foreign policy posture and international relations after the demise of apartheid in 1994 built on these relations. As such, even former and current socialist countries that were frowned upon by the Western world became post-apartheid South Africa's international partners, including Cuba and China, among others. Even countries that were not recognized by the Western world as independent states, such as Palestine, received good reception in post-apartheid South Africa's foreign policy agenda. Within Africa, countries with questionable human rights records, such as Nigeria and Zimbabwe, were accommodated in South Africa's foreign policy agenda after 1994. Drawing from this history, the paper concludes that it would be difficult to fully understand and appreciate South Africa's foreign policy direction and international relations after 1994 without bringing the history and politics of MK into the equation. Therefore, the paper proposes that the utilitarian role of history should never be undermined in the analysis of a country's foreign policy direction and international relations; Umkhonto Wesizwe and South Africa are used as examples to demonstrate how such a link can be drawn through archival and empirical evidence.
Keywords: African National Congress, apartheid, foreign policy, international relations
Procedia PDF Downloads 185
9877 Barriers and Opportunities for Implementing Electronic Prescription Software in Public Libyan Hospitals
Authors: Abdelbaset M. Elghriani, Abdelsalam M. Maatuk, Isam Denna, Amira Abdulla Werfalli
Abstract:
Electronic prescription (e-prescribing) software benefits patients and physicians by preventing handwriting errors and producing accurate prescriptions: prescriptions are written and sent to pharmacies electronically instead of using handwritten notes. Significant factors identified as barriers to the adoption of e-prescription systems include a lack of technical support, insufficient financial resources to operate the systems, and resistance to change from some clinicians. This study aims to explore the attitudes and opinions of physicians and pharmacists about e-prescriptions and to identify the obstacles to and benefits of applying e-prescriptions in the health care system. A cross-sectional descriptive study was conducted at three Libyan public hospitals. Data were collected through a self-constructed questionnaire assessing opinions on potential constraining factors and benefits of implementing an e-prescribing system in hospitals, and are presented as means, frequency distribution tables, cross-tabulations, and bar charts. The analysis shows that technical, financial, and organizational obstacles are the most important barriers preventing the application of e-prescribing systems in Libyan hospitals. In addition, there was awareness of the benefits of e-prescribing, especially reducing medication dispensing errors, and a desire among physicians and pharmacists to use electronic prescriptions.
Keywords: physicians, e-prescribing, health care system, pharmacists
Procedia PDF Downloads 126
9876 A Computer-Aided System for Tooth Shade Matching
Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan
Abstract:
Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was implemented through dentists' visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices; nowadays, colorimeters, spectrophotometers, spectroradiometers, and digital image analysis systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively, and precisely. However, these devices have noticeable drawbacks: for example, the translucent structure and irregular surfaces of teeth lead to measurement defects, and results acquired by devices with different measurement principles may be inconsistent. It is therefore necessary to search for new methods for the dental shade matching process. The digital camera, a computer-aided device, has developed rapidly; advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging, a procedure much cheaper than the use of traditional contact-type color measurement devices. Digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages. Images taken of teeth show their morphology and color texture. In recent decades, a method was recommended to compare the color of shade tabs captured by a digital camera using color features; it showed that visual and computer-aided shade matching systems should be used in combination. Most recently used feature extraction techniques are based on shape description and do not use color information. However, color is mostly experienced as an essential property in depicting and extracting features from the objects in the world around us. When local feature descriptors are extended by concatenating a color descriptor with the shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. Because the color descriptor is used in combination with a shape descriptor, it does not need to contain any spatial information, which leads us to use local histograms. This local color histogram method remains reliable under photometric changes, geometrical changes, and variations in image quality. Thus, color-based local feature extraction methods are used to extract features, and the Scale Invariant Feature Transform (SIFT) descriptor is used for shape description in the proposed method. After the combination of these descriptors, the state-of-the-art descriptor named Color-SIFT is used in this study. Finally, the image feature vectors obtained from the quantization algorithm are fed to classifiers such as k-Nearest Neighbor (KNN), Naive Bayes, or Support Vector Machines (SVM) to determine the label(s) of the visual object category or matching; in this study, SVMs are used as classifiers for color determination and shade matching. The experimental results of this method will be compared with other recent studies. It is concluded from the study that the proposed method is a remarkable development in computer-aided tooth shade determination systems.
Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction
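A condensed sketch of the classification stage described above: normalized local color histograms as features and an SVM as the classifier (scikit-learn assumed available). Real Color-SIFT extraction and the image data are omitted; the feature vectors below are synthetic stand-ins, and the shade labels are invented.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

def synthetic_histograms(n, n_bins=48, shift=0.0):
    """Stand-in for local color histogram features of one shade class."""
    h = rng.random((n, n_bins)) + shift
    return h / h.sum(axis=1, keepdims=True)  # normalize each histogram

# Two hypothetical shade-tab classes (e.g. 'A2' vs 'B1').
X = np.vstack([synthetic_histograms(50), synthetic_histograms(50, shift=0.3)])
y = np.array([0] * 50 + [1] * 50)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```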
Procedia PDF Downloads 444
9875 Evaluating the Small-Strain Mechanical Properties of Cement-Treated Clayey Soils Based on the Confining Pressure
Authors: Muhammad Akmal Putera, Noriyuki Yasufuku, Adel Alowaisy, Ahmad Rifai
Abstract:
Indonesia's government has planned a high-speed railway project connecting the capital, Jakarta, with Surabaya, a distance of about 700 km, with construction planned above a lowland soil region. The lowland soil region comprises cohesive soil with high water content and a high compressibility index, which leads to settlement problems. Among the variety of railway track structures, the ballastless track has been adopted effectively to reduce settlement; it provides a lightweight structure and minimizes workspace. However, deploying this thin-layer structure above the lowland area brings several problems, such as a lack of bearing capacity and deflection behavior during traffic loading, so it is necessary to combine it with ground improvement to assure acceptable settlement behavior in the clayey soil. Considering the assurance of strength increment and the working period, cement-treated soil was adopted as the substructure of the railway track. Evaluating mechanical properties in the field is well established through the plate load test and the cone penetration test; however, observing the increment of mechanical properties this way carries uncertainty, especially for evaluating cement-treated soil in the substructure. The current quality control of cement-treated soils is established by laboratory tests, and small-strain measurement devices in the laboratory can produce more reliable results that closely match field measurements. The aims of this research are to show the intercorrelation of confining pressure with the initial Young's modulus (E₀), Poisson's ratio (ν₀), and shear modulus (G₀) within small-strain ranges, and to investigate the discrepancies between those parameters. The experimental results confirmed the intercorrelation between cement content and confining pressure through a power function; in addition, higher cement ratios show discrepancies, in contrast with low mixing ratios.
Keywords: amount of cement, elastic zone, high-speed railway, lightweight structure
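Two relations implicit in the abstract, sketched numerically: the isotropic small-strain identity G₀ = E₀ / (2(1 + ν₀)), and a power-law fit G₀ = A·pᵇ of modulus against confining pressure. The data points and moduli below are invented for illustration, not the study's measurements.

```python
import numpy as np

def shear_modulus(E0, nu0):
    """Isotropic elasticity: G0 = E0 / (2 * (1 + nu0))."""
    return E0 / (2.0 * (1.0 + nu0))

print(shear_modulus(E0=600.0, nu0=0.25), "MPa")  # hypothetical values

# Power-law fit G0 = A * p^b via least squares in log-log space.
p = np.array([50.0, 100.0, 200.0, 400.0])    # confining pressure (kPa), invented
G0 = np.array([120.0, 170.0, 240.0, 335.0])  # measured shear modulus (MPa), invented
b, log_A = np.polyfit(np.log(p), np.log(G0), 1)
print(f"G0 ~= {np.exp(log_A):.1f} * p^{b:.2f}")
```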
Procedia PDF Downloads 141
9874 Numerical Analysis of NOₓ Emission in Staged Combustion for the Optimization of Once-Through-Steam-Generators
Authors: Adrien Chatel, Ehsan Askari Mahvelati, Laurent Fitschy
Abstract:
Once-Through-Steam-Generators are commonly used in the oil-sand industry in the heavy fuel oil extraction process. They are composed of three main parts: the burner and the radiant and convective sections. Natural gas is burned through staged diffusive flames stabilized by the burner, and the heat generated by the combustion is transferred to the water flowing through the piping system in the radiant and convective sections. The steam produced within the pipes is then directed to the ground to reduce the oil viscosity and allow its pumping. With the rapid development of the oil-sand industry, the number of OTSGs in operation has increased, as have the associated emissions of environmental pollutants, especially nitrogen oxides (NOₓ). To limit environmental degradation, various international environmental agencies have established regulations on pollutant discharge and pushed to reduce NOₓ release. To meet these constraints, OTSG constructors have to rely on ever more advanced tools to study and predict NOₓ emission. With the increase in computational resources, Computational Fluid Dynamics (CFD) has emerged as a flexible tool to analyze the combustion and pollutant formation processes. Moreover, to optimize the burner operating condition with regard to NOₓ emission, field characterization and measurements are usually carried out. However, such experimental campaigns are particularly time-consuming and sometimes even impossible for industrial plants with strict operating schedule constraints. Therefore, the application of CFD seems more adequate for providing guidelines on the NOₓ emission and reduction problem. In the present work, two different software packages are employed to simulate the combustion process in an OTSG: the commercial software ANSYS Fluent and the open source software OpenFOAM. RANS (Reynolds-Averaged Navier-Stokes) equations, combined with the Eddy Dissipation Concept to model the combustion and closed by the k-epsilon turbulence model, are solved. A mesh sensitivity analysis is performed to assess the independence of the solution from the mesh. In the first part, the results given by the two software packages are compared and confronted with experimental data as a means of assessing the numerical modelling; flame temperatures and chemical composition are used as reference fields for this validation. The results show fair agreement between experimental and numerical data. In the last part, OpenFOAM is employed to simulate several operating conditions, and an Emission Characteristic Map of the combustion system is generated. The sources of high NOₓ production inside the OTSG are identified and correlated with the physics of the flow. CFD is, therefore, a useful tool for providing insight into the NOₓ emission phenomena in OTSGs: sources of high NOₓ production can be identified, and operating conditions can be adjusted accordingly. With the help of RANS simulations, an Emission Characteristics Map can be produced and then used as a guide for a field tune-up.
Keywords: combustion, computational fluid dynamics, nitrous oxides emission, once-through-steam-generators
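An illustrative post-processing sketch for assembling an Emission Characteristic Map of the kind described above: NOₓ results at a handful of scattered operating points are interpolated onto a regular grid (scipy assumed available). The operating variables, values, and units are invented, not taken from the paper's simulations.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical RANS results: (firing rate %, excess air %) -> NOx (ppm).
points = np.array([[60, 10], [60, 25], [80, 10], [80, 25], [100, 10], [100, 25]])
nox = np.array([38.0, 30.0, 52.0, 41.0, 70.0, 55.0])

# Regular grid covering the operating envelope.
load, air = np.meshgrid(np.linspace(60, 100, 5), np.linspace(10, 25, 4))
ecm = griddata(points, nox, (load, air), method="linear")

print(np.round(ecm, 1))  # rows: excess-air levels, cols: firing rates
```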
Procedia PDF Downloads 113
9873 Thermal Analysis of a Graphite Calorimeter for the Measurement of Absorbed Dose for Therapeutic X-Ray Beam
Authors: I.J. Kim, B.C. Kim, J.H. Kim, C.-Y. Yi
Abstract:
Heat transfer in a graphite calorimeter is analyzed using the finite element method, with the calorimeter modeled in 3D geometry. Quasi-adiabatic mode operation is realized in the simulation, and the temperature rises produced by the different sources, the ionizing radiation and the electric heaters, are compared directly. The temperature distribution caused by the electric power differed considerably from that caused by the ionizing radiation because of the heaters' point-like, localized heating. However, the temperature rises finally read by the sensing thermistors agreed with each other to within 0.02%.
Keywords: graphite calorimeter, finite element analysis, heat transfer, quasi-adiabatic mode
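For context, the quantity a graphite calorimeter ultimately yields is the absorbed dose inferred from the measured temperature rise, D = c·ΔT. A quick sketch follows; the specific heat is a typical literature figure for graphite and the ΔT is invented, neither being a value from this paper.

```python
C_GRAPHITE = 710.0  # specific heat of graphite, J/(kg*K) -- typical literature value

def absorbed_dose_gray(delta_t_kelvin, specific_heat=C_GRAPHITE):
    """Absorbed dose D = c * dT (J/kg = Gy), ignoring heat-loss corrections."""
    return specific_heat * delta_t_kelvin

# A 1 Gy dose heats graphite by only ~1.4 mK, which is why thermistor
# resolution and quasi-adiabatic operation matter.
print(absorbed_dose_gray(1.4e-3), "Gy")  # hypothetical measured rise
```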
Procedia PDF Downloads 430
9872 Generating Spherical Surface of Wear Drain in Cutting Metal by Finite Element Method Analysis
Authors: D. Kabeya Nahum, L. Y. Kabeya Mukeba
Abstract:
This work addresses the design of surface defects on the support of an anchor rod ball joint. The future adhesion contact, subject to rocking during manufacturing machining, is treated through the numerical analysis of a short, simple solution of the coupled thermo-mechanical problem in process engineering. Geometrical evaluation and the quasi-static and dynamic states are discussed in terms of kinematic dimensional tolerances on the surfaces of the part. Geometric modeling of the rough part in this phase using the finite element method (FEM) provides an opportunity to resolve the nonlinear behavior observed in empirical data and to improve the discrete functional surfaces. The open question here is how to obtain the spherical geometry of the wear drain through the rolling operation. The formulation, with a (1 ± 0.01) mm thickness near the wear-drain semi-finishing tool for studying different angles, does not help the professional factor in metal-cutting design with respect to vibration, friction, and the solid-solid interface of part and tool during this complex physical process, whose multiple parameters are not defined in Sobolev spaces. The stochastic approach to cracking, wear, and fretting due to the cutting forces acting on the small-thickness boundary layers of the workpiece and the tool in the machining position is predicted close to the ‘Yakam Matrix’.
Keywords: FEM, geometry, part, simulation, spherical surface engineering, tool, workpiece
Procedia PDF Downloads 273