Search results for: more comprehensive and accurate safety data
22752 Measurement of Operational and Environmental Performance of the Coal-Fired Power Plants in India by Using Data Envelopment Analysis
Authors: Vijay Kumar Bajpai, Sudhir Kumar Singh
Abstract:
In this study, performance analyses of twenty-five coal-fired power plants (CFPPs) used for electricity generation are carried out through various data envelopment analysis (DEA) models. Three efficiency indices are defined and pursued. In the calculation of operational performance, energy and non-energy variables are used as inputs, and net electricity produced is used as the desired output. CO2 emitted to the environment is used as the undesired output in the computation of pure environmental performance, while in Model-3 CO2 emissions are treated as a detrimental input in the calculation of combined operational and environmental performance. Empirical results show that most of the plants operate in the increasing-returns-to-scale region and that the Mettur plant is the efficient one with regard to energy use and the environment. The results also indicate that the undesirable-output effect is insignificant in the research sample. The present study provides clues to plant operators for raising the operational and environmental performance of CFPPs.
Keywords: coal fired power plants, environmental performance, data envelopment analysis, operational performance
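A minimal sketch of an input-oriented CCR DEA efficiency calculation of the kind described in the abstract, solved as a linear program with SciPy; the input/output matrix below is a made-up placeholder, not the study's plant data, and the paper's specific Model-1/2/3 formulations are not reproduced here.

```python
# Input-oriented CCR DEA (envelopment form): min theta s.t. the composite unit
# built from lambdas uses no more input than theta * inputs of DMU o and
# produces at least its outputs. Data below are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0, 210.0],    # inputs per plant, e.g. coal use, capacity (assumed)
              [150.0, 250.0],
              [ 90.0, 200.0],
              [200.0, 300.0]])
Y = np.array([[ 95.0],           # output per plant, e.g. net electricity (assumed)
              [100.0],
              [ 88.0],
              [120.0]])

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`."""
    n, m = X.shape           # n DMUs, m inputs
    s = Y.shape[1]           # s outputs
    c = np.zeros(n + 1)      # decision variables: [theta, lambda_1 ... lambda_n]
    c[0] = 1.0               # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):       # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):       # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0.0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```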
Procedia PDF Downloads 459
22751 Evaluation of Settlement of Coastal Embankments Using Finite Elements Method
Authors: Sina Fadaie, Seyed Abolhassan Naeini
Abstract:
Coastal embankments play an important role in coastal structures by reducing the effect of wave forces and controlling the movement of sediments. Many coastal areas are underlain by weak and compressible soils. Estimating the settlement of coastal embankments during construction is therefore highly important in the design and safety control of embankments and appurtenant structures. Accordingly, selecting and establishing an appropriate model with a reasonable level of complexity is one of the challenges for engineers. Although there are advanced models in the literature regarding the design of embankments, there is not enough information on the prediction of their associated settlement, particularly in coastal areas with considerable soft soils. Marine engineering study in Iran is important due to the existence of two important coastal areas located in the northern and southern parts of the country. In the present study, the validity of Terzaghi's consolidation theory has been investigated. In addition, the settlement of these coastal embankments during construction is predicted using dedicated procedures in the PLAXIS software, with the help of appropriate boundary conditions and soil layers. The results indicate that, for the existing soil conditions at the site, certain parameters are important to consider in the analysis. Consequently, a model is introduced to estimate the settlement of the embankments in such geotechnical conditions.
Keywords: consolidation, settlement, coastal embankments, numerical methods, finite elements method
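For context, a small sketch of the classical Terzaghi one-dimensional consolidation check that such PLAXIS settlement predictions are often compared against; the soil parameters below are assumed illustrative values, not the site data.

```python
# Average degree of consolidation U(Tv) from Terzaghi's 1-D theory, and the
# corresponding settlement at a few construction times. All inputs are assumed.
import numpy as np

def degree_of_consolidation(Tv, n_terms=100):
    """Average degree of consolidation U for time factor Tv (series solution)."""
    m = np.arange(n_terms)
    M = np.pi * (2 * m + 1) / 2.0
    return 1.0 - np.sum((2.0 / M**2) * np.exp(-(M**2) * Tv))

cv = 2.0e-7        # coefficient of consolidation [m^2/s] (assumed)
H_dr = 4.0         # drainage path length [m] (assumed, single drainage)
s_final = 0.35     # final primary consolidation settlement [m] (assumed)

for t_days in (30, 90, 180, 365):
    t = t_days * 86400.0
    Tv = cv * t / H_dr**2
    U = degree_of_consolidation(Tv)
    print(f"t = {t_days:4d} d  Tv = {Tv:.3f}  U = {U:.2f}  settlement ~ {U * s_final:.3f} m")
```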
Procedia PDF Downloads 164
22750 The Relationship between Basic Human Needs and Opportunity Based on Social Progress Index
Authors: Ebru Ozgur Guler, Huseyin Guler, Sera Sanli
Abstract:
The Social Progress Index (SPI), whose foundations were laid at the World Economic Forum, is an index that aims to form a systematic basis for guiding strategy for inclusive growth, which requires achieving both economic and social progress. This research aims to determine the relations between the "Basic Human Needs" (BHN) dimension (comprising the four variables 'Nutrition and Basic Medical Care', 'Water and Sanitation', 'Shelter', and 'Personal Safety') and the "Opportunity" (OPT) dimension (composed of the 'Personal Rights', 'Personal Freedom and Choice', 'Tolerance and Inclusion', and 'Access to Advanced Education' components) of the 2016 SPI for 138 countries listed on the website of the Social Progress Imperative, by carrying out canonical correlation analysis (CCA), a data reduction technique that operates so as to maximize the correlation between two variable sets. In the interpretation of results, the first pair of canonical variates, which points to the highest canonical correlation, has been taken into account. The first canonical correlation coefficient has been found to be 0.880, indicating a high relationship between the BHN and OPT variable sets. Wilks' Lambda statistic has revealed an overall effect of 0.809, large enough for the full model to be counted as statistically significant (with a p-value of 0.000). According to the standardized canonical coefficients, the largest contribution to the BHN set of variables has come from the 'shelter' variable. The most effective variable in the OPT set has been found to be 'access to advanced education'. Findings based on canonical loadings have also confirmed these results with respect to the contributions to the first canonical variates. When canonical cross loadings (structure coefficients) are examined for the first pair of canonical variates, the largest contributions are provided by the 'shelter' and 'access to advanced education' variables. Since the signs of the structure coefficients are negative for all variables, all OPT variables are positively related to all BHN variables. When canonical communality coefficients, which are the sums of the squares of the structure coefficients across all interpretable functions, are taken as the basis, the 'personal rights' and 'tolerance and inclusion' variables can be said not to be useful in the model, with coefficients of 0.318721 and 0.341722, respectively. On the other hand, while the redundancy index for the BHN set has been found to be 0.615, the OPT set has a lower redundancy index of 0.475. High redundancy implies high predictive ability. The proportion of the total variation in the BHN set of variables that is explained by all of the opposite canonical variates has been calculated as 63%, and the proportion of the total variation in the OPT set that is explained by all of the canonical variates in the BHN set has been determined as 50.4%, a large part of which belongs to the first pair. The results suggest that there is a high and statistically significant relationship between BHN and OPT, accounted for mainly by 'shelter' and 'access to advanced education'.
Keywords: canonical communality coefficient, canonical correlation analysis, redundancy index, social progress index
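A rough sketch of the canonical correlation computation described in the abstract, using scikit-learn on synthetic stand-in data; the variable names mirror the SPI components, but the numbers are randomly generated, not the 2016 SPI values.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 138  # countries

# BHN set: nutrition & basic medical care, water & sanitation, shelter, personal safety
BHN = rng.normal(size=(n, 4))
# OPT set: personal rights, personal freedom & choice, tolerance & inclusion, access to advanced education
OPT = 0.8 * BHN @ rng.normal(size=(4, 4)) + 0.5 * rng.normal(size=(n, 4))

cca = CCA(n_components=4)
U, V = cca.fit_transform(BHN, OPT)   # canonical variates for each set

# first canonical correlation (analogous to the 0.880 reported in the abstract)
r1 = np.corrcoef(U[:, 0], V[:, 0])[0, 1]
print(f"first canonical correlation: {r1:.3f}")

# canonical loadings: correlation of each original variable with its own first variate
loadings_BHN = [np.corrcoef(BHN[:, j], U[:, 0])[0, 1] for j in range(4)]
print("BHN loadings on first variate:", np.round(loadings_BHN, 3))
```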
Procedia PDF Downloads 224
22749 Development of a Mixed-Reality Hands-Free Teleoperated Robotic Arm for Construction Applications
Authors: Damith Tennakoon, Mojgan Jadidi, Seyedreza Razavialavi
Abstract:
With recent advancements of automation in robotics, from self-driving cars to autonomous 4-legged quadrupeds, one industry that has been stagnant is the construction industry. The methodologies used in a modern-day construction site consist of arduous physical labor and the use of heavy machinery, which has not changed over the past few decades. The dangers of a modern-day construction site affect the health and safety of the workers due to performing tasks such as lifting and moving heavy objects and having to maintain unhealthy posture to complete repetitive tasks such as painting, installing drywall, and laying bricks. Further, training for heavy machinery is costly and requires a lot of time due to their complex control inputs. The main focus of this research is using immersive wearable technology and robotic arms to perform the complex and intricate skills of modern-day construction workers while alleviating the physical labor requirements to perform their day-to-day tasks. The methodology consists of mounting a stereo vision camera, the ZED Mini by Stereolabs, onto the end effector of an industrial grade robotic arm, streaming the video feed into the Virtual Reality (VR) Meta Quest 2 (Quest 2) head-mounted display (HMD). Due to the nature of stereo vision, and the similar field-of-views between the stereo camera and the Quest 2, human-vision can be replicated on the HMD. The main advantage this type of camera provides over a traditional monocular camera is it gives the user wearing the HMD a sense of the depth of the camera scene, specifically, a first-person view of the robotic arm’s end effector. Utilizing the built-in cameras of the Quest 2 HMD, open-source hand-tracking libraries from OpenXR can be implemented to track the user’s hands in real-time. A mixed-reality (XR) Unity application can be developed to localize the operator's physical hand motions with the end-effector of the robotic arm. Implementing gesture controls will enable the user to move the robotic arm and control its end-effector by moving the operator’s arm and providing gesture inputs from a distant location. Given that the end effector of the robotic arm is a gripper tool, gripping and opening the operator’s hand will translate to the gripper of the robot arm grabbing or releasing an object. This human-robot interaction approach provides many benefits within the construction industry. First, the operator’s safety will be increased substantially as they can be away from the site-location while still being able perform complex tasks such as moving heavy objects from place to place or performing repetitive tasks such as painting walls and laying bricks. The immersive interface enables precision robotic arm control and requires minimal training and knowledge of robotic arm manipulation, which lowers the cost for operator training. This human-robot interface can be extended to many applications, such as handling nuclear accident/waste cleanup, underwater repairs, deep space missions, and manufacturing and fabrication within factories. Further, the robotic arm can be mounted onto existing mobile robots to provide access to hazardous environments, including power plants, burning buildings, and high-altitude repair sites.Keywords: construction automation, human-robot interaction, hand-tracking, mixed reality
Procedia PDF Downloads 85
22748 Algorithm Development of Individual Lumped Parameter Modelling for Blood Circulatory System: An Optimization Study
Authors: Bao Li, Aike Qiao, Gaoyang Li, Youjun Liu
Abstract:
Background: The lumped parameter model (LPM) is a common numerical model for hemodynamic calculation. An LPM uses circuit elements to simulate the human blood circulatory system, and physiological indicators and characteristics can be acquired from the model. However, because the physiological indicators of each individual differ, the parameters in an LPM should be personalized in order to obtain convincing calculated results that reflect the individual's physiological information. This study aimed to develop an automatic and effective optimization method to personalize the parameters in an LPM of the blood circulatory system, which is of great significance for the numerical simulation of individual hemodynamics. Methods: A closed-loop LPM of the human blood circulatory system that is applicable to most persons was established based on anatomical structures and physiological parameters. The patient-specific physiological data of 5 volunteers were non-invasively collected as the personalization objectives of the individual LPMs. In this study, the blood pressure and flow rate of the heart, brain, and limbs were the main concerns. The collected systolic blood pressure, diastolic blood pressure, cardiac output, and heart rate were set as objective data, and the waveforms of carotid artery flow and ankle pressure were set as objective waveforms. With respect to the collected data and waveforms, a sensitivity analysis of each parameter in the LPM was conducted to determine the sensitive parameters that have an obvious influence on the objectives. Simulated annealing was adopted to iteratively optimize the sensitive parameters, and the objective function during optimization was the root mean square error between the collected waveforms and data and the simulated waveforms and data. Each parameter in the LPM was optimized 500 times. Results: The sensitive parameters in the LPM were optimized according to the collected data of the 5 individuals. Results show a slight error between the collected and simulated data: the average relative root mean square errors over all optimization objectives of the 5 samples were 2.21%, 3.59%, 4.75%, 4.24%, and 3.56%, respectively. Conclusions: The slight error demonstrates the good performance of the optimization. The individual modeling algorithm developed in this study can effectively achieve the individualization of an LPM of the blood circulatory system. An LPM with individual parameters can output individual physiological indicators after optimization, which are applicable for the numerical simulation of patient-specific hemodynamics.
Keywords: blood circulatory system, individual physiological indicators, lumped parameter model, optimization algorithm
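A stripped-down sketch of the optimization loop described in the abstract: simulated annealing minimizing the RMSE between collected and simulated signals. Here a two-parameter exponential decay stands in for the full closed-loop LPM, and the "measured" waveform is synthetic rather than a volunteer's recording.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.8, 200)                       # one diastolic interval [s]

def model(params, t):
    p0, tau = params                                 # e.g. pressure scale, R*C time constant
    return p0 * np.exp(-t / tau)

true_params = np.array([120.0, 1.1])
measured = model(true_params, t) + rng.normal(0.0, 1.0, t.size)  # synthetic "collected" waveform

def rmse(params):
    return np.sqrt(np.mean((model(params, t) - measured) ** 2))

params = np.array([80.0, 0.5])                       # initial guess for the sensitive parameters
best, best_err = params.copy(), rmse(params)
T = 1.0
for it in range(500):                                # 500 iterations (the abstract optimizes 500 times)
    candidate = np.maximum(params + rng.normal(0.0, [5.0, 0.05]), 1e-3)
    err_new, err_old = rmse(candidate), rmse(params)
    # accept improvements always, worse moves with a temperature-dependent probability
    if err_new < err_old or rng.random() < np.exp(-(err_new - err_old) / T):
        params = candidate
    if err_new < best_err:
        best, best_err = candidate.copy(), err_new
    T *= 0.99                                        # cooling schedule

print("fitted parameters:", np.round(best, 3), " RMSE:", round(best_err, 3))
```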
Procedia PDF Downloads 141
22747 Estimating Water Balance at Beterou Watershed, Benin Using Soil and Water Assessment Tool (SWAT) Model
Authors: Ella Sèdé Maforikan
Abstract:
Sustainable water management requires quantitative information and knowledge of the spatiotemporal dynamics of the hydrological system within the basin, which can be achieved through research. Several studies have investigated both surface water and groundwater in the Beterou catchment; however, there are few published papers on the application of SWAT modeling in the Beterou catchment. The objective of this study was to evaluate the performance of SWAT in simulating the water balance within the watershed. The input data consist of a digital elevation model, land use maps, a soil map, climatic data, and discharge records. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI2) approach. Calibration ran from 1989 to 2006 with a four-year warm-up period (1985-1988), and validation covered 2007 to 2020. The goodness of fit of the model was assessed using five indices: the Nash–Sutcliffe efficiency (NSE), the ratio of the root mean square error to the standard deviation of measured data (RSR), percent bias (PBIAS), the coefficient of determination (R²), and the Kling-Gupta efficiency (KGE). Results showed that the SWAT model successfully simulated river flow in the Beterou catchment, with NSE = 0.79, R² = 0.80, and KGE = 0.83 for calibration, against NSE = 0.78, R² = 0.78, and KGE = 0.85 for validation, using site-based streamflow data. The relative error (PBIAS) ranges from -12.2% to 3.1%. The runoff curve number (CN2), moist bulk density (SOL_BD), baseflow alpha factor (ALPHA_BF), and available water capacity of the soil layer (SOL_AWC) were the most sensitive parameters. The study provides a basis for further research with uncertainty analysis, recommendations for model improvement, and an efficient means to improve rainfall and discharge measurement data.
Keywords: watershed, water balance, SWAT modeling, Beterou
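A hedged sketch of the goodness-of-fit indices named in the abstract (NSE, RSR, PBIAS, R², KGE), implemented with NumPy; the observed and simulated discharge arrays are placeholders, not the Beterou records.

```python
import numpy as np

def nse(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rsr(obs, sim):
    return np.sqrt(np.sum((obs - sim) ** 2)) / np.sqrt(np.sum((obs - obs.mean()) ** 2))

def pbias(obs, sim):
    # positive values indicate model underestimation (SWAT sign convention)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r2(obs, sim):
    return np.corrcoef(obs, sim)[0, 1] ** 2

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()          # variability ratio
    beta = sim.mean() / obs.mean()         # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([12.0, 30.5, 55.2, 80.1, 41.3, 22.8])   # observed discharge (placeholder)
sim = np.array([10.8, 28.9, 60.0, 75.4, 44.0, 20.1])   # simulated discharge (placeholder)

print(f"NSE={nse(obs, sim):.2f}  RSR={rsr(obs, sim):.2f}  "
      f"PBIAS={pbias(obs, sim):.1f}%  R2={r2(obs, sim):.2f}  KGE={kge(obs, sim):.2f}")
```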
Procedia PDF Downloads 60
22746 Experimental and Numerical Investigation on Delaminated Composite Plate
Authors: Sreekanth T. G., Kishorekumar S., Sowndhariya Kumar J., Karthick R., Shanmugasuriyan S.
Abstract:
Composites are increasingly being used in industries due to their unique properties, such as high specific stiffness and specific strength, higher fatigue and wear resistance, and higher damage tolerance capability. Composites are prone to failures or damage that are difficult to identify, locate, and characterize due to their complex design features and complicated loading conditions. The lack of understanding of the damage mechanisms of composites leads to uncertainties in structural integrity and durability. Delamination is one of the most critical failure mechanisms in laminated composites because it progressively affects the mechanical performance of fiber-reinforced polymer composite structures over time. The identification and severity characterization of delamination in engineering fields such as the aviation industry is critical for both safety and economic reasons. The presence of delamination alters the vibration properties of composites, such as natural frequencies and mode shapes. In this study, numerical and experimental analyses were performed on delaminated and non-delaminated glass fiber reinforced polymer (GFRP) plates, the numerical and experimental results were compared, and the error percentage was determined.
Keywords: composites, delamination, natural frequency, mode shapes
Procedia PDF Downloads 112
22745 A Study of Carbon Emissions during Building Construction
Authors: Jonggeon Lee, Sungho Tae, Sungjoon Suk, Keunhyeok Yang, George Ford, Michael E. Smith, Omidreza Shoghli
Abstract:
In recent years, research to reduce carbon emissions through quantitative assessment of building life cycle carbon emissions has been performed in the construction industry. However, most research efforts related to building carbon emissions assessment have focused on the operational phase of a building's life span, and few comprehensive studies of the carbon emissions during a building's construction phase have been performed. The purpose of this study is to propose an assessment method that quantitatively evaluates the carbon emissions of buildings during the construction phase. The study analysed the amount of carbon emissions produced by 17 construction trades and selected four construction trades that result in high levels of carbon emissions: reinforced concrete work, sheathing work, foundation work, and form work. The building materials and the construction and transport equipment used for the selected construction trades were identified, and the carbon emissions produced by the identified materials and equipment were calculated for these four trades. The energy consumption of construction and transport equipment was calculated by analysing fuel efficiency and equipment productivity rates. The combination of the expected levels of carbon emissions associated with the utilization of building materials and construction equipment provides a means for estimating the quantity of carbon emissions related to the construction phase of a building's life cycle. The proposed carbon emissions assessment method was validated by case studies.
Keywords: building construction phase, carbon emissions assessment, building life cycle
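An illustrative back-of-the-envelope version of the assessment idea: construction-phase CO2 as material quantities times embodied-carbon factors plus equipment fuel use times a fuel emission factor. All quantities and factors below are assumed placeholders, not the study's coefficients.

```python
material_quantities = {          # one trade, e.g. reinforced concrete work (assumed)
    "ready_mix_concrete_m3": 850.0,
    "rebar_t": 95.0,
    "formwork_plywood_m2": 2400.0,
}
material_factors = {             # kg CO2 per unit (assumed values)
    "ready_mix_concrete_m3": 410.0,
    "rebar_t": 1900.0,
    "formwork_plywood_m2": 5.2,
}
equipment_fuel_l = {"concrete_pump": 600.0, "tower_crane": 900.0, "dump_truck": 1500.0}
diesel_factor = 2.68             # kg CO2 per litre of diesel (assumed)

material_co2 = sum(material_quantities[k] * material_factors[k] for k in material_quantities)
equipment_co2 = sum(v * diesel_factor for v in equipment_fuel_l.values())

print(f"materials : {material_co2 / 1000:.1f} t CO2")
print(f"equipment : {equipment_co2 / 1000:.1f} t CO2")
print(f"total     : {(material_co2 + equipment_co2) / 1000:.1f} t CO2")
```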
Procedia PDF Downloads 756
22744 BER Estimate of WCDMA Systems with MATLAB Simulation Model
Authors: Suyeb Ahmed Khan, Mahmood Mian
Abstract:
Simulation plays an important role during all phases of the design and engineering of communication systems, from the early stages of conceptual design through the various stages of implementation, testing, and fielding of the system. In the present paper, a simulation model has been constructed for the WCDMA system in order to evaluate its performance. This model describes multi-user effects and the calculation of the BER (Bit Error Rate) in 3G mobile systems using Simulink in MATLAB 7.1. A Gaussian approximation defines the multi-user effect on system performance. The BER has been analyzed by comparing transmitted and received data.
Keywords: WCDMA, simulations, BER, MATLAB
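A minimal sketch of a BER evaluation of this kind, with the multi-user interference lumped into Gaussian noise in the spirit of the Gaussian approximation; the parameters are illustrative and not taken from the Simulink model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_bits = 100_000
ebn0_db = 4.0                                    # Eb/N0 in dB (assumed)

bits = rng.integers(0, 2, n_bits)                # transmitted data
symbols = 2 * bits - 1                           # BPSK-style mapping: 0 -> -1, 1 -> +1
ebn0 = 10 ** (ebn0_db / 10)
noise = rng.normal(0.0, np.sqrt(1.0 / (2 * ebn0)), n_bits)
received = symbols + noise                       # channel + interference modelled as Gaussian noise
decided = (received > 0).astype(int)             # hard decision at the receiver

ber = np.mean(decided != bits)                   # BER = bit errors / total bits
print(f"Eb/N0 = {ebn0_db} dB  ->  BER ~ {ber:.4f}")
```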
Procedia PDF Downloads 595
22743 A Pipeline for Detecting Copy Number Variation from Whole Exome Sequencing Using Comprehensive Tools
Authors: Cheng-Yang Lee, Petrus Tang, Tzu-Hao Chang
Abstract:
Copy number variations (CNVs) play an important role in many kinds of human diseases, such as autism, schizophrenia, and a number of cancers. Many disease-related variants are found in genome coding regions, and whole exome sequencing (WES) is a cost-effective and powerful technology for detecting variants that are enriched in exons, with potential applications in clinical settings. Although several algorithms have been developed to detect CNVs using WES and have been compared with other algorithms to find the most suitable methods using their own samples, there have been no consistent datasets across most algorithms with which to evaluate their CNV detection ability. On the other hand, most algorithms use a command-line interface, which may greatly limit the analysis capability of many laboratories. We create a series of simulated WES datasets from UCSC hg19 chromosome 22 and then evaluate the CNV detection ability of 19 algorithms from the OMICtools database using our simulated WES datasets. We compute the sensitivity, specificity, and accuracy of each algorithm for validation of the exome-derived CNVs. After comparison of the 19 algorithms from the OMICtools database, we construct a platform that installs all of the algorithms in a virtual machine, such as VirtualBox, which can be established conveniently on local computers, and then create a simple script that can easily be used for detecting CNVs with the algorithms selected by users. We also build a table elaborating many kinds of properties, such as input requirements and CNV detection ability, for all of the algorithms, which provides users a specification for choosing the optimum algorithms.
Keywords: whole exome sequencing, copy number variations, omictools, pipeline
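A small sketch of the validation metrics mentioned in the abstract, computed from a confusion matrix of called versus simulated ("truth") CNV regions; the counts are invented for illustration, not results from the 19 evaluated algorithms.

```python
def evaluate(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)            # fraction of true CNV regions recovered
    specificity = tn / (tn + fp)            # fraction of non-CNV regions correctly rejected
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# hypothetical counts for one CNV caller on one simulated chromosome-22 dataset
tp, fp, tn, fn = 87, 12, 940, 21
sens, spec, acc = evaluate(tp, fp, tn, fn)
print(f"sensitivity={sens:.3f}  specificity={spec:.3f}  accuracy={acc:.3f}")
```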
Procedia PDF Downloads 323
22742 The Effects of Emotional Working Memory Training on Trait Anxiety
Authors: Gabrielle Veloso, Welison Ty
Abstract:
Trait anxiety is a pervasive tendency to attend to and experience fears and worries to a disproportionate degree, across various situations. This study sought to determine if participants who undergo emotional working memory training will have significantly lower scores on the trait anxiety scales post-intervention. The study also sought to determine if emotional regulation mediated the relationship between working memory training and trait anxiety. Forty-nine participants underwent 20 days of computerized emotional working memory training called Emotional Dual n-back, which involves viewing a continuous stream of emotional content on a grid, and then remembering the location and color of items presented on the grid. Participants of the treatment group had significantly lower trait anxiety compared to controls post-intervention. Mediation analysis determined that working memory training had no significant relationship to anxiety as measured by the Beck’s Anxiety Inventory-Trait (BAIT), but was significantly related to anxiety as measured by form Y2 of the Spielberger State-Trait Anxiety Inventory (STAI-Y2). Emotion regulation, as measured by the Emotional Regulation Questionnaire (ERQ), was found not to mediate between working memory training and trait anxiety reduction. Results suggest that working memory training may be useful in reducing psychoemotional symptoms rather than somatic symptoms of trait anxiety. Moreover, it proposes for future research to further look into the mediating role of emotion regulation via neuroimaging and the development of more comprehensive measures of emotion regulation.Keywords: anxiety, emotion regulation, working-memory, working-memory training
Procedia PDF Downloads 156
22741 The Employment of Unmanned Aircraft Systems for Identification and Classification of Helicopter Landing Zones and Airdrop Zones in Calamity Situations
Authors: Marielcio Lacerda, Angelo Paulino, Elcio Shiguemori, Alvaro Damiao, Lamartine Guimaraes, Camila Anjos
Abstract:
Accurate information about the terrain is extremely important in disaster management activities or conflict. This paper proposes the use of the Unmanned Aircraft Systems (UAS) at the identification of Airdrop Zones (AZs) and Helicopter Landing Zones (HLZs). In this paper we consider the AZs the zones where troops or supplies are dropped by parachute, and HLZs areas where victims can be rescued. The use of digital image processing enables the automatic generation of an orthorectified mosaic and an actual Digital Surface Model (DSM). This methodology allows obtaining this fundamental information to the terrain’s comprehension post-disaster in a short amount of time and with good accuracy. In order to get the identification and classification of AZs and HLZs images from DJI drone, model Phantom 4 have been used. The images were obtained with the knowledge and authorization of the responsible sectors and were duly registered in the control agencies. The flight was performed on May 24, 2017, and approximately 1,300 images were obtained during approximately 1 hour of flight. Afterward, new attributes were generated by Feature Extraction (FE) from the original images. The use of multispectral images and complementary attributes generated independently from them increases the accuracy of classification. The attributes of this work include the Declivity Map and Principal Component Analysis (PCA). For the classification four distinct classes were considered: HLZ 1 – small size (18m x 18m); HLZ 2 – medium size (23m x 23m); HLZ 3 – large size (28m x 28m); AZ (100m x 100m). The Decision Tree method Random Forest (RF) was used in this work. RF is a classification method that uses a large collection of de-correlated decision trees. Different random sets of samples are used as sampled objects. The results of classification from each tree and for each object is called a class vote. The resulting classification is decided by a majority of class votes. In this case, we used 200 trees for the execution of RF in the software WEKA 3.8. The classification result was visualized on QGIS Desktop 2.12.3. Through the methodology used, it was possible to classify in the study area: 6 areas as HLZ 1, 6 areas as HLZ 2, 4 areas as HLZ 3; and 2 areas as AZ. It should be noted that an area classified as AZ covers the classifications of the other classes, and may be used as AZ, HLZ of large size (HLZ3), medium size (HLZ2) and small size helicopters (HLZ1). Likewise, an area classified as HLZ for large rotary wing aircraft (HLZ3) covers the smaller area classifications, and so on. It was concluded that images obtained through small UAV are of great use in calamity situations since they can provide data with high accuracy, with low cost, low risk and ease and agility in obtaining aerial photographs. This allows the generation, in a short time, of information about the features of the terrain in order to serve as an important decision support tool.Keywords: disaster management, unmanned aircraft systems, helicopter landing zones, airdrop zones, random forest
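A rough sketch of the Random Forest classification step described above; the study ran RF with 200 trees in WEKA 3.8, whereas this sketch uses scikit-learn, and the feature matrix is synthetic rather than the actual spectral bands, declivity map, and PCA attributes per image object.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_objects, n_features = 1300, 6                 # ~1,300 images/objects, a handful of attributes
X = rng.normal(size=(n_objects, n_features))
# synthetic labels loosely tied to the features; 0: HLZ1, 1: HLZ2, 2: HLZ3, 3: AZ
y = np.digitize(X[:, 0] + 0.5 * X[:, 1], bins=[-1.0, 0.0, 1.0])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)  # 200 de-correlated decision trees
rf.fit(X_train, y_train)

# each tree casts a "class vote"; the forest prediction is the majority vote
print("test accuracy:", round(rf.score(X_test, y_test), 3))
print("feature importances:", np.round(rf.feature_importances_, 3))
```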
Procedia PDF Downloads 179
22740 Study of the Late Phase of Core Degradation during Reflooding by Safety Injection System for VVER1000 with ASTECv2 Computer Code
Authors: Antoaneta Stefanova, Rositsa Gencheva, Pavlin Groudev
Abstract:
This paper presents the modeling approach for an SBO sequence in VVER 1000 reactors, describes the reactor core behavior in the late in-vessel phase in the case of late reflooding by the HPIS, and gives preliminary results for the ASTECv2 validation. The work focuses on the investigation of plant behavior during a total loss of power and on the operator actions. The main goal of these analyses is to assess the phenomena arising during a station blackout (SBO) followed by reflooding of the already damaged reactor core by the primary-side high pressure injection system (HPIS) at a very late 'in-vessel' phase. The purpose of the analysis is to determine how late switching on of the HPIS can delay the time of vessel failure or possibly avoid vessel failure. For this purpose, an SBO scenario has been simulated with injection of cold water by a high pressure pump (HPP) into the cold leg at different stages of core degradation. The times for HPP injection were chosen based on previously performed investigations.
Keywords: VVER, operator action validation, reflooding of overheated reactor core, ASTEC computer code
Procedia PDF Downloads 417
22739 Study of the Mega–Landslide at the Community of Ropoto, Central Greece, and of the Design of Mitigation and Early Warning System Using the Fiber Bragg Grating Technology
Authors: Michael Bellas, George Voulgaridis
Abstract:
This paper refers to the well-known mega-landslide at the community of Ropoto, belonging to the Municipality of Trikala in the central part of Greece. The landslide affected the debris as well as the colluvium mantle of the flysch and constitutes a special case study in engineering geology and geotechnical engineering, not only because of the size of the domain affected by the landslide (approximately 750 m long) but also because of the geostructure's global behavior. Due to the landslide, the whole community's infrastructure massively collapsed and human lives were put in danger. After complete simulation of the coupled seepage-deformation phenomenon due to the extreme rainfall, and by closely examining the slope's global behavior, both the mitigation of the landslide and an advanced surveillance method (Fiber Bragg Grating) using fiber optics were further studied, in order both to retain the geostructure and to monitor its health by creating an early warning system, which would serve as a complete safety net for saving both the community's infrastructure and the lives of its inhabitants.
Keywords: landslide, remediation measures, the finite element method (FEM), Fiber Bragg Grating (FBG) sensing method
Procedia PDF Downloads 333
22738 The Impact of Transformational Leadership on Individual Attributes
Authors: Bilal Liaqat, Muhammad Umar, Zara Bashir, Hassan Rafique, Mohsin Abbasi, Zarak Khan
Abstract:
Transformational leadership is one of the most studied topics in the organization sciences. However, the impact of transformational leadership on employees' individual attributes has not yet been studied. Purpose: This research aims to discover the relationship between transformational leadership and employee motivation, performance, and creativity. Moreover, the study also investigates the influence of transformational leadership on employee performance through employee motivation and employee creativity. Design-Methodology-Approach: This cross-sectional study collected survey data from employees in different organizations. Structured interviews were also conducted to explain the outcomes of the survey. Findings: The results of this study reveal that transformational leadership has a positive impact on employees' individual attributes. Research Implications: Although this study expands our knowledge about the role of learning orientation between transformational leadership and employee motivation, performance, and creativity, prospects for further research remain.
Keywords: employee creativity, employee motivation, employee performance, transformational leadership
Procedia PDF Downloads 234
22737 Numerical Approach for Characterization of Flow Field in Pump Intake Using Two Phase Model: Detached Eddy Simulation
Authors: Rahul Paliwal, Gulshan Maheshwari, Anant S. Jhaveri, Channamallikarjun S. Mathpati
Abstract:
Large pumping facilities are a necessary requirement of the cooling water systems for power plants, process and manufacturing facilities, flood control, and water or wastewater treatment plants. With large capacities of a few hundred to 50,000 m3/hr, care must be taken to ensure uniform flow to the pump in order to limit vibration, flow-induced cavitation, and performance problems due to the formation of air-entraining vortices and swirl flow. Successful prediction of these phenomena requires a numerical method and turbulence model that can characterize the dynamics of these flows. In past years, single-phase shear stress transport (SST) and other Reynolds-averaged Navier-Stokes models (like k-ε, k-ω, and RSM) were used to predict the behavior of the flow. A literature study showed that a two-phase model would be more accurate than a single-phase model. In this paper, a 3D geometry simulated using detached eddy simulation (DES) is used to predict the behavior of the fluid, and the results are compared with experimental results. The effect of different grid structures and boundary conditions is also studied. It is observed that the two-phase flow model can more accurately predict the mean flow and turbulence statistics compared to the steady SST model. This validated model will be used for further analysis of vortex structures in a lab-scale model, to generate their frequency plots and intensity at different locations in the set-up. This study will help in minimizing the ill effects of vortices on pump performance.
Keywords: grid structure, pump intake, simulation, vibration, vortex
Procedia PDF Downloads 177
22736 Proposal Method of Prediction of the Early Stages of Dementia Using IoT and Magnet Sensors
Authors: João Filipe Papel, Tatsuji Munaka
Abstract:
With society aging and the number of elderly people with dementia rising, researchers have been actively studying how to support the elderly in the early stages of dementia, with the objective of allowing them to have a better quality of life and as much independence as possible. To make this possible, most researchers in this field are using the Internet of Things to monitor the elderly's activities and assist them in performing them. The most common sensor used to monitor the elderly's activities is the camera sensor, due to its easy installation and configuration. The other commonly used sensor is the sound sensor. However, privacy needs to be considered when using these sensors. This research aims to develop a system capable of predicting the early stages of dementia based on monitoring and controlling the elderly's activities of daily living. To make this system possible, some issues need to be addressed. The first is elderly privacy when trying to detect their activities of daily living: privacy when detecting and monitoring activities of daily living is a serious concern, and one of the purposes of this research is to achieve this detection and monitoring without putting the privacy of the elderly at risk. To make this possible, the study focuses on an approach based on using magnet sensors to collect binary data. The second is to use the data collected by monitoring activities of daily living to predict the early stages of dementia. To make this possible, the research team proposes developing a proprietary ontology combined with both data-driven and knowledge-driven approaches.
Keywords: dementia, activity recognition, magnet sensors, ontology, data driven and knowledge driven, IoT, activities of daily living
Procedia PDF Downloads 108
22735 Three-Dimensional Finite Element Analysis of Geogrid-Reinforced Piled Embankments on Soft Clay
Authors: Mahmoud Y. Shokry, Rami M. El-Sherbiny
Abstract:
This paper aims to highlight the role of some parameters that may have a noticeable impact on the numerical analysis and design of embankments. It presents the results of a three-dimensional (3-D) finite element analysis of a monitored earth embankment that was constructed on a soft clay formation stabilized by cast-in-situ piles, using the software PLAXIS 3D. A comparison between the predicted and the monitored responses is presented to assess the adequacy of the adopted numerical model. The model was then used in the targeted parametric study. Moreover, a comparison was performed between the results of the 3-D analyses and analytical solutions. This paper concludes that the use of mono pile caps decreased both the total and differential settlement and increased the efficiency of the piled embankment system. The study of geogrids revealed that they can contribute to decreasing the settlement and to maximizing the part of the embankment load transferred to the piles. Moreover, it was found that increasing the stiffness of the geogrids provides higher tensile forces and hence has a more effective influence on the embankment load carried by the piles than using multiple layers of geogrid with low stiffness. The efficiency of the piled embankment system was also found to be greater for higher embankments than for low-height embankments. The comparison between the numerical 3-D model and the theoretical design methods revealed that many analytical solutions are conservative and less accurate than the 3-D finite element numerical models.
Keywords: efficiency, embankment, geogrids, soft clay
Procedia PDF Downloads 325
22734 Coping Strategies and Characterization of Vulnerability in the Perspective of Climate Change
Authors: Muhammad Umer Mehmood, Muhammad Luqman, Muhammad Yaseen, Imtiaz Hussain
Abstract:
Climate change is an arduous fact, which could not be unheeded easily. It is a phenomenon which has brought a collection of challenges for the mankind. Scientists have found many of its negative impacts on the life of human being and the resources on which the life of humanity is dependent. There are many issues which are associated with the factor of prime importance in this study, 'climate change'. Whenever changes happen in nature, they strike the whole globe. Effects of these changes vary from region to region. Climate of every region of this globe is different from the other. Even within a state, country or the province has different climatic conditions. So it is mandatory that the response in that specific region and the coping strategy of this specific region should be according to the prevailing risk. In the present study, the objective was to assess the coping strategies and vulnerability of small landholders. So that a professional suggestion could be made to cope with the vulnerability factor of small farmers. The cross-sectional research design was used with the intervention of quantitative approach. The study was conducted in the Khanewal district, of Punjab, Pakistan. 120 small farmers were interviewed after randomized sampling from the population of respective area. All respondents were above the age of 15 years. A questionnaire was developed after keen observation of facts in the respective area. Content and face validity of the instrument was assessed with SPSS and experts in the field. Data were analyzed through SPSS using descriptive statistics. From the sample of 120, 81.67% of the respondents claimed that the environment is getting warmer and not fit for their present agricultural practices. 84.17% of the sample expressed serious concern that they are disturbed due to change in rainfall pattern and vulnerability towards the climatic effects. On the other hand, they expressed that they are not good at tackling the effects of climate change. Adaptation of coping strategies like change in cropping pattern, use of resistant varieties, varieties with minimum water requirement, intercropping and tree planting was low by more than half of the sample. From the sample 63.33% small farmers said that the coping strategies they adopt are not effective enough. The present study showed that subsistence farming, lack of marketing and overall infrastructure, lack of access to social security networks, limited access to agriculture extension services, inappropriate access to agrometeorological system, unawareness and access to scientific development and low crop yield are the prominent factors which are responsible for the vulnerability of small farmers. A comprehensive study should be conducted at national level so that a national policy could be formulated to cope with the dilemma in future with relevance to climate change. Mainstreaming and collaboration among the researchers and academicians could prove beneficiary in this regard the interest of national leaders’ does matter. Proper policies to avoid the vulnerability factors should be the top priority. The world is taking up this issue with full responsibility as should we, keeping in view the local situation.Keywords: adaptation, coping strategies, climate change, Pakistan, small farmers, vulnerability
Procedia PDF Downloads 148
22733 Motivation and Multiglossia: Exploring the Diversity of Interests, Attitudes, and Engagement of Arabic Learners
Authors: Anna-Maria Ramezanzadeh
Abstract:
Demand for Arabic language is growing worldwide, driven by increased interest in the multifarious purposes the language serves, both for the population of heritage learners and those studying Arabic as a foreign language. The diglossic, or indeed multiglossic nature of the language as used in Arabic speaking communities however, is seldom represented in the content of classroom courses. This disjoint between the nature of provision and students’ expectations can severely impact their engagement with course material, and their motivation to either commence or continue learning the language. The nature of motivation and its relationship to multiglossia is sparsely explored in current literature on Arabic. The theoretical framework here proposed aims to address this gap by presenting a model and instruments for the measurement of Arabic learners’ motivation in relation to the multiple strands of the language. It adopts and develops the Second Language Motivation Self-System model (L2MSS), originally proposed by Zoltan Dörnyei, which measures motivation as the desire to reduce the discrepancy between leaners’ current and future self-concepts in terms of the second language (L2). The tripartite structure incorporates measures of the Current L2 Self, Future L2 Self (consisting of an Ideal L2 Self, and an Ought-To Self), and the L2 Learning Experience. The strength of the self-concepts is measured across three different domains of Arabic: Classical, Modern Standard and Colloquial. The focus on learners’ self-concepts allows for an exploration of the effect of multiple factors on motivation towards Arabic, including religion. The relationship between Islam and Arabic is often given as a prominent reason behind some students’ desire to learn the language. Exactly how and why this factor features in learners’ L2 self-concepts has not yet been explored. Specifically designed surveys and interview protocols are proposed to facilitate the exploration of these constructs. The L2 Learning Experience component of the model is operationalized as learners’ task-based engagement. Engagement is conceptualised as multi-dimensional and malleable. In this model, situation-specific measures of cognitive, behavioural, and affective components of engagement are collected via specially designed repeated post-task self-report surveys on Personal Digital Assistant over multiple Arabic lessons. Tasks are categorised according to language learning skill. Given the domain-specific uses of the different varieties of Arabic, the relationship between learners’ engagement with different types of tasks and their overall motivational profiles will be examined to determine the extent of the interaction between the two constructs. A framework for this data analysis is proposed and hypotheses discussed. The unique combination of situation-specific measures of engagement and a person-oriented approach to measuring motivation allows for a macro- and micro-analysis of the interaction between learners and the Arabic learning process. By combining cross-sectional and longitudinal elements with a mixed-methods design, the model proposed offers the potential for capturing a comprehensive and detailed picture of the motivation and engagement of Arabic learners. The application of this framework offers a number of numerous potential pedagogical and research implications which will also be discussed.Keywords: Arabic, diglossia, engagement, motivation, multiglossia, sociolinguistics
Procedia PDF Downloads 168
22732 Monitorization of Junction Temperature Using a Thermal-Test-Device
Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles
Abstract:
Due to the higher power loss levels in electronic components, the thermal design of the PCBs (printed circuit boards) of an assembled device becomes one of the most important quality factors in electronics. Since some of the leading causes of microelectronic component failure are higher temperatures, leakage, and thermo-mechanical stress, the reliability of microelectronic packages is a major concern. This article presents an experimental approach to measuring the junction temperature of exposed-pad packages. The implemented solution, at a prototype stage, uses a temperature-sensitive parameter (TSP) to measure the temperature directly on the die, validating the numerical results provided by Mechanical APDL (ANSYS Parametric Design Language) under the same conditions. The physical device under test is composed of a Thermal Test Chip (TTC-1002) assembled in a QFN cavity and soldered to a test board according to JEDEC standards. Monitoring the voltage drop across a forward-biased diode is an indirect but accurate method to obtain the junction temperature of the QFN component over an applied power range of 0.3 W to 1.5 W. The temperature distributions on the PCB test board and the QFN cavity surface were monitored by an infrared thermal camera (Goby-384), controlled and with its images processed by the Xeneth software. The article provides a set-up to monitor in real time the junction temperature of ICs, namely devices with an exposed-pad package (i.e., QFN). It presents the PCB layout parameters that the designer should use to improve thermal performance and evaluates the impact of voids in the solder interface on the device junction temperature.
Keywords: quad flat no-lead packages, exposed pads, junction temperature, thermal management and measurements
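A small sketch of the temperature-sensitive-parameter (TSP) idea outlined above: the forward voltage of the on-die diode drops roughly linearly with junction temperature, so a calibrated K-factor converts the measured voltage to Tj. The calibration numbers below are assumed illustrative values, not TTC-1002 datasheet data.

```python
def junction_temperature(v_measured, v_ref, t_ref, k_factor):
    """Tj from the diode forward voltage; k_factor is the slope in V/°C (negative)."""
    return t_ref + (v_measured - v_ref) / k_factor

k_factor = -2.0e-3      # V/°C, typical order of magnitude for a silicon diode (assumed)
v_ref = 0.650           # forward voltage at the reference temperature [V] (assumed)
t_ref = 25.0            # reference (ambient) temperature [°C]

# hypothetical measurements across the 0.3 W - 1.5 W power range
for power_w, v_meas in [(0.3, 0.642), (0.9, 0.618), (1.5, 0.590)]:
    tj = junction_temperature(v_meas, v_ref, t_ref, k_factor)
    print(f"P = {power_w:.1f} W  Vf = {v_meas:.3f} V  ->  Tj ~ {tj:.1f} °C")
```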
Procedia PDF Downloads 288
22731 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data
Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone
Abstract:
This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic elements that are contributing to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularization methods such as ridge, lasso, and elastic net penalties were employed. Cross-validation was performed to acquire the tuning parameters. The proposed methods can automatically identify the covariates relevant to disease counts. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease dataset, the study successfully identified key factors, and the results were consistent with previous studies.
Keywords: lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression
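An illustrative sketch of the modelling idea: a Poisson regression on candidate covariates with a cross-validated penalty to keep only relevant variables. scikit-learn's PoissonRegressor carries an L2 (ridge-like) penalty; the paper additionally used lasso and elastic-net penalties, which are not shown here. The data are synthetic, not the Virginia Department of Health records.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n, p = 133, 8                                   # e.g. localities x candidate covariates (assumed)
X = rng.normal(size=(n, p))
true_beta = np.array([0.6, 0.0, -0.4, 0.0, 0.3, 0.0, 0.0, 0.0])   # sparse "truth"
y = rng.poisson(np.exp(1.0 + X @ true_beta))    # simulated case counts

X_std = StandardScaler().fit_transform(X)

# cross-validation over the penalty strength (the "tuning parameter")
search = GridSearchCV(PoissonRegressor(max_iter=1000),
                      param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},
                      cv=5)
search.fit(X_std, y)

print("selected alpha:", search.best_params_["alpha"])
print("coefficients  :", np.round(search.best_estimator_.coef_, 3))
```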
Procedia PDF Downloads 144
22730 Graph-Based Semantical Extractive Text Analysis
Authors: Mina Samizadeh
Abstract:
In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous data necessitates adopting effective computational tools to explore it, which has led to intense and growing interest in the research community in developing computational methods for processing this text data. One line of study focuses on condensing text so that we are able to reach a higher level of understanding in a shorter time. The two important tasks for doing this are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key, important words in a text; this makes us familiar with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised learning method that is an extension of PageRank (the algorithm underlying the Google search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and return them as the result. However, it neglects the semantic similarity between the different parts. In this work, we improved the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used individually or as part of generating the summary to overcome coverage problems.
Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis
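A compact sketch of graph-based keyword extraction in the TextRank style: build a co-occurrence graph of candidate words and rank them with PageRank. The semantic-similarity weighting proposed in the abstract is only hinted at through the edge weights; a full implementation would derive those weights from word embeddings rather than raw co-occurrence counts.

```python
import itertools
import networkx as nx

text = ("graph based ranking algorithms extract keywords by building a graph "
        "of words and ranking the words with a random walk over the graph")
stopwords = {"a", "of", "by", "and", "the", "with", "over"}
words = [w for w in text.split() if w not in stopwords]

graph = nx.Graph()
window = 3                                       # co-occurrence window size
for i in range(len(words)):
    for u, v in itertools.combinations(words[i:i + window], 2):
        if u == v:
            continue
        # edge weight: co-occurrence count; could be replaced by a semantic similarity score
        if graph.has_edge(u, v):
            graph[u][v]["weight"] += 1.0
        else:
            graph.add_edge(u, v, weight=1.0)

scores = nx.pagerank(graph, weight="weight")     # TextRank = PageRank on the word graph
top = sorted(scores.items(), key=lambda kv: -kv[1])[:5]
print("top keywords:", top)
```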
Procedia PDF Downloads 76
22729 Optimal Retrofit Design of Reinforced Concrete Frame with Infill Wall Using Fiber Reinforced Plastic Materials
Authors: Sang Wook Park, Se Woon Choi, Yousok Kim, Byung Kwan Oh, Hyo Seon Park
Abstract:
Various retrofit techniques for reinforced concrete frames with infill walls have been steadily developed. Among those techniques, the strengthening methodology based on diagonal FRP strips (FRP bracings) has numerous advantages, such as the feasibility of implementation without interrupting a building under operation, reduction of cost and time, and easy application. Considering the safety of the structure and the retrofit cost, the most appropriate retrofit solution is needed. Thus, the objective of this study is to suggest Pareto-optimal retrofit solutions for an existing building using FRP bracings. To find the Pareto-optimal solutions, NSGA-II is applied. Moreover, the seismic performance of the retrofitted building is evaluated. The example building is a 5-storey, 3-bay RC frame with infill walls. Nonlinear static pushover analyses are performed according to FEMA 356. The criterion for the performance evaluation is the inter-story drift ratio at the performance levels IO, LS, and CP. Optimal retrofit solutions are obtained with 32 individuals and 200 generations. Through the proposed optimal solutions, we confirm the improvement of the seismic performance of the example building.
Keywords: retrofit, FRP bracings, reinforced concrete frame with infill wall, seismic performance evaluation, NSGA-II
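A small sketch of the Pareto-optimality notion behind NSGA-II as used here: each candidate retrofit is scored on two conflicting objectives (retrofit cost and maximum inter-story drift ratio), and the non-dominated set is extracted. The candidate values are invented; the study itself couples NSGA-II with pushover analyses rather than a fixed table of candidates.

```python
def dominates(a, b):
    """True if solution a is no worse than b in both objectives and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other is not s)]

# (retrofit cost [arbitrary units], max inter-story drift ratio [%]) - hypothetical candidates
candidates = [(10.0, 1.90), (14.0, 1.40), (18.0, 1.10), (22.0, 1.05),
              (16.0, 1.60), (12.0, 1.75), (20.0, 1.30)]

for cost, drift in sorted(pareto_front(candidates)):
    print(f"cost = {cost:5.1f}, drift = {drift:.2f} %")
```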
Procedia PDF Downloads 440
22728 Research Action Fields at the Nexus of Digital Transformation and Supply Chain Management: Findings from Practitioner Focus Group Workshops
Authors: Brandtner Patrick, Staberhofer Franz
Abstract:
Logistics and Supply Chain Management are of crucial importance for organisational success. In the era of Digitalization, several implications and improvement potentials for these domains arise, which at the same time could lead to decreased competitiveness and could endanger long-term company success if ignored or neglected. However, empirical research on the issue of Digitalization and benefits purported to it by practitioners is scarce and mainly focused on single technologies or separate, isolated Supply Chain blocks as e.g. distribution logistics or procurement only. The current paper applies a holistic focus group approach to elaborate practitioner use cases at the nexus of the concepts of Supply Chain Management (SCM) and Digitalization. In the course of three focus group workshops with over 45 participants from more than 20 organisations, a comprehensive set of benefit entitlements and areas for improvement in terms of applying digitalization to SCM is developed. The main results of the paper indicate the relevance of Digitalization being realized in practice. In the form of seventeen concrete research action fields, the benefit entitlements are aggregated and transformed into potential starting points for future research projects in this area. The main contribution of this paper is an empirically grounded basis for future research projects and an overview of actual research action fields from practitioners’ point of view.Keywords: digital supply chain, digital transformation, supply chain management, value networks
Procedia PDF Downloads 185
22727 Development of Star Image Simulator for Star Tracker Algorithm Validation
Authors: Zoubida Mahi
Abstract:
A successful satellite mission in space requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate sensor compared to the other sensors, and it is able to offer high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processes are done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool that is used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and several scenarios that exist in space, such as the presence of the moon, optical system problems, illumination, and false objects. In addition, we present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
Keywords: star tracker, star simulation, star detection, centroid, noise, scenario
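A sketch of the two pieces described in the abstract: synthesizing a noisy star image (Gaussian point-spread function plus Poisson and background noise) and recovering the star position with an intensity-weighted centroid. Sensor size, PSF width, and noise levels are arbitrary assumptions, and the paper's own centroid method is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
size = 64
true_x, true_y = 30.4, 22.7            # sub-pixel star position (assumed)
sigma_psf = 1.5                        # PSF width in pixels (assumed)
yy, xx = np.mgrid[0:size, 0:size]

# ideal star image: Gaussian PSF on a uniform background
star = 2000.0 * np.exp(-((xx - true_x) ** 2 + (yy - true_y) ** 2) / (2 * sigma_psf ** 2))
image = rng.poisson(star + 20.0).astype(float)          # shot (Poisson) noise + background
image += rng.normal(0.0, 3.0, image.shape)              # readout (Gaussian) noise

# simple centroid: remove the background level, then take the intensity-weighted mean
work = image - np.median(image)
work[work < 0] = 0.0
cx = (work * xx).sum() / work.sum()
cy = (work * yy).sum() / work.sum()

print(f"true position      : ({true_x:.2f}, {true_y:.2f})")
print(f"estimated centroid : ({cx:.2f}, {cy:.2f})")
```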
Procedia PDF Downloads 98
22726 One of the Missing Pieces of Inclusive Education: Sexual Orientations
Authors: Sıla Uzkul
Abstract:
As a requirement of human rights and children's rights, the basic condition of inclusive education is that it covers all children. However, the reforms made in the context of education in Turkey and around the world include a limited level of inclusiveness. Generally, the inclusiveness mentioned is for individuals who need special education. Educational reforms superficially state that differences are tolerated, but these differences are extremely limited and often do not include sexual orientation. When we look at the education modules of the Ministry of National Education within the scope of inclusive education in Turkey, there are children with special needs, bilingual children, children exposed to violence, children under temporary protection, children affected by migration and terrorism, and children affected by natural disasters. No training modules or inclusion terms regarding sexual orientations could be found. This research aimed to understand the perspectives of research assistants working in the preschool education department regarding sexual orientations within the scope of inclusive education. Six research assistants working in the preschool teaching department at a public university in Ankara (Turkey) participated in this qualitative research study. Participants were determined by typical case sampling, which is one of the purposeful sampling methods. The data of this research was obtained through a "survey consisting of open-ended questions". Raw data from the surveys were analyzed and interpreted using the "content analysis technique" (Yıldırım & Şimşek, 2005). During the data analysis process, the data from the participants were first numbered, then all the data were read, and content analysis was performed, and possible themes, categories, and codes were extracted. The opinions of the participants in the research regarding sexual orientations in inclusive education are presented under three main headings within the scope of the research questions. These are: (a) their views on inclusive education, (b) their views on sexual orientations (c) their views on sexual orientations in the preschool period.Keywords: sexual orientation, inclusive education, child rights, preschool education
Procedia PDF Downloads 67
22725 The Role of Natural Gas in Reducing Carbon Emissions
Authors: Abdulrahman Nami Almutairi
Abstract:
In the face of escalating climate change concerns, the concept of smart cities emerges as a promising approach to mitigate carbon emissions and move towards carbon neutrality. This paper provides a comprehensive review of the role of Natural Gas in achieving carbon neutrality. Natural gas has often been seen as a transitional fuel in the context of reducing carbon emissions. Its main role stems from being cleaner than coal and oil when burned for electricity generation and industrial processes. The urgent need to address this global issue has prompted a global shift towards cleaner energy sources and sustainable practices. In this endeavor, natural gas has emerged as a pivotal player, hailed for its potential to mitigate carbon emissions, and facilitate the transition to a low-carbon economy. With its lower carbon intensity compared to conventional fossil fuels, natural gas presents itself as a promising alternative for meeting energy demands while reducing environmental impact. As the world stands at a critical juncture in the fight against climate change, exploring the potential of natural gas as a transitional fuel offers insights into pathways towards a more sustainable and resilient future. By critically evaluating its opportunities and challenges, we can harness the potential of natural gas as a transitional fuel while advancing towards a cleaner, more resilient energy system. Through collaborative efforts and informed decision-making, we can pave the way for a future where energy is not only abundant but also environmentally sustainable and socially equitable.Keywords: natural gas, clean fuel, carbon emissions, global warming, environmental protection
Procedia PDF Downloads 50
22724 Evaluation of 18F Fluorodeoxyglucose Positron Emission Tomography, MRI, and Ultrasound in the Assessment of Axillary Lymph Node Metastases in Patients with Early Stage Breast Cancer
Authors: Wooseok Byon, Eunyoung Kim, Junseong Kwon, Byung Joo Song, Chan Heun Park
Abstract:
Purpose: 18F Fluorodeoxyglucose Positron Emission Tomography (FDG-PET) is a noninvasive imaging modality that can identify nodal metastases in women with primary breast cancer. The aim of this study was to compare the accuracy of FDG-PET with MRI and sonography scanning to determine axillary lymph node status in patients with breast cancer undergoing sentinel lymph node biopsy or axillary lymph node dissection. Patients and Methods: Between January and December 2012, ninety-nine patients with breast cancer and clinically negative axillary nodes were evaluated. All patients underwent FDG-PET, MRI, ultrasound followed by sentinel lymph node biopsy (SLNB) or axillary lymph node dissection (ALND). Results: Using axillary lymph node assessment as the gold standard, the sensitivity and specificity of FDG-PET were 51.4% (95% CI, 41.3% to 65.6%) and 92.2% (95% CI, 82.7% to 97.4%) respectively. The sensitivity and specificity of MRI and ultrasound were 57.1% (95% CI, 39.4% to 73.7%), 67.2% (95% CI, 54.3% to 78.4%) and 42.86% (95% CI, 26.3% to 60.7%), 92.2% (95% CI, 82.7% to 97.4%). Stratification according to hormone receptor status showed an increase in specificity when negative (FDG-PET: 42.3% to 77.8%, MRI 50% to 77.8%, ultrasound 34.6% to 66.7%). Also, positive HER2 status was associated with an increase in specificity (FDG-PET: 42.9% to 85.7%, MRI 50% to 85.7%, ultrasound 35.7% to 71.4%). Conclusions: The sensitivity and specificity of FDG-PET compared with MRI and ultrasound was high. However, FDG-PET is not sufficiently accurate to appropriately identify lymph node metastases. This study suggests that FDG-PET scanning cannot replace histologic staging in early-stage breast cancer, but might have a role in evaluating axillary lymph node status in hormone receptor negative or HER-2 overexpressing subtypes.Keywords: axillary lymph node metastasis, FDG-PET, MRI, ultrasound
Procedia PDF Downloads 377
22723 Comparison of Elastic and Viscoelastic Modeling for Asphalt Concrete Surface Layer
Authors: Fouzieh Rouzmehr, Mehdi Mousavi
Abstract:
Hot mix asphalt concrete (HMAC) is a mixture of aggregates and bitumen. The primary ingredient that determines the mechanical properties of HMAC is the bitumen, which displays viscoelastic behavior under normal service conditions. For simplicity, asphalt concrete is often considered an elastic material, but this is far from reality at high service temperatures and longer loading times. Viscoelasticity means that the material's stress-strain relationship depends on the strain rate and loading duration. The goal of this paper is to simulate the mechanical response of flexible pavements using linear elastic and viscoelastic modeling of asphalt concrete and to predict pavement performance. A Falling Weight Deflectometer (FWD) load is simulated, and the results for elastic and viscoelastic modeling are evaluated. The viscoelastic behavior is represented by a Prony series and modeled using the ANSYS software. In flexible pavement design, the tensile strain at the bottom of the surface layer and the compressive strain at the top of the last layer play an important role in the structural response of the pavement, and they determine the allowable number of load repetitions for fatigue (Nf) and rutting (Nd), respectively. The differences between the two models are investigated for fatigue cracking and rutting, the two main design parameters in flexible pavement design. Although the differences between the two models were negligible for rutting, for fatigue cracking the viscoelastic model results were more accurate. The results indicate that modeling the flexible pavement with an elastic material is efficient enough and gives acceptable results.
Keywords: flexible pavement, asphalt, FEM, viscoelastic, elastic, ANSYS, modeling
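A brief sketch of the Prony-series representation used for the viscoelastic asphalt layer, E(t) = E_inf + Σ E_i·exp(-t/τ_i); the coefficients below are illustrative assumptions, not the calibrated ANSYS input.

```python
import numpy as np

E_inf = 1200.0                                # long-term modulus [MPa] (assumed)
E_i   = np.array([9000.0, 5000.0, 2500.0])    # Prony moduli [MPa] (assumed)
tau_i = np.array([0.01, 0.1, 1.0])            # relaxation times [s] (assumed)

def relaxation_modulus(t):
    """Relaxation modulus E(t) from the Prony series."""
    t = np.atleast_1d(t)[:, None]
    return E_inf + np.sum(E_i * np.exp(-t / tau_i), axis=1)

# from a fast FWD-like pulse to slow, sustained loading
for t in (0.001, 0.01, 0.1, 1.0, 10.0):
    print(f"t = {t:6.3f} s  ->  E(t) ~ {relaxation_modulus(t)[0]:8.1f} MPa")
```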
Procedia PDF Downloads 134