Search results for: maternity care model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19620

12240 Integrating Virtual Reality and Building Information Model-Based Quantity Takeoffs for Supporting Construction Management

Authors: Chin-Yu Lin, Kun-Chi Wang, Shih-Hsu Wang, Wei-Chih Wang

Abstract:

A construction superintendent needs to know not only the quantities of cost items or materials completed each day, in order to prepare a daily report or calculate the daily progress (earned value), but also the quantities of materials (e.g., reinforcing steel and concrete) to be ordered (or moved onto the jobsite) for the in-progress or ready-to-start construction activities (e.g., erection of reinforcing steel and concrete pouring). These daily construction management tasks require great effort to extract accurate quantities in a short time (usually right before getting off work each day). As a result, most superintendents can only provide these quantity data based either on what they see on site (low accuracy) or on the extraction of quantities from two-dimensional (2D) construction drawings (high time consumption). Hence, the current practice of reporting the quantities completed each day needs to become more accurate and more efficient. Recently, three-dimensional (3D) building information modeling (BIM) has been widely applied to support the construction quantity takeoff (QTO) process, and virtual reality (VR) allows a building to be viewed from a first-person viewpoint. This study therefore proposes an innovative system that integrates VR (using 'Unity') and BIM (using 'Revit') to extract quantities in support of the daily construction management tasks above. VR allows a system user to be present in a virtual building and assess the construction progress more objectively from the office. The VR- and BIM-based system is also supported by an integrated database (containing the information and data associated with the BIM model, QTO, and costs). Each day, a superintendent can walk through the BIM-based virtual building to quickly identify (via a developed VR shooting function) the building components (or objects) that are in progress or finished on the jobsite, and then specify a percentage of completion (e.g., 20%, 50%, or 100%) for each identified object based on observation of the jobsite. The system then generates the quantities completed that day by multiplying the specified percentage by the full quantities of the cost items (or materials) associated with the identified object. A building construction project located in northern Taiwan is used as a case study to test the benefits (i.e., accuracy and efficiency) of the proposed system in quantity extraction for supporting the preparation of daily reports and the ordering of construction materials.
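
As a rough illustration of the daily quantity computation described above, the sketch below multiplies a superintendent-assigned completion percentage by the full QTO quantities stored per BIM object; all object names, cost items, and figures are hypothetical placeholders, not data from the study.

```python
# minimal sketch of the daily quantity computation: the full QTO quantities per
# BIM object would come from the integrated database (names and values are made up)
full_quantities = {
    "column_C12": {"concrete_m3": 3.2, "rebar_kg": 410.0},
    "slab_S03":   {"concrete_m3": 18.5, "rebar_kg": 1650.0},
}
# completion percentages the superintendent assigns after the VR walkthrough
completion = {"column_C12": 1.00, "slab_S03": 0.50}

daily_report = {}
for obj, pct in completion.items():
    for item, qty in full_quantities[obj].items():
        daily_report[item] = daily_report.get(item, 0.0) + pct * qty
print(daily_report)   # completed concrete and rebar quantities for the day
```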

Keywords: building information model, construction management, quantity takeoffs, virtual reality

Procedia PDF Downloads 121
12239 Parameterized Lyapunov Function Based Robust Diagonal Dominance Pre-Compensator Design for Linear Parameter Varying Model

Authors: Xiaobao Han, Huacong Li, Jia Li

Abstract:

For the dynamic decoupling of a linear parameter varying (LPV) system, a robust diagonal dominance pre-compensator design method is given. The parameterized pre-compensator design problem is converted into an optimization problem constrained by parameterized linear matrix inequalities (PLMI). To solve this problem, the optimization problem is first transformed equivalently into a new form that eliminates the coupling between the parameterized Lyapunov function (PLF) and the pre-compensator. The problem is then reduced to a standard convex optimization problem with ordinary linear matrix inequality (LMI) constraints on a newly constructed convex polyhedron. Moreover, a parameter-scheduled pre-compensator is obtained, which satisfies both robust performance and decoupling performance. Finally, the feasibility and validity of the robust diagonal dominance pre-compensator design method are verified by numerical simulation of a turbofan engine PLPV model.
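
As a minimal sketch of the kind of LMI feasibility problem that such a design is ultimately reduced to (not the parameterized pre-compensator synthesis itself), the snippet below solves a standard Lyapunov LMI with CVXPY; the test matrix and tolerance are assumptions for illustration only.

```python
import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # assumed stable test matrix, not from the paper
n = A.shape[0]
eps = 1e-6

# LMI feasibility: find P = P' > 0 such that A'P + PA < 0
P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status, P.value)   # a feasible Lyapunov matrix certifying stability
```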

Keywords: linear parameter varying (LPV), parameterized Lyapunov function (PLF), linear matrix inequalities (LMI), diagonal dominance pre-compensator

Procedia PDF Downloads 388
12238 Implementation of a Culturally Responsive Home Visiting Framework in Head Start Teacher Professional Development

Authors: Meilan Jin, Mary Jane Moran

Abstract:

This study introduces a culturally responsive home visiting (CRHV) framework into Head Start teacher professional development sessions in the southeastern US and investigates its influence on teachers' evolving beliefs about their roles and relationships with families during home visits. The framework orients teachers toward taking on the role of learner, listening for spoken and unspoken needs, and looking for family strengths. In addition, it challenges the deficit model grounded in 'cultural deprivation,' stresses the value of family cultures, and advocates equal, collaborative parent-teacher relationships. Home visit reflection papers and focus group transcriptions from eight teachers were collected from 2010 onward during a five-year longitudinal collaboration. Reflection papers were written by the teachers before and after the CRHV framework was introduced and included details of visit purposes and actions as well as plans for later home visits. In particular, the CRHV framework guided the teachers to listen and look for information about family living environments; parent-child interactions; child-rearing practices; and parental beliefs, values, and needs. Two focus groups were organized in 2014 by asking the teachers to read their written reflection papers and then discuss their shared beliefs and experiences of home visits in recent years. The average length of the discussions was one hour, and the discussions were audio-recorded and transcribed verbatim. The data were analyzed using constant comparative analysis, and the analysis was verified through (a) the use of multiple data sources, (b) the involvement of multiple researchers, (c) coding checks, and (d) thick descriptions of the findings. The findings show that the teachers came to reposition themselves as 'knowledge seekers,' reorienting their focus toward 'setting stones' to learn, grow, and change rather than merely framing their home visits. The teachers also continually engaged in careful listening, observing, questioning, and dialoguing, and these actions reflect their care toward parents. The value of teamwork with parents is advocated, and the teachers recognized that when parents feel empowered, they are active and committed to doing more for their children, which can further support proactive, long-term parent-teacher collaboration. The findings also indicate that the framework helps educators provide home visiting experiences that are culturally responsive and build collaborative relationships with caregivers. The long-term impact of the framework further implies that teachers continue to place themselves in a position of evolving, in both beliefs and actions, to better work with children and families who are culturally, ethnically, and linguistically different from them. This framework can be applied by educators and professionals who are looking for avenues to bridge the relationship between home and school and between parents and teachers.

Keywords: culturally responsive home visit, early childhood education, parent–teacher collaboration, teacher professional development

Procedia PDF Downloads 84
12237 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it gathers more and more computing power by increasing the number of processor cores without requiring the program to be modified. Meanwhile, in scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve their performance. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched by traditional algorithms.
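
To make the decomposition idea above concrete, here is a small NumPy sketch (not the authors' implementation): complex objects are represented by sets of simple axis-aligned bounding boxes, and the pairwise overlap test is written as whole-array operations so that the same instruction stream processes many data elements at once.

```python
import numpy as np

# each complex object is decomposed into simple axis-aligned boxes (AABBs);
# boxes are stored as [xmin, ymin, zmin, xmax, ymax, zmax]
rng = np.random.default_rng(0)
boxes_a = rng.random((1000, 3))
boxes_a = np.hstack([boxes_a, boxes_a + 0.05])
boxes_b = rng.random((1000, 3))
boxes_b = np.hstack([boxes_b, boxes_b + 0.05])

# data-parallel (SIMD-style) pairwise overlap test: every comparison is expressed
# as a whole-array operation applied to many box pairs at once
mins_a, maxs_a = boxes_a[:, None, :3], boxes_a[:, None, 3:]
mins_b, maxs_b = boxes_b[None, :, :3], boxes_b[None, :, 3:]
overlap = np.all((mins_a <= maxs_b) & (mins_b <= maxs_a), axis=-1)
print("colliding simple-object pairs:", int(overlap.sum()))
```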

Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability

Procedia PDF Downloads 275
12236 Ground Surface Temperature History Prediction Using Long-Short Term Memory Neural Network Architecture

Authors: Venkat S. Somayajula

Abstract:

Ground surface temperature history prediction models play a vital role in setting standards for international nuclear waste management. International standards for borehole-based nuclear waste disposal require paleoclimate cycle predictions on the scale of a million years into the future for the disposal site. This research focuses on developing a paleoclimate cycle prediction model using a Bayesian long short-term memory (LSTM) neural architecture operating on accumulated borehole temperature history data. Bayesian models have previously been used for paleoclimate cycle prediction based on Monte Carlo weighting, but owing to limitations in coupling them with certain other prediction networks, such models could not accommodate prediction cycles over 1,000 years. LSTM provides a way to couple the developed models with other prediction networks with ease. The paleoclimate cycle model developed through this process will be trained on existing borehole data and then coupled to surface temperature history prediction networks, which provide endpoints for backpropagation through the LSTM network and optimize the prediction cycle for larger time scales. The trained LSTM will be tested on past data for validation and then propagated forward to predict temperatures at borehole locations. This research will benefit studies pertaining to nuclear waste management, anthropological cycle prediction, and geophysical features.
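
As a minimal starting point for the sequence-prediction component described above, the sketch below trains a plain (non-Bayesian) PyTorch LSTM to predict the next value of a synthetic cyclic temperature series; the network size, training settings, and stand-in data are illustrative assumptions, not the study's configuration.

```python
import math
import torch
import torch.nn as nn

# synthetic cyclic "temperature" sequences standing in for borehole-derived histories
T, N = 200, 64
t = torch.linspace(0, 8 * math.pi, T)
series = torch.sin(t) + 0.1 * torch.randn(N, T)
x = series[:, :-1].unsqueeze(-1)   # input: temperature at step k
y = series[:, 1:].unsqueeze(-1)    # target: temperature at step k+1

class TempLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out)

model = TempLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(200):                 # simple full-batch training loop
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print(float(loss))
```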

Keywords: Bayesian long-short term memory neural network, borehole temperature, ground surface temperature history, paleoclimate cycle

Procedia PDF Downloads 117
12235 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection

Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón

Abstract:

Structural inspection activities are necessary to ensure the correct functioning of infrastructure. Unmanned aerial vehicle (UAV) techniques have become more popular than traditional techniques; in particular, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors on UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution, and the direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves processing visible Red-Green-Blue (RGB) and thermal images in parallel. The RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.
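
One simple way to realise the projection step described above is a nearest-neighbour lookup from the RGB-derived geometry to the thermal samples; the sketch below does this with SciPy, using randomly generated stand-in point clouds and an assumed 1 m matching threshold rather than the authors' actual workflow.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
# hypothetical inputs: dense RGB-photogrammetry point cloud and a sparser thermal point set
rgb_points = rng.random((10000, 3)) * 50.0          # x, y, z from the RGB model (m)
thermal_points = rng.random((2000, 3)) * 50.0       # x, y, z where temperature was sampled
thermal_temps = 20.0 + 15.0 * rng.random(2000)      # surface temperature (deg C)

# project temperatures onto the geometric model by nearest-neighbour lookup
tree = cKDTree(thermal_points)
dist, idx = tree.query(rgb_points, k=1)
projected = thermal_temps[idx]
projected[dist > 1.0] = np.nan   # leave points with no nearby thermal sample unlabelled
print(np.nanmean(projected))
```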

Keywords: aerial thermography, data processing, drone, low-cost, point cloud

Procedia PDF Downloads 129
12234 Topology and Shape Optimization of MacPherson Control Arm under Fatigue Loading

Authors: Abolfazl Hosseinpour, Javad Marzbanrad

Abstract:

In this research, topology and shape optimization of a MacPherson control arm was carried out to achieve a lighter weight. The present automotive market demands low-cost, lightweight components to meet the need for fuel-efficient and cost-effective vehicles. This in turn gives rise to more effective use of materials for automotive parts, which can reduce vehicle mass. Since automotive components are subject to dynamic loads that cause fatigue damage, considering fatigue criteria is essential when designing them. First, in order to create severe loading conditions for the control arm, rough roads are generated from a power spectral density. Then, the most critical loading conditions are obtained through multibody dynamics analysis of a full vehicle model. Topology optimization is then performed based on a fatigue life criterion using HyperMesh software, which resulted in a 50 percent mass reduction. In the next step, a CAD model is created using CATIA software, and shape optimization is performed to achieve accurate dimensions with less mass.
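
The rough-road excitation mentioned above is commonly synthesised from a displacement power spectral density by superposing cosine components with random phases; the NumPy sketch below illustrates this under assumed ISO 8608-style parameters (road class, frequency range, and lengths are placeholders, not the values used in the study).

```python
import numpy as np

rng = np.random.default_rng(0)
L, dx = 200.0, 0.05                        # road length (m) and spatial step (m), assumed
x = np.arange(0.0, L, dx)
n0, Gd_n0 = 0.1, 64e-6                     # reference spatial frequency (cycles/m) and PSD (m^3), roughly ISO 8608 class B
n = np.arange(0.01, 10.0, 0.01)            # spatial frequencies (cycles/m)
dn = n[1] - n[0]
Gd = Gd_n0 * (n / n0) ** (-2.0)            # displacement PSD

# spectral synthesis: sum of cosines with amplitudes from the PSD and random phases
phi = rng.uniform(0.0, 2.0 * np.pi, n.size)
amp = np.sqrt(2.0 * Gd * dn)
profile = (amp[None, :] * np.cos(2.0 * np.pi * x[:, None] * n[None, :] + phi[None, :])).sum(axis=1)
print(profile.std())                       # RMS road elevation of the generated profile (m)
```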

Keywords: topology optimization, shape optimization, fatigue life, MacPherson control arm

Procedia PDF Downloads 305
12233 Advancements in Arthroscopic Surgery Techniques for Anterior Cruciate Ligament (ACL) Reconstruction

Authors: Islam Sherif, Ahmed Ashour, Ahmed Hassan, Hatem Osman

Abstract:

Anterior Cruciate Ligament (ACL) injuries are common among athletes and individuals participating in sports with sudden stops, pivots, and changes in direction. Arthroscopic surgery is the gold standard for ACL reconstruction, aiming to restore knee stability and function. Recent years have witnessed significant advancements in arthroscopic surgery techniques, graft materials, and technological innovations, revolutionizing the field of ACL reconstruction. This presentation delves into the latest advancements in arthroscopic surgery techniques for ACL reconstruction and their potential impact on patient outcomes. Traditionally, autografts from the patellar tendon, hamstring tendon, or quadriceps tendon have been commonly used for ACL reconstruction. However, recent studies have explored the use of allografts, synthetic scaffolds, and tissue-engineered grafts as viable alternatives. This abstract evaluates the benefits and potential drawbacks of each graft type, considering factors such as graft incorporation, strength, and risk of graft failure. Moreover, the application of augmented reality (AR) and virtual reality (VR) technologies in surgical planning and intraoperative navigation has gained traction. AR and VR platforms provide surgeons with detailed 3D anatomical reconstructions of the knee joint, enhancing preoperative visualization and aiding in graft tunnel placement during surgery. We discuss the integration of AR and VR in arthroscopic ACL reconstruction procedures, evaluating their accuracy, cost-effectiveness, and overall impact on surgical outcomes. Beyond graft selection and surgical navigation, patient-specific planning has gained attention in recent research. Advanced imaging techniques, such as MRI-based personalized planning, enable surgeons to tailor ACL reconstruction procedures to each patient's unique anatomy. By accounting for individual variations in the femoral and tibial insertion sites, this personalized approach aims to optimize graft placement and potentially improve postoperative knee kinematics and stability. Furthermore, rehabilitation and postoperative care play a crucial role in the success of ACL reconstruction. This abstract explores novel rehabilitation protocols, emphasizing early mobilization, neuromuscular training, and accelerated recovery strategies. Integrating technology, such as wearable sensors and mobile applications, into postoperative care can facilitate remote monitoring and timely intervention, contributing to enhanced rehabilitation outcomes. In conclusion, this presentation provides an overview of the cutting-edge advancements in arthroscopic surgery techniques for ACL reconstruction. By embracing innovative graft materials, augmented reality, patient-specific planning, and technology-driven rehabilitation, orthopedic surgeons and sports medicine specialists can achieve superior outcomes in ACL injury management. These developments hold great promise for improving the functional outcomes and long-term success rates of ACL reconstruction, benefitting athletes and patients alike.

Keywords: arthroscopic surgery, ACL, autograft, allograft, graft materials, ACL reconstruction, synthetic scaffolds, tissue-engineered graft, virtual reality, augmented reality, surgical planning, intra-operative navigation

Procedia PDF Downloads 78
12232 Young Carers’ Dilemma: Family Responsibility, Bonding and Commitment to Supporting Their Mentally Ill Parent in Taiwan

Authors: Esabella Yuan

Abstract:

This study explored the recollections of young carers in Taiwan who lived with and cared for a mentally ill parent and how they managed life difficulties. Nineteen former young carers took part in the study, conducted from July to October 2021. The findings offer a distinctive picture: all the participants acknowledged being taught by the mainstream culture to honour family values and to prioritise the needs of parents over their own; they stepped in to care for the ill parent out of love and out of necessity, as there was no one else to turn to; and they were willing to assume long-term caring responsibilities. Strikingly, a much more common experience was that the participants hid the parental illness and their young carer identity from the community for fear of the social discrimination attached to mental illness. As a result, these former young carers remained hidden and coped alone with caring challenges. The findings suggest that multi-disciplinary services need to work together to recognise the needs of young carers, provide appropriate family-focused interventions, and ensure that the best interests of young carers and their families are served. It is hoped that young carers can grow up safely and healthily within the community.

Keywords: young carers, family well-being, mental health, parental mental illness

Procedia PDF Downloads 74
12231 Gig-Work in the Midst of the COVID-19 Pandemic

Authors: Audie Daniel Wood

Abstract:

In the spring of 2020, the country and the economy came to a halt due to an outbreak of the novel coronavirus SARS-CoV-2, which causes COVID-19. One of the hardest hit sectors of the economy was the gig sector, which includes Lyft, Uber, DoorDash, and other services. In this study, we examined the effects of the independent contractor status of laborers in this field to see how a near-complete economic shutdown affected the lives of workers who are denied access to health care and unemployment benefits because of that status. The study found no 'life-altering' change in the lives of workers who used gig work as supplementary income during the economic shutdown, but those who relied on Lyft, Uber, and similar platforms as their sole source of income were more heavily impacted than part-time workers. The second significant finding was that, across all genders and races, none of the workers wanted to seek unemployment benefits or other help; they all felt that unemployment and social insurance were for those who could not work. While the findings are not generalizable, because this was a small qualitative study of 27 participants, they suggest that the economic and social impact of COVID-19 on those who work in the gig industry warrants further discussion and research.

Keywords: gig-work, Covid-19, independent contractor, Uber

Procedia PDF Downloads 108
12230 A Study of Area-Level Mosquito Abundance Prediction Using a Supervised Machine Learning Point-Level Predictor

Authors: Theoktisti Makridou, Konstantinos Tsaprailis, George Arvanitakis, Charalampos Kontoes

Abstract:

In the literature, data-driven approaches for mosquito abundance prediction rely on supervised machine learning models trained with historical in-situ measurements. The drawback of this approach is that once the model is trained on point-level measurements (specific x, y coordinates), its predictions also refer to the point level. These point-level predictions reduce the applicability of such solutions, since many early-warning and mitigation applications need predictions at an area level, such as a municipality or village. In this study, we apply a data-driven predictive model, which relies on open satellite Earth observation and geospatial data and is trained with historical point-level in-situ measurements of mosquito abundance, and we propose a methodology to extend the point-level predictive model to broader area-level predictions. Our methodology relies on random spatial sampling of the area of interest (similar to a Poisson hard-core process), obtaining the Earth observation and geomorphological information for each sample, making a point-wise prediction for each sample, and aggregating the predictions to represent the average mosquito abundance of the area. We quantify the performance of the transformation from point-level to area-level predictions and analyze it in order to understand which parameters have a positive or negative impact on it. The goal of this study is to propose a methodology that predicts the mosquito abundance of a given area by relying on point-level predictions and to provide qualitative insights into the expected performance of the area-level prediction. We applied our methodology to historical data (for Culex pipiens) from two areas of interest (the Veneto region of Italy and Central Macedonia in Greece), and in both cases the results were consistent. The mean mosquito abundance of a given area can be estimated with accuracy similar to that of the point-level predictor, sometimes even better. The density of the samples used to represent an area has a positive effect on performance, whereas the raw number of sampling points is uninformative about performance without the size of the area. Additionally, the distance between the sampling points and the real in-situ measurements used for training did not strongly affect performance.
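
The sampling-and-aggregation idea described above can be sketched in a few lines: draw spatially spread sample points over the area, run the point-level predictor at each, and average. In the snippet below the predictor, the sampling density, and the minimum inter-point distance are made-up placeholders standing in for the trained model and the Poisson hard-core sampling.

```python
import numpy as np

rng = np.random.default_rng(42)

def point_predictor(lon, lat):
    # stand-in for the trained point-level model (EO + geomorphological features -> abundance)
    return 50.0 + 30.0 * np.sin(lon) * np.cos(lat)

def area_prediction(lon_min, lon_max, lat_min, lat_max, density_per_deg2=100, min_dist=0.01):
    # random spatial sampling of the area (crude stand-in for a Poisson hard-core process)
    n = int(density_per_deg2 * (lon_max - lon_min) * (lat_max - lat_min))
    pts = []
    while len(pts) < n:
        p = rng.uniform([lon_min, lat_min], [lon_max, lat_max])
        if all(np.hypot(*(p - q)) >= min_dist for q in pts):
            pts.append(p)
    pts = np.array(pts)
    preds = point_predictor(pts[:, 0], pts[:, 1])
    return preds.mean()   # aggregate point-level predictions into an area-level estimate

print(area_prediction(22.0, 22.5, 40.5, 41.0))
```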

Keywords: mosquito abundance, supervised machine learning, Culex pipiens, spatial sampling, West Nile virus, earth observation data

Procedia PDF Downloads 131
12229 Modeling of Erosion and Sedimentation Impacts from off-Road Vehicles in Arid Regions

Authors: Abigail Rosenberg, Jennifer Duan, Michael Poteuck, Chunshui Yu

Abstract:

The Barry M. Goldwater Range West, in southwestern Arizona, encompasses 2,808 square kilometers of Sonoran Desert. The hyper-arid range has an annual rainfall of less than 10 cm, with temperatures ranging from an average high of 41 degrees Celsius in July to an average low of 4 degrees Celsius in January. The range shares approximately 60 kilometers of the international border with Mexico. A majority of the range is open for recreational use, primarily off-highway vehicles. Because of its proximity to Mexico, the range is also heavily patrolled by U.S. Customs and Border Protection seeking to intercept and apprehend inadmissible people and illicit goods. Decades of off-roading and Border Patrol activities have negatively impacted this sensitive desert ecosystem. To assist the range program managers, this study is developing a model to identify erosion-prone areas and is calibrating the model's parameters using the Automated Geospatial Watershed Assessment modeling tool.

Keywords: arid lands, automated geospatial watershed assessment, erosion modeling, sedimentation modeling, watershed modeling

Procedia PDF Downloads 358
12228 Climate Change and Landslide Risk Assessment in Thailand

Authors: Shotiros Protong

Abstract:

Sudden landslides in Thailand have occurred frequently and with greater severity during the past decade. It is necessary to focus on the principal parameters used for analysis, such as land cover/land use, rainfall, soil characteristics, and the digital elevation model (DEM). The combination of intense rainfall and severe monsoons is increasing due to global climate change, and landslide occurrences rise rapidly during intense rainfall, especially in the rainy season in Thailand, which usually starts around mid-May and ends in the middle of October. Rain-triggered landslide hazard analysis is the focus of this research. Geotechnical and hydrological data are combined to determine permeability, conductivity, bedding orientation, overburden, and the presence of loose blocks. Regional landslide hazard mapping is developed using the Stability Index MAPping (SINMAP) model on ArcGIS 10.1. Geological and land use data are used to define the probability of landslide occurrence in terms of geotechnical data: the geological data indicate the shear strength and friction angle values for soils above given rock types, which supports the general applicability of the approach for landslide hazard analysis. To address the research objectives, the following methods are used: setup and calibration of the SINMAP model, sensitivity analysis of the SINMAP model, geotechnical laboratory testing, landslide assessment for the present-day calibration, and landslide assessment under future climate simulation scenarios A2 and B2. For the hydrological data, average rainfall in millimetres per twenty-four hours is used in the rain-triggered landslide hazard analysis for slope stability mapping, and the 1954-2012 period provides the rainfall baseline for the present-day calibration. For climate change in Thailand, future climate scenarios are simulated at spatial and temporal scales; to predict future precipitation, the Statistical DownScaling Model (SDSM) version 4.2 is used to assess the future-change simulation scenarios between latitudes 16° 26' and 18° 37' north and longitudes 98° 52' and 103° 05' east. The research allows the mapping of risk parameters for landslide dynamics and indicates the spatial and temporal trends of landslide occurrence. Thus, regional landslide hazard maps are produced under present-day climatic conditions from 1954 to 2012 and under GCM-based climate change scenarios A2 and B2 from 2013 to 2099, related to the threshold rainfall values for the selected study area in Uttaradit province in northern Thailand. Finally, the landslide hazard maps for the present and for the future under climate simulation scenarios A2 and B2 in Uttaradit province are compared by area (km²).
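
At the core of SINMAP-type stability mapping is the infinite-slope factor of safety combined with a topographic wetness estimate; the sketch below implements that standard formulation (the parameter values are illustrative assumptions, not calibrated values from the study).

```python
import numpy as np

def stability_index(slope_deg, specific_catchment_area, T_over_R, cohesion, phi_deg, r=0.5):
    """Infinite-slope factor of safety of the form used by SINMAP-type models.

    slope_deg: terrain slope (degrees)
    specific_catchment_area: upslope area per unit contour length a (m)
    T_over_R: soil transmissivity divided by steady-state recharge (m)
    cohesion: dimensionless combined root + soil cohesion C
    phi_deg: soil friction angle (degrees)
    r: ratio of water density to wet soil density
    """
    theta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    # relative wetness from the topographic index, capped at saturation
    w = np.minimum(specific_catchment_area / (T_over_R * np.sin(theta)), 1.0)
    fs = (cohesion + np.cos(theta) * (1.0 - w * r) * np.tan(phi)) / np.sin(theta)
    return fs

# example: a 30-degree slope with assumed wetness and strength parameters
print(stability_index(30.0, specific_catchment_area=50.0, T_over_R=100.0,
                      cohesion=0.2, phi_deg=35.0))
```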

Keywords: landslide hazard, GIS, slope stability index (SINMAP), landslides, Thailand

Procedia PDF Downloads 541
12227 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo

Authors: Margaret Boone Rappaport, Christopher J. Corbally

Abstract:

The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. (2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.

Keywords: genetic drift, genomics, parietal expansion, religious capacity

Procedia PDF Downloads 326
12226 Modified Bat Algorithm for Economic Load Dispatch Problem

Authors: Daljinder Singh, J. S. Dhillon, Balraj Singh

Abstract:

According to the no-free-lunch theorem, a single search technique cannot perform best under all conditions. Metaheuristic optimization methods are an attractive choice for solving optimization problems because of advantages such as robust and reliable performance, global search capability, modest information requirements, ease of implementation, parallelism, and no requirement for a differentiable or continuous objective function. In order to balance exploration and exploitation and to further enhance the performance of the bat algorithm, this paper proposes a modified bat algorithm that adds an additional search procedure based on the bat's previous experience. The proposed algorithm is used to solve the economic load dispatch (ELD) problem. Practical constraints such as valve-point loading are considered, along with the power balance constraint and generator limits. The variable elimination method is exploited to handle the power demand constraint. The proposed algorithm is tested on various ELD problems, and the results show that it performs better on the majority of the ELD problems considered and is on par with existing algorithms for the rest.
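
To make the problem formulation concrete, the sketch below evaluates a valve-point-loading fuel cost for a small hypothetical three-unit system, handles the power balance by eliminating one (slack) generator, and penalises limit violations; all coefficients and limits are assumed test values, and the search loop of the (modified) bat algorithm itself is omitted.

```python
import numpy as np

# assumed 3-unit test system: quadratic cost coefficients a, b, c and valve-point terms e, f
a = np.array([0.00482, 0.00194, 0.00156])
b = np.array([7.97, 7.85, 7.92])
c = np.array([78.0, 310.0, 561.0])
e = np.array([150.0, 200.0, 300.0])
f = np.array([0.063, 0.042, 0.035])
p_min = np.array([100.0, 100.0, 50.0])
p_max = np.array([600.0, 400.0, 200.0])
demand = 850.0

def dispatch_cost(p_free, penalty=1e6):
    """Fuel cost with valve-point loading; the last unit is the slack generator
    obtained by variable elimination so that the power balance holds exactly."""
    p_slack = demand - p_free.sum()
    p = np.append(p_free, p_slack)
    cost = np.sum(a * p**2 + b * p + c + np.abs(e * np.sin(f * (p_min - p))))
    # penalty for violating generator limits (including the eliminated slack unit)
    violation = np.sum(np.maximum(p_min - p, 0.0) + np.maximum(p - p_max, 0.0))
    return cost + penalty * violation

print(dispatch_cost(np.array([400.0, 300.0])))   # objective value for one candidate solution
```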

Keywords: bat algorithm, economic load dispatch, penalty method, variable elimination method

Procedia PDF Downloads 450
12225 Combination of Unmanned Aerial Vehicle and Terrestrial Laser Scanner Data for Citrus Yield Estimation

Authors: Mohammed Hmimou, Khalid Amediaz, Imane Sebari, Nabil Bounajma

Abstract:

Annual crop production is one of the most important macroeconomic indicators for the majority of countries around the world. This information is valuable, especially for exporting countries, which need a yield estimate before harvest in order to plan the supply chain correctly. For agricultural yield estimation, especially in arboriculture, conventional methods are mostly applied. In the citrus industry, sale before harvest is widely practiced, which requires estimating production while the fruit is still on the tree; however, a conventional method based on sampling surveys of a few trees within the field is typically used, and the success of this process depends mainly on the expertise of the 'estimator agent'. The present study proposes a methodology based on the combination of unmanned aerial vehicle (UAV) images and terrestrial laser scanner (TLS) point clouds to estimate citrus production. During data acquisition, fixed-wing and rotary-wing drones, as well as a terrestrial laser scanner, were tested. A pre-processing step was then performed to generate the point cloud and digital surface model. At the processing stage, a machine vision workflow was implemented to extract the points corresponding to fruits from the whole tree point cloud, cluster them into individual fruits, and model them geometrically in 3D space. By linking the resulting geometric properties to fruit weight, the yield can be estimated and the statistical distribution of fruit sizes can be generated. This latter property, which is information required by citrus-importing countries, cannot be estimated before harvest using the conventional method. Since the terrestrial laser scanner is static, data gathering with this technology can cover only a few trees, so drone data were integrated in order to estimate the yield over a whole orchard. To achieve that, features derived from the drone digital surface model were linked to the laser scanner yield estimates of selected trees to build a regression model that predicts the yield of a tree from its features. Several missions were carried out to collect drone and laser scanner data in citrus orchards of different varieties, testing several data acquisition parameters (flight height, image overlap, flight plan). The accuracy of the results obtained by the proposed methodology, compared with yield estimates from the conventional method, varies from 65% to 94%, depending mainly on the phenological stage of the studied citrus variety at the time of the data acquisition mission. The proposed approach demonstrates strong potential for early estimation of citrus production and could be extended to other fruit trees.
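
The clustering-and-geometry step described above can be prototyped with off-the-shelf tools; the sketch below uses scikit-learn's DBSCAN on hypothetical fruit-candidate points, fits a crude sphere per cluster, and converts volume to weight with an assumed density. None of the thresholds or the density reflect the study's actual parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# hypothetical fruit-candidate points already extracted from a tree point cloud (x, y, z in metres)
fruit_points = rng.random((500, 3))

clusters = DBSCAN(eps=0.05, min_samples=10).fit(fruit_points)
yield_kg = 0.0
for label in set(clusters.labels_) - {-1}:                 # label -1 is noise
    pts = fruit_points[clusters.labels_ == label]
    radius = np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean()   # crude fruit radius (m)
    volume = 4.0 / 3.0 * np.pi * radius**3                           # sphere model (m^3)
    yield_kg += 1000.0 * volume     # assumed density of ~1000 kg/m^3 linking geometry to weight
print(f"estimated yield for this tree: {yield_kg:.2f} kg")
```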

Keywords: citrus, digital surface model, point cloud, terrestrial laser scanner, UAV, yield estimation, 3D modeling

Procedia PDF Downloads 129
12224 Operational Challenges of Marine Fiber Reinforced Polymer Composite Structures Coupled with Piezoelectric Transducers

Authors: H. Ucar, U. Aridogan

Abstract:

Composite structures have become attractive for aerospace, automotive, and marine applications because of demands for weight reduction, corrosion resistance, and radar signature reduction. Studies on piezoelectric ceramic transducers (PZT) for diagnostics and health monitoring have gained attention for their sensing capabilities; however, PZT structures are prone to failure under heavy operational loads. In this paper, we develop a piezo-based glass fiber reinforced polymer (GFRP) composite finite element (FE) model, validate it against an experimental setup, and identify the applicability and limitations of PZTs for a marine application. A case study is conducted to assess piezo-based sensing capabilities in a representative marine composite structure. An FE model of the composite structure combined with PZT patches is developed, and its response and functionality are then investigated under representative sea conditions. The results of this study clearly indicate the blockers and critical aspects on the way toward industrialization and wide-range use of PZTs for marine composite applications.

Keywords: FRP composite, operational challenges, piezoelectric transducers, FE modeling

Procedia PDF Downloads 164
12223 Depth-Averaged Modelling of Erosion and Sediment Transport in Free-Surface Flows

Authors: Thomas Rowan, Mohammed Seaid

Abstract:

A fast finite volume solver for multi-layered shallow water flows with mass exchange and an erodible bed is developed. It enables the user to solve a number of complex sediment-based problems, including (but not limited to) dam break over an erodible bed, recirculation currents and bed evolution, and levee and dyke failure. This research develops methodologies crucial to the understanding of multi-sediment fluvial mechanics and waterway design. In this model, mass exchange between the layers is allowed and, in contrast to previous models, both sediment and fluid can transfer between layers. In the current study, we use a two-step finite volume method to avoid solving the Riemann problem, and entrainment and deposition rates are calculated for the first time in a model of this nature. In the first step, the governing equations are rewritten in non-conservative form and the intermediate solutions are calculated using the method of characteristics. In the second step, the numerical fluxes are reconstructed in conservative form and used to calculate a solution that satisfies the conservation property. This method is found to be considerably faster than comparable finite volume methods and exhibits good shock capturing. Most entrainment and deposition equations use a bed-level concentration factor, which leads to inaccuracies in both the near-bed concentration and the total scour; to account for diffusion, as no vertical velocities are calculated, a capacity-limited diffusion coefficient is used. An additional advantage of the multilayer approach is that, unlike in single-layer models, the bottom-layer fluid velocity varies, which dramatically reduces erosion, a quantity often overestimated in single-layer simulations of this kind. The model is used to simulate a standard dam break. In this simulation, as expected, the number of fluid layers used produces variation in the resulting bed profile, with more layers giving a greater variation in fluid velocity. These results show a marked difference in erosion profiles relative to standard models. Overall, the model provides new insight into the problems considered, at minimal computational cost.
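
As a generic illustration of the entrainment/deposition exchange described above (not the authors' multilayer scheme), the sketch below performs one explicit update of a depth-averaged sediment concentration and the bed elevation using an excess-velocity entrainment law and a settling-velocity deposition law; all coefficients are assumed placeholder values.

```python
import numpy as np

def bed_exchange_step(c, z_b, h, u, dt, porosity=0.4, w_s=0.01, E_coef=1e-4, u_cr=0.3):
    """One explicit step of a generic flow-bed sediment exchange.

    c: depth-averaged sediment concentration (-), z_b: bed elevation (m),
    h: water depth (m), u: flow speed (m/s), w_s: settling velocity (m/s).
    """
    E = E_coef * np.maximum(u**2 - u_cr**2, 0.0)   # entrainment rate (m/s), excess-velocity type
    D = w_s * c                                    # deposition rate (m/s)
    c_new = c + dt * (E - D) / np.maximum(h, 1e-6) # concentration gains what the bed loses
    z_b_new = z_b + dt * (D - E) / (1.0 - porosity)  # Exner-type bed update
    return np.clip(c_new, 0.0, None), z_b_new

c, z_b = bed_exchange_step(c=0.01, z_b=0.0, h=1.0, u=1.2, dt=0.1)
print(c, z_b)
```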

Keywords: erosion, finite volume method, sediment transport, shallow water equations

Procedia PDF Downloads 208
12222 A Double Ended AC Series Arc Fault Location Algorithm Based on Currents Estimation and a Fault Map Trace Generation

Authors: Edwin Calderon-Mendoza, Patrick Schweitzer, Serge Weber

Abstract:

Series arc faults appear frequently and unpredictably in low-voltage distribution systems. Many methods have been developed to detect this type of fault, and commercial protection devices such as arc fault circuit interrupters (AFCI) have been used successfully in electrical networks to prevent damage and catastrophic incidents such as fires. However, these devices do not allow series arc faults to be located on the line while it is operating. This paper presents a location algorithm for series arc faults in a low-voltage indoor power line in an AC 230 V, 50 Hz home network. The method is validated through simulations in MATLAB. The fault location method uses the electrical parameters (resistance, inductance, capacitance, and conductance) of a 49 m indoor power line. The mathematical model of a series arc fault is based on analysis of the V-I characteristics of the arc and consists basically of two antiparallel diodes and DC voltage sources. In the first step, the arc fault model is inserted at several different positions along the line, which is modeled using lumped parameters; at both ends of the line, currents and voltages are recorded for each arc fault generated at a different distance. In the second step, a fault map trace is created using signature coefficients obtained from Kirchhoff equations, which allow a virtual decoupling of the line's mutual capacitance. Each signature coefficient, obtained from the subtraction of estimated currents, is calculated taking into account the discrete Fourier transform of the currents and voltages as well as the fault distance value; these parameters are then substituted into the Kirchhoff equations. In the third step, the same procedure used to calculate the signature coefficients is employed, but this time considering hypothetical distances at which the fault could appear, since the fault distance is unknown at this stage. Iteratively evaluating the Kirchhoff equations over stepped variations of the fault distance yields a curve with a linear trend. Finally, the fault distance is estimated at the intersection of the two curves obtained in steps 2 and 3. The series arc fault model is validated by comparing currents obtained from simulation with real recorded currents. The model of the complete circuit is obtained for a 49 m line with a resistive load, and 11 different arc fault positions are considered for the generation of the fault map trace. The complete simulation allows the performance of the method and the perspectives of the work to be presented.
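
The final localisation step, finding where two distance-indexed curves intersect, can be sketched generically as below; this illustrates only the intersection idea, not the Kirchhoff-based signature coefficients themselves, and the two example curves are arbitrary linear stand-ins.

```python
import numpy as np

def crossing_distance(d, curve_measured, curve_hypothetical):
    """Estimate the fault distance as the intersection of two curves sampled
    over the same candidate distances d (a generic sketch)."""
    diff = np.asarray(curve_measured) - np.asarray(curve_hypothetical)
    sign_change = np.where(np.diff(np.sign(diff)) != 0)[0]
    if sign_change.size == 0:
        return None                                    # curves do not cross on this grid
    i = sign_change[0]
    # linear interpolation between the two samples that bracket the crossing
    return d[i] - diff[i] * (d[i + 1] - d[i]) / (diff[i + 1] - diff[i])

d = np.linspace(0.0, 49.0, 50)                         # candidate fault positions along a 49 m line
print(crossing_distance(d, 0.8 * d + 2.0, 1.1 * d + 0.5))   # crossing at 5 m for these stand-ins
```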

Keywords: indoor power line, fault location, fault map trace, series arc fault

Procedia PDF Downloads 125
12221 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study

Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming

Abstract:

Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity stems primarily from its stability and robustness to model misspecification. The situation is different for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach for estimating an adjusted relative risk or risk difference in clinical trials, partly because of the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing both conditional and marginal estimation approaches. We consider the log-binomial generalised linear model (GLM) with iteratively weighted least squares (IWLS) and model-based standard errors (SEs); the log-binomial GLM with convex optimisation and model-based SEs; the log-binomial GLM with convex optimisation and permutation tests; the modified Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all combinations of sample size (200, 1,000, and 5,000), outcome prevalence (10%, 50%, and 80%), and covariate effects (ranging from -0.05 to 0.7), representing weak, moderate, or strong relationships. Treatment effects (0, -0.5, and 1 on the log scale) cover null (H0) and alternative (H1) hypotheses so that coverage and power can be evaluated in realistic scenarios. Performance measures (bias, mean squared error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks, but it is unclear which method(s) are the most efficient, preserve the type-I error rate, are robust to model misspecification, or are the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions do not arise from marginal binary data. It also appears that marginal standardisation and convex optimisation may perform better than the IWLS log-binomial GLM.
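
One of the candidate estimators above, the modified Poisson model (a Poisson working model for a binary outcome with robust sandwich standard errors), can be sketched in Python as follows; the simulated data, effect sizes, and the use of statsmodels' GEE with an independence working correlation are illustrative assumptions, not the simulation protocol of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
treat = rng.integers(0, 2, n)
covar = rng.normal(size=n)
# binary outcome generated with a log link (true log relative risk for treatment = -0.5)
p = np.exp(-1.2 - 0.5 * treat + 0.3 * covar)
y = rng.binomial(1, np.clip(p, 0, 1))

df = pd.DataFrame({"y": y, "treat": treat, "covar": covar, "id": np.arange(n)})
X = sm.add_constant(df[["treat", "covar"]])

# modified Poisson: Poisson working model for a binary outcome with robust (sandwich) SEs,
# fitted here via GEE with an independence working correlation
model = sm.GEE(df["y"], X, groups=df["id"], family=sm.families.Poisson(),
               cov_struct=sm.cov_struct.Independence())
result = model.fit()
print(np.exp(result.params["treat"]), result.bse["treat"])   # adjusted relative risk and robust SE
```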

Keywords: binary outcomes, statistical methods, clinical trials, simulation study

Procedia PDF Downloads 98
12220 The Relationship Between Cyberbullying Victimization, Parent and Peer Attachment and Unconditional Self-Acceptance

Authors: Florina Magdalena Anichitoae, Anca Dobrean, Ionut Stelian Florean

Abstract:

Because cyberbullying victimization is an increasing problem, affecting more and more children and adolescents around the world, we sought to take a step forward in analyzing this phenomenon by examining variables that have not been studied together before, in an attempt to develop another way of viewing cyberbullying victimization. We tested the effects of mother, father, and peer attachment on adolescents' involvement in cyberbullying as victims through unconditional self-acceptance. Furthermore, we analyzed each subscale of the IPPA-R, the instrument used to measure parent and peer attachment, in relation to cyberbullying victimization through unconditional self-acceptance, and we examined whether gender and age could act as moderators in this model. The analysis was performed on 653 adolescents aged 11-17 years from Romania using structural equation modeling in R. For the reliability analysis of the IPPA-R subscales, the USAQ, and the Cyberbullying Test, we calculated internal consistency indices, which varied between .68 and .91. We created two models: the first included peer alienation, peer trust, peer communication, self-acceptance, and cyberbullying victimization, with CFI=0.97, RMSEA=0.02, 90%CI [0.02, 0.03], and SRMR=0.07; the second included parental alienation, parental trust, parental communication, self-acceptance, and cyberbullying victimization, with CFI=0.97, RMSEA=0.02, 90%CI [0.02, 0.03], and SRMR=0.07. On the one hand, cyberbullying victimization was predicted by peer alienation and peer communication through unconditional self-acceptance, while peer trust directly, significantly, and negatively predicted involvement in cyberbullying. Considering gender and age as moderators, we found that the relationship between unconditional self-acceptance and cyberbullying victimization is stronger in girls, whereas age does not moderate this relationship. On the other hand, the hypothesis that the degree of cyberbullying victimization is predicted through unconditional self-acceptance by parental alienation, parental communication, and parental trust was not supported. Still, we identified direct paths in which parental alienation positively, and parental trust negatively, predicted victimization. Some limitations of this study are discussed at the end.

Keywords: adolescent, attachment, cyberbullying victimization, parents, peers, unconditional self-acceptance

Procedia PDF Downloads 191
12219 N-Heptane as Model Molecule for Cracking Catalyst Evaluation to Improve the Yield of Ethylene and Propylene

Authors: Tony K. Joseph, Balasubramanian Vathilingam, Stephane Morin

Abstract:

Refiners around the world are currently focused on improving the yield of light olefins (propylene and ethylene), as both are prominent raw materials for producing a wide spectrum of polymeric materials such as polyethylene and polypropylene. It is therefore desirable to increase the yield of light olefins via selective cracking of heavy oil fractions. In this study, zeolite grown on SiC was used as the catalyst for the model cracking reaction of n-heptane. The catalytic cracking of n-heptane was performed in a fixed bed reactor (12 mm i.d.) at three different temperatures (425, 450, and 475 °C) and at atmospheric pressure. A carrier gas (N₂) was mixed with n-heptane at a ratio of 90:10 (N₂:n-heptane), and the gaseous mixture was introduced into the fixed bed reactor. Various reactant flow rates were tested to increase the yield of ethylene and propylene. For comparison, a commercial zeolite was also tested in addition to the zeolite on SiC. The products were analyzed using an Agilent gas chromatograph (GC-9860) equipped with a flame ionization detector (FID). The GC is connected online with the reactor, and all the cracking tests were successfully reproduced. The complete catalytic evaluation results will be presented at the conference.
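
As a simple illustration of how cracking performance is typically reported from calibrated GC data, the sketch below computes n-heptane conversion and carbon-based selectivities and yields for the light products; all molar flows are made-up placeholder numbers, not results of the study.

```python
# post-processing sketch: conversion of n-heptane and carbon-based selectivity/yield to light
# olefins from calibrated GC molar flows (values are placeholders, not measured data)
n_heptane_in = 10.0        # mol/h fed
n_heptane_out = 4.0        # mol/h unconverted
products_out = {"ethylene": 5.5, "propylene": 6.2, "methane": 1.1, "butenes": 2.0}  # mol/h
carbon = {"ethylene": 2, "propylene": 3, "methane": 1, "butenes": 4}                # C atoms per molecule

conversion = (n_heptane_in - n_heptane_out) / n_heptane_in
carbon_converted = (n_heptane_in - n_heptane_out) * 7            # mol C/h from converted heptane
selectivity = {k: products_out[k] * carbon[k] / carbon_converted for k in products_out}
yields = {k: conversion * s for k, s in selectivity.items()}
print(conversion, selectivity, yields)
```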

Keywords: cracking, catalyst, evaluation, ethylene, heptane, propylene

Procedia PDF Downloads 126
12218 On the Dwindling Supply of the Observable Cosmic Microwave Background Radiation

Authors: Jia-Chao Wang

Abstract:

The cosmic microwave background radiation (CMB) freed during the recombination era can be considered a photon source of short duration: a one-time event that happened everywhere in the universe simultaneously. If space is divided into concentric shells centered at an observer's location, the CMB photons originating from nearby shells reach and pass the observer first, and those from shells farther away follow as time goes forward. In the Big Bang model, space expands rapidly in a time-dependent manner as described by the scale factor. This expansion results in an event horizon coincident with one of the shells, and its radius can be calculated using cosmological calculators available online. Using Planck 2015 results, its value during the recombination era at cosmological time t = 0.379 million years (My) is calculated to be Revent = 56.95 million light-years (Mly). The event horizon sets a boundary beyond which the freed CMB photons will never reach the observer. The photons within the event horizon also exhibit a peculiar behavior. Calculated results show that the CMB observed today was freed in a shell located 41.8 Mly away (inside the boundary set by Revent) at t = 0.379 My; these photons traveled for 13.8 billion years (Gy) to reach us. Similarly, the CMB reaching the observer at t = 1, 5, 10, 20, 40, 60, 80, 100, and 120 Gy is calculated to originate from shells at R = 16.98, 29.96, 37.79, 46.47, 53.66, 55.91, 56.62, 56.85, and 56.92 Mly, respectively. The results show that as time goes by, R approaches Revent = 56.95 Mly but never exceeds it, consistent with the statement that beyond Revent the freed CMB photons will never reach the observer. The difference Revent - R can be used as a measure of the remaining observable CMB photons; its value becomes smaller and smaller as R approaches Revent, indicating a dwindling supply of observable CMB radiation. In this paper, detailed dwindling effects near the event horizon are analyzed with the help of online cosmological calculators based on the lambda cold dark matter (ΛCDM) model. It is demonstrated in the literature that if the CMB is a blackbody at recombination (about 3000 K), it will remain so over time under cosmological redshift and homogeneous expansion of space, but with a lower temperature (2.725 K now). The present result suggests that the observable CMB photon density, besides changing with the expansion of space, can also be affected by the dwindling supply associated with the event horizon. This raises the question of whether the blackbody character of the CMB at recombination can remain so over time. Being able to explain the blackbody nature of the observed CMB is an important part of the success of the Big Bang model; the present results cast some doubt on that and suggest that the model may have an additional challenge to deal with.
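
The two proper distances discussed above (the emission distance of the shell seen today and the event horizon at recombination) can be reproduced approximately with a simple flat ΛCDM integral; the sketch below neglects radiation and uses assumed Planck-like parameters, so it only roughly recovers the magnitudes quoted in the abstract rather than the exact values.

```python
import numpy as np
from scipy.integrate import quad

# flat LambdaCDM with assumed Planck-like parameters (radiation neglected, so high-z values are approximate)
H0 = 67.8                      # km/s/Mpc
Om0, Ode0 = 0.308, 0.692
c = 299792.458                 # km/s
mly_per_mpc = 3.2616           # 1 Mpc in millions of light-years

def H(z):
    return H0 * np.sqrt(Om0 * (1.0 + z) ** 3 + Ode0)

z_rec = 1090.0
a_rec = 1.0 / (1.0 + z_rec)

# comoving distance (Mpc) covered by light between recombination and today: dchi = c dz / H(z)
chi_rec, _ = quad(lambda z: c / H(z), 0.0, z_rec)
# comoving distance light can still cover from today to the infinite future (z -> -1)
chi_future, _ = quad(lambda z: c / H(z), -1.0, 0.0)

d_emit = a_rec * chi_rec * mly_per_mpc                    # proper distance, at emission, of the shell seen today
r_event = a_rec * (chi_rec + chi_future) * mly_per_mpc    # proper event horizon at recombination
print(f"shell seen today: {d_emit:.1f} Mly, event horizon: {r_event:.1f} Mly")
```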

Keywords: blackbody of CMB, CMB radiation, dwindling supply of CMB, event horizon

Procedia PDF Downloads 110
12217 Development of a Context Specific Planning Model for Achieving a Sustainable Urban City

Authors: Jothilakshmy Nagammal

Abstract:

This research paper examines case studies in which Form-Based Codes are adopted, and discusses their different implementation methods, in order to develop a method for formulating a new planning model. The organizing principle of Form-Based Codes, the transect, is used to zone the city into various context-specific transects. On this basis, a new planning model, the Context Specific Planning Model (CSPM), is developed as a tool to achieve sustainability for any city. A case study comparison of the planning tools used, the code processes adopted, and the various control regulations implemented in thirty-two different cities is carried out. The analysis shows that there are a variety of ways to implement form-based zoning concepts: specific plans, a parallel or optional form-based code, a transect-based code/SmartCode, required form-based standards, or design guidelines. The case studies describe the positive and negative results of form-based zoning where it has been implemented. From the case studies on Form-Based Code methods, it is understood that the scale of a Form-Based Code varies from parts of the city to the whole city, and the regulating plan is prepared with the transect as the organizing principle in most cases. The implementation methods adopted in these case studies include special districts such as Transit Oriented Development (TOD) and Traditional Neighbourhood Development (TND), specific plans, and street-based codes; the implementation approaches vary among mandatory, integrated, and floating. To attain sustainability, this research develops a regulating plan using the transect as the organizing principle for the entire area of the city in general, and formulates street-based Form-Based Codes for selected special districts in the study area in particular. Planning is most powerful when it is embedded in the broader context of systemic change and improvement. 'Systemic' is best thought of as holistic, contextualized, and stakeholder-owned, while 'systematic' can be thought of as more linear, generalisable, and typically top-down or expert-driven. The systemic approach is a process based on system theory and system design principles, which are too often poorly understood by the general population and policy makers. System theory embraces the importance of a global perspective, multiple components, interdependencies, and interconnections in any system; in addition, the recognition that a change in one part of a system necessarily alters the rest of the system is a cornerstone of system theory. The proposed regulating plan, taking the transect as the organizing principle and Form-Based Codes as the means to achieve urban sustainability, has to be a hybrid code integrated within the existing system: a systemic approach with a systematic process. This approach of introducing a few form-based zones into a conventional code could be effective in the phased replacement of an existing code; it could also be an effective way of responding to near-term pressure for physical change in 'sensitive' areas of the community. With this approach and method, the creation of the new Context Specific Planning Model for achieving sustainability is explained in detail in this research paper.

Keywords: context based planning model, form based code, transect, systemic approach

Procedia PDF Downloads 322
12216 Investigating (Im)Politeness Strategies in Email Communication: The Case of Algerian PhD Supervisees and Irish Supervisors

Authors: Zehor Ktitni

Abstract:

In pragmatics, politeness is regarded as a feature of paramount importance to successful interpersonal relationships. Email, meanwhile, has become one of the indispensable means of communication in educational settings. This research places email communication at the core of the study and analyses it from a politeness perspective. More specifically, it looks closely at how the concept of (im)politeness is reflected in students’ emails. To this end, a corpus of Algerian supervisees’ email threads exchanged with their Irish supervisors was compiled. Leech’s model of politeness (2014) was selected as the main theoretical framework of this study, with additional reference to Brown and Levinson’s model (1987), one of the most influential models in the area of pragmatic politeness. Follow-up interviews are also to be conducted with the Algerian students to reinforce the results derived from the corpus. Initial findings suggest that the Algerian PhD students’ emails tend to include more politeness markers than impoliteness ones, that the students make heavy use of academic titles (Dr. or Prof.) when addressing their supervisors, and that they rely on hedging devices in order to sound polite.
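
A minimal sketch of the kind of corpus query this analysis implies is given below: it counts occurrences of candidate politeness markers (hedging devices and academic address titles) across a folder of email texts. The marker lists, file layout, and implementation are illustrative assumptions, not the authors' coding scheme, which follows Leech (2014).

```python
import re
from collections import Counter
from pathlib import Path

# Illustrative (hypothetical) marker lists; a real study would use a
# validated coding scheme, not a simple keyword count.
HEDGES = ["would", "could", "might", "perhaps", "possibly", "I was wondering"]
ACADEMIC_TITLES = ["Dr.", "Prof.", "Professor"]

def count_markers(text, markers):
    """Count case-insensitive occurrences of each marker in an email body."""
    counts = Counter()
    for marker in markers:
        counts[marker] = len(re.findall(re.escape(marker), text, flags=re.IGNORECASE))
    return counts

# Hypothetical corpus layout: one plain-text file per email thread.
corpus_totals = Counter()
for email_file in Path("email_corpus").glob("*.txt"):
    body = email_file.read_text(encoding="utf-8")
    corpus_totals += count_markers(body, HEDGES + ACADEMIC_TITLES)

print(corpus_totals.most_common())
```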

Keywords: politeness, email communication, corpus pragmatics, Algerian PhD supervisees, Irish supervisors

Procedia PDF Downloads 52
12215 Applying Renowned Energy Simulation Engines to Neural Control System of Double Skin Façade

Authors: Zdravko Eškinja, Lovre Miljanić, Ognjen Kuljača

Abstract:

This paper is an overview of the simulation tools used to model the specific thermal dynamics that occur while controlling a double skin façade. Research has been conducted on a simplified construction with a single zone in which one side is glazed. Heat flow and temperature responses are simulated in three different simulation tools: IDA-ICE, EnergyPlus, and HAMBASE. The excitation of the observed system, used in all simulations, was a step change in the exterior temperature. Air infiltration, insulation, and other disturbances are excluded from this research. Although such isolated behaviour is not possible in reality, the experiments are carried out to gain novel information about heat flow transients that are not observable under regular conditions. The results revealed new possibilities for adapting the parameters of the neural network regulator. Alongside the numerical simulations, the same set-up has also been tested in a real-time experiment with a 1:18 scaled model and a thermal chamber. The comparison analysis yields interesting conclusions about simulation accuracy in this particular case.
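
The step excitation described above can be illustrated with a minimal lumped-capacitance sketch, independent of IDA-ICE, EnergyPlus, or HAMBASE; all resistance, capacitance, and temperature values below are assumptions chosen for illustration and do not come from the study.

```python
# Minimal lumped-capacitance (two-node RC) sketch of a single glazed zone
# responding to a step change in exterior temperature. All parameter values
# are illustrative assumptions.
R_glass = 0.05    # K/W, exterior -> facade cavity (outer glazing)
R_inner = 0.10    # K/W, cavity -> zone (inner skin)
C_cavity = 2.0e4  # J/K, cavity air plus glazing
C_zone = 5.0e5    # J/K, zone air plus light furnishings

T_ext_before, T_ext_after = 10.0, 25.0  # degC, exterior step excitation
T_cav = T_zone = 10.0                   # start in equilibrium with exterior

dt, hours = 10.0, 24.0                  # time step [s], simulated duration [h]
zone_temps = []
for step in range(int(hours * 3600 / dt)):
    T_ext = T_ext_after if step > 0 else T_ext_before
    q_in = (T_ext - T_cav) / R_glass    # W flowing into the cavity
    q_out = (T_cav - T_zone) / R_inner  # W flowing from cavity into the zone
    T_cav += dt * (q_in - q_out) / C_cavity   # explicit Euler update
    T_zone += dt * q_out / C_zone
    zone_temps.append(T_zone)

print(f"Zone temperature after {hours:.0f} h: {zone_temps[-1]:.2f} degC")
```

A dedicated building-simulation engine adds exactly the effects the study excludes (air infiltration, multi-layer constructions, solar gains), which is why the isolated transients above are useful for tuning a neural regulator but not for predicting real operation.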

Keywords: double skin façade, experimental tests, heat control, heat flow, simulated tests, simulation tools

Procedia PDF Downloads 220
12214 A Study on the Effect of Work-Family Conflict on Work Engagement: A Mediated Moderation Model of Emotional Exhaustion and Positive Psychological Capital

Authors: Sungeun Hyun, Sooin Lee, Gyewan Moon

Abstract:

Work-Family Conflict (WFC) has been an active research area for the past decades. Because WFC harms both individuals and organizations, it is ultimately expected to bring losses to the company in the long run. Previous WFC research has mainly focused on its effects on organizational effectiveness and job attitudes such as Job Satisfaction, Organizational Commitment, and Turnover Intention. This study differs from previous research in its choice of consequence variable: the positive job attitude 'Work Engagement' was selected as a consequence of WFC. The primary purpose of this research is to identify the negative effects of Work-Family Conflict, and it starts from the recognition that research on the direct influence of WFC on Work Engagement is lacking. Based on conservation of resources (COR) theory and the job demands-resources (JD-R) model, an empirical model examining the negative effects of WFC, with Emotional Exhaustion as the link between WFC and Work Engagement, was proposed and validated. It was also analysed how much Positive Psychological Capital may buffer the negative effects arising from WFC within this relationship, and a mediated moderation model was verified in which Positive Psychological Capital controls the indirect effect of WFC on Work Engagement mediated by Emotional Exhaustion. Data were collected using questionnaires distributed to 500 employees in manufacturing, services, finance, IT, education services, and other sectors, of which 389 responses were used in the statistical analysis. The data were analysed with SPSS 21.0, the SPSS PROCESS macro, and AMOS 21.0; hierarchical regression analysis and the bootstrapping method were used for hypothesis testing. The results showed that all hypotheses were supported. First, WFC had a negative effect on Work Engagement; specifically, work interference with family (WIF) had a more negative effect than family interference with work (FIW). Second, Emotional Exhaustion was found to mediate the relationship between WFC and Work Engagement. Third, Positive Psychological Capital was shown to moderate the relationship between WFC and Emotional Exhaustion. Fourth, in the integrated test of mediated moderation, Positive Psychological Capital was demonstrated to buffer the relationships among WFC, Emotional Exhaustion, and Work Engagement; across all hypotheses, WIF showed more negative effects than FIW. Finally, the theoretical and practical implications for research on and management of WFC are discussed, and limitations and directions for future research are proposed.
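
For readers unfamiliar with the SPSS PROCESS workflow referenced above, a hedged sketch of a bootstrapped mediated-moderation (moderated mediation) test using ordinary least squares regressions is given below. The column names and model layout (Positive Psychological Capital moderating the WFC-to-exhaustion path) are assumptions for illustration, not the authors' exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch of a PROCESS-style mediated moderation test:
# wfc -> exhaustion -> engagement, with psycap moderating the first path.
# Column names ("wfc", "psycap", "exhaustion", "engagement") are hypothetical.
def index_of_moderated_mediation(df: pd.DataFrame) -> float:
    med = smf.ols("exhaustion ~ wfc * psycap", data=df).fit()
    out = smf.ols("engagement ~ exhaustion + wfc + psycap", data=df).fit()
    # Index = (interaction effect on the mediator) x (mediator effect on outcome).
    return med.params["wfc:psycap"] * out.params["exhaustion"]

def bootstrap_ci(df: pd.DataFrame, n_boot: int = 1000, seed: int = 1):
    """95% percentile bootstrap CI for the index of moderated mediation."""
    rng = np.random.default_rng(seed)
    estimates = [
        index_of_moderated_mediation(
            df.sample(frac=1.0, replace=True, random_state=int(rng.integers(10**9)))
        )
        for _ in range(n_boot)
    ]
    return np.percentile(estimates, [2.5, 97.5])

# Usage sketch (assuming a survey dataset with the hypothetical columns):
# df = pd.read_csv("survey.csv")
# print(bootstrap_ci(df))
```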

Keywords: emotional exhaustion, positive psychological capital, work engagement, work-family conflict

Procedia PDF Downloads 207
12213 The Impact of Artificial Intelligence on Digital Factory

Authors: Michael Maher Sedky Attia

Abstract:

The process of factory planning has changed a great deal, particularly when it comes to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming increasingly important in order to maintain the competitiveness of a factory. Regulations in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning foundation for rebuilding measures and becomes a critical tool. Furthermore, digital building models are increasingly being used in factories to support facility management and production processes. The principal research question of this paper is therefore which type of digital factory model is suitable for the different areas of application during the operation of a factory. First, different types of digital factory models are investigated, and their properties and usability for various use cases are analysed. Within the scope of the research, point cloud models, building information models, photogrammetry models, and models enriched with sensor data are examined. It is investigated which digital models allow a simple integration of sensor data and what the differences are. Possible application areas of digital factory models are then determined by a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the most suitable digital factory model. It is shown how a fully digitalized maintenance process can be supported by a digital factory model through the provision of data. Among other functions, the digital factory model is used for indoor navigation, data provision, and the display of sensor data. In summary, the paper proposes a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is presented and implemented, and the systematic selection of digital factory models with the corresponding application cases is evaluated.
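
As a hypothetical illustration of what a "simple integration of sensor data" into a geometric digital factory model could look like at the data-structure level, the sketch below attaches sensor readings to identifiable facility elements so that a maintenance use case can query them. The classes, identifiers, and values are assumptions and are not tied to any specific BIM, point cloud, or photogrammetry format.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: linking sensor data to elements of a geometric digital
# factory model (e.g., objects from a BIM or photogrammetry model), so that a
# maintenance use case can query readings per facility element.
@dataclass
class SensorReading:
    sensor_id: str
    quantity: str      # e.g., "temperature", "vibration"
    value: float
    timestamp: str

@dataclass
class FacilityElement:
    element_id: str    # id of the BIM / point-cloud object (hypothetical)
    name: str          # e.g., "Air handling unit 3"
    location: tuple    # (x, y, z) in the factory model, usable for navigation
    readings: list = field(default_factory=list)

    def attach(self, reading: SensorReading) -> None:
        """Attach a new sensor reading to this building element."""
        self.readings.append(reading)

    def latest(self, quantity: str):
        """Return the most recent reading of the given quantity, if any."""
        matching = [r for r in self.readings if r.quantity == quantity]
        return matching[-1] if matching else None

# Usage sketch for a maintenance query on one element.
ahu = FacilityElement("ifc-guid-123", "Air handling unit 3", (12.5, 4.0, 3.2))
ahu.attach(SensorReading("t-07", "temperature", 41.8, "2024-05-01T10:00:00"))
print(ahu.latest("temperature"))
```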

Keywords: augmented reality, digital factory model, factory planning, restructuring, photogrammetry, building information modeling, maintenance

Procedia PDF Downloads 0
12212 A Greedy Alignment Algorithm Supporting Medication Reconciliation

Authors: David Tresner-Kirsch

Abstract:

Reconciling patient medication lists from multiple sources is a critical task supporting the safe delivery of patient care. Manual reconciliation is a time-consuming and error-prone process, and recently attempts have been made to develop efficiency- and safety-oriented automated support for professionals performing the task. An important capability of any such support system is automated alignment – finding which medications from a list correspond to which medications from a different source, regardless of misspellings, naming differences (e.g. brand name vs. generic), or changes in treatment (e.g. switching a patient from one antidepressant class to another). This work describes a new algorithmic solution to this alignment task, using a greedy matching approach based on string similarity, edit distances, concept extraction and normalization, and synonym search derived from the RxNorm nomenclature. The accuracy of this algorithm was evaluated against a gold-standard corpus of 681 medication records; this evaluation found that the algorithm predicted alignments with 99% precision and 91% recall. This performance is sufficient to support decision support applications for medication reconciliation.
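
A minimal sketch of the greedy, string-similarity-driven alignment step is given below, using only the Python standard library. The RxNorm concept extraction and synonym search that the abstract describes are stubbed out with a placeholder normalizer, and the similarity threshold and example lists are assumptions.

```python
from difflib import SequenceMatcher

# Sketch of greedy alignment between two medication lists based on string
# similarity. Real concept extraction / RxNorm synonym lookup is only stubbed
# out here; the algorithm described in the abstract also uses edit distances
# and RxNorm-derived synonyms.
def normalize(name: str) -> str:
    # Placeholder for concept extraction and RxNorm normalization.
    return name.lower().strip()

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def greedy_align(list_a, list_b, threshold=0.6):
    # Score every cross-list pair, then greedily take the best unused pairs.
    pairs = sorted(
        ((similarity(a, b), a, b) for a in list_a for b in list_b),
        reverse=True,
    )
    used_a, used_b, alignment = set(), set(), []
    for score, a, b in pairs:
        if score < threshold:
            break
        if a not in used_a and b not in used_b:
            alignment.append((a, b, round(score, 2)))
            used_a.add(a)
            used_b.add(b)
    return alignment

print(greedy_align(
    ["Lisinopril 10 mg", "Metformin 500mg", "atorvastatin"],
    ["lisinopril 10mg tablet", "Atorvastatin 20 mg", "Sertraline 50 mg"],
))
```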

Keywords: clinical decision support, medication reconciliation, natural language processing, RxNorm

Procedia PDF Downloads 270
12211 Building a Composite Approach to Employees' Motivational Needs by Combining Cognitive Needs

Authors: Alexis Akinyemi, Laurene Houtin

Abstract:

Measures of employee motivation at work are often based on the theory of self-determined motivation, which implies that human resources departments and managers should seek to motivate employees in the most self-determined way possible and use strategies to achieve this goal. In practice, they often tend to assess employee motivation and then adapt management to the most important source of motivation for their employees, for example by financially rewarding an employee who is extrinsically motivated, and by rewarding an intrinsically motivated employee with congratulations and recognition. Thus, the use of motivation measures contradicts the theoretical positioning: the theory does not provide for the promotion of extrinsically motivated behaviour. In addition, a body of social psychology research on fundamental cognitive needs makes it possible to address a person’s different sources of motivation individually (need for cognition, need for uniqueness, need for effects, and need for closure). By developing a composite measure of motivation based on these needs, we provide human resources professionals, and in particular occupational psychologists, with a tool that complements the assessment of self-determined motivation, making it possible to adapt work not to the self-determination of behaviours, but to the motivational traits of employees. To develop such a model, we gathered the French versions of the cognitive needs scales (need for cognition, need for uniqueness, need for effects, need for closure) and conducted a study with 645 employees of several French companies. On the basis of the data collected, we conducted a confirmatory factor analysis to validate the model, studied the correlations between the various needs, and highlighted the different reference groups that could be used when drawing on these needs in interviews with employees (career, recruitment, etc.). The results showed a coherent model and the expected links between the different needs. Taken together, these results make it possible to propose a valid and theoretically grounded tool to managers who wish to adapt their management to their employees’ current motivations, whether or not these motivations are self-determined.
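
A minimal sketch of the composite-scoring and correlation step is shown below with hypothetical item and column names; the confirmatory factor analysis reported above would be run separately in a dedicated structural equation modelling tool.

```python
import pandas as pd

# Sketch: build composite scores for the four cognitive-needs scales from
# survey items and inspect their intercorrelations. The item/column names
# (e.g., "nfc_1") are hypothetical placeholders.
SCALES = {
    "need_for_cognition": ["nfc_1", "nfc_2", "nfc_3"],
    "need_for_uniqueness": ["nfu_1", "nfu_2", "nfu_3"],
    "need_for_effects": ["nfe_1", "nfe_2", "nfe_3"],
    "need_for_closure": ["nfcl_1", "nfcl_2", "nfcl_3"],
}

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency estimate for one scale's items."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def score_scales(df: pd.DataFrame) -> pd.DataFrame:
    """Average the items of each scale into one composite score per respondent."""
    scores = {name: df[cols].mean(axis=1) for name, cols in SCALES.items()}
    return pd.DataFrame(scores)

# Usage sketch (assuming one row per respondent in a hypothetical CSV file):
# df = pd.read_csv("cognitive_needs_survey.csv")
# composites = score_scales(df)
# print(composites.corr())                                  # links between needs
# print(cronbach_alpha(df[SCALES["need_for_cognition"]]))   # scale reliability
```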

Keywords: motivation, personality, work commitment, cognitive needs

Procedia PDF Downloads 105