Search results for: complexity measurement
3430 Revolutionary Solutions for Modeling and Visualization of Complex Software Systems
Abstract:
Existing software modeling and visualization approaches using UML are outdated. They are outcomes of reductionism and the superposition principle, which hold that the whole of a system is the sum of its parts, so all tasks of software modeling and visualization are performed linearly, partially, and locally. This paper introduces revolutionary solutions for the modeling and visualization of complex software systems that make such systems much easier to understand, test, and maintain. The solutions are based on complexity science, offering holistic, automatic, dynamic, virtual, and executable approaches about a thousand times more efficient than the traditional ones.
Keywords: complex systems, software maintenance, software modeling, software visualization
Procedia PDF Downloads 401
3429 Microstructural Mechanical Properties of Human Trabecular Bone Based on Nanoindentation Test
Authors: K. Jankowski, M. Pawlikowski, A. Makuch, K. Skalski
Abstract:
Depth-sensing indentation (DSI), or nanoindentation, is becoming an increasingly popular method of measuring the mechanical properties of various materials and tissues at the micro-scale. The technique requires no complicated sample preparation procedures, which makes it very useful. A measurement yields the force and displacement of the indenter, from which three measures of hardness, i.e., Martens hardness (HM), nanohardness (HIT) and Vickers hardness (HV), as well as the Young's modulus EIT, can be determined. In this work, the mechanical properties of trabecular bone were investigated. The bone samples were harvested from human femoral heads during hip replacement surgery. Patients were of different ages, sexes and stages of tissue degeneration caused by osteoarthritis. The specimens were divided into three groups, each containing samples harvested from patients in a different age range. All samples were investigated under the same measurement conditions: the maximum load was Pmax = 500 mN, the loading rate was 500 mN/min, and no hold was applied at the peak force. The tests were conducted with a Vickers indenter tip and a spherical tip of diameter 0.2 mm. Each trabecular bone sample was tested 7 times in a close area of the same trabecula. The measured load P as a function of indentation depth yielded the hysteresis loop and HM, HIT, HV and EIT. Results for an arbitrarily chosen sample are HM = 289.95 ± 42.31 MPa, HIT = 430.75 ± 45.37 MPa, HV = 40.66 ± 4.28 Vickers, EIT = 7.37 ± 1.84 GPa for the Vickers tip and HM = 115.19 ± 15.03 MPa, HIT = 165.80 ± 19.30 MPa, HV = 16.90 ± 1.97 Vickers, EIT = 5.30 ± 1.31 GPa for the spherical tip. The results of the nanoindentation tests show that the method is very useful and well suited to obtaining the mechanical properties of trabecular bone. The estimated values of the elastic modulus are similar for both tips; the differences in hardness are significant, but this is a consequence of using two different tip geometries.
However, it has to be emphasised that the differences in the values of elastic modulus and hardness also result from different testing protocols, the anisotropy and asymmetry of the micro-samples, and the hydration of the bone.
Keywords: human bone, mechanical properties, nanohardness, nanoindentation, trabecular bone
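Nanohardness HIT and an elastic modulus are commonly extracted from the load-displacement curve by the Oliver-Pharr analysis. The sketch below is a generic illustration, not the authors' protocol: it assumes an ideal pyramidal (Berkovich/Vickers-equivalent) area function A = 24.5 h_c², the usual constants ε = 0.75 and β = 1.05, and invented values for the maximum depth and unloading stiffness (only Pmax = 500 mN comes from the study).

```python
import math

def oliver_pharr(p_max, h_max, stiffness, eps=0.75, beta=1.05):
    """Estimate indentation hardness H_IT and reduced modulus E_r (SI units).

    p_max     : peak load (N)
    h_max     : indentation depth at peak load (m)
    stiffness : unloading stiffness S = dP/dh at peak (N/m)
    """
    h_c = h_max - eps * p_max / stiffness   # contact depth
    area = 24.5 * h_c ** 2                  # ideal pyramidal tip area function (assumption)
    h_it = p_max / area                     # indentation hardness (Pa)
    e_r = math.sqrt(math.pi) / (2.0 * beta) * stiffness / math.sqrt(area)
    return h_it, e_r

# Hypothetical measurement: P_max = 0.5 N, assumed h_max = 5 um, S = 2.5e5 N/m
h_it, e_r = oliver_pharr(p_max=0.5, h_max=5e-6, stiffness=2.5e5)
```

In practice the tip area function is calibrated on a reference material rather than taken as ideal.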
Procedia PDF Downloads 276
3428 Chemical Analysis of Particulate Matter (PM₂.₅) and Volatile Organic Compound Contaminants
Authors: S. Ebadzadsahraei, H. Kazemian
Abstract:
The main objective of this research was to measure particulate matter (PM₂.₅) and volatile organic compounds (VOCs), two classes of air pollutants, in a Prince George (PG) neighborhood in the warm and cold seasons. To fulfill this objective, analytical protocols were developed for accurate sampling and measurement of the targeted air pollutants. PM₂.₅ samples were analyzed for their chemical composition (i.e., toxic trace elements) in order to assess their potential emission sources. The City of Prince George, widely known as the capital of northern British Columbia (BC), Canada, has been dealing with air pollution challenges for a long time. The city has several local industries, including pulp mills, a refinery, and a couple of asphalt plants, that are the primary contributors of industrial VOCs. This research project, the first study of its kind in the region, measures the physical and chemical properties of particulate air pollutants (PM₂.₅) in the city neighborhood and quantifies the percentage of VOCs in the city's air samples. One of the outcomes of the project is updated data on the PM₂.₅ and VOC inventory in the selected neighborhoods. For examining PM₂.₅ chemical composition, an elemental analysis methodology was developed to measure major trace elements including, but not limited to, mercury and lead. The toxicity of inhaled particulates depends on both their physical and chemical properties; thus, an understanding of aerosol properties is essential for the evaluation of such hazards and the treatment of respiratory and other related diseases. Mixed cellulose ester (MCE) filters were selected as suitable filters for PM₂.₅ air sampling. Chemical analyses were conducted using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) for elemental analysis.
VOC measurement of the air samples was performed using a Gas Chromatography-Flame Ionization Detector (GC-FID) and Gas Chromatography-Mass Spectrometry (GC-MS), allowing quantitative measurement of VOC molecules at sub-ppb levels. In this study, a sorbent tube (Anasorb CSC, coconut charcoal; 6 x 70 mm, 2 sections, 50/100 mg sorbent, 20/40 mesh) was used for VOC air sampling, followed by solvent extraction and solid-phase micro extraction (SPME) techniques to prepare samples for measurement by a GC-MS/FID instrument. Air sampling for both PM₂.₅ and VOCs was conducted in the summer and winter seasons for comparison. Average PM₂.₅ concentrations differed greatly between wildfire and ordinary days: 83.0 μg/m³ during the wildfire period versus 23.7 μg/m³ in daily samples. Elevated concentrations of iron, nickel and manganese were found in all samples, and mercury was detected in some samples; at sufficiently high doses these elements can have negative health effects.
Keywords: air pollutants, chemical analysis, particulate matter (PM₂.₅), volatile organic compound, VOCs
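Filter-based PM₂.₅ concentrations such as those quoted above are conventionally obtained gravimetrically: the mass gained by the filter divided by the volume of air drawn through it. The sketch below is a generic illustration, not the authors' procedure; the 16.7 L/min flow rate (a common PM₂.₅ sampler setting) and the filter mass gain are assumed values.

```python
def pm25_concentration_ug_m3(mass_gain_ug, flow_l_min, minutes):
    """Gravimetric PM2.5 concentration: collected filter mass / sampled air volume."""
    volume_m3 = flow_l_min * minutes / 1000.0   # litres -> cubic metres
    return mass_gain_ug / volume_m3

# Assumed 24-hour sample at 16.7 L/min with a 570 ug filter mass gain
c = pm25_concentration_ug_m3(mass_gain_ug=570.0, flow_l_min=16.7, minutes=24 * 60)
```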
Procedia PDF Downloads 142
3427 Testifying in Court as a Victim of Crime for Persons with Little or No Functional Speech: Vocabulary Implications
Authors: Robyn White, Juan Bornman, Ensa Johnson
Abstract:
People with disabilities are at high risk of becoming victims of crime, and individuals with little or no functional speech (LNFS) face an even higher risk. One way of reducing the risk of remaining a victim of crime is to face the alleged perpetrator in court as a witness; it is therefore important for a person with LNFS who has been a victim of crime to have the vocabulary required to testify in court. The aim of this study was to identify and describe the core and fringe legal vocabulary required by illiterate victims of crime who have little or no functional speech to testify in court as witnesses. A mixed-method exploratory sequential design consisting of two distinct phases was used to address the aim of the research. The first phase was qualitative and included two different data sources, namely in-depth semi-structured interviews and focus group discussions. The overall aim of this phase was to identify and describe the core and fringe legal vocabulary and to develop a measurement instrument based on these results. The results from Phase 1 were used in Phase 2, the quantitative phase, during which the measurement instrument (a custom-designed questionnaire) was socially validated. The results produced six distinct vocabulary categories that represent the legal core vocabulary and 99 words that represent the legal fringe vocabulary. The findings suggest that communication boards should be tailored to the individual and the specific crime. It is believed that the vocabulary lists developed in this study act as a valid and reliable springboard from which communication boards can be developed. Recommendations were therefore made to develop an Augmentative and Alternative Communication Resource Tool Kit to assist the legal justice system.
Keywords: augmentative and alternative communication, person with little or no functional speech, sexual crimes, testifying in court, victim of crime, witness competency
Procedia PDF Downloads 480
3426 Performance Measurement by Analytic Hierarchy Process in Performance Based Logistics
Authors: M. Hilmi Ozdemir, Gokhan Ozkan
Abstract:
Performance Based Logistics (PBL) is a strategic approach that enables long-term, win-win relations among stakeholders in acquisition. In contrast to traditional single transactions, in this approach the expected value is created by the performance of the service pertaining to the strategic relationships. PBL motivates all relevant stakeholders to focus on their core competencies to produce the desired outcome in a collective way. The desired outcome can only be assured cost-effectively if it is periodically measured with the right performance parameters; defining these parameters is thus a crucial step for PBL contracts. For performance parameter determination, the Analytic Hierarchy Process (AHP), a multi-criteria decision-making methodology for complex cases, was used in this study for a complex system. AHP has been extensively applied in various areas, including supply chain, inventory management, outsourcing, and logistics. The methodology made it possible to convert the end-user's main operation and maintenance requirements into sub-criteria contained in a single performance parameter. Those requirements were categorized and assigned weights by the relevant stakeholders. A single performance parameter capable of measuring the overall performance of a complex system is the major outcome of this study. The parameter deals with the integrated assessment of different functions, spanning training, operation, maintenance, reporting, and documentation, that are implemented within a complex system. The aim of this study is to show the methodology and processes implemented to identify a single performance parameter for measuring the whole performance of a complex system within a PBL contract. The AHP methodology is recommended as an option for researchers and practitioners who seek a lean and integrated approach to performance assessment within PBL contracts.
The implementation of the AHP methodology in this study may help PBL practitioners from a methodological perspective and add value to AHP in becoming more prevalent.
Keywords: analytic hierarchy process, performance based logistics, performance measurement, performance parameters
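At its core, AHP derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks judgment consistency with the consistency ratio (CR). The sketch below is a generic textbook illustration, not the study's actual hierarchy; the three unnamed criteria and their comparison values are invented for demonstration.

```python
import numpy as np

# Saaty's random consistency index, keyed by matrix size n
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise):
    """Return (weights, consistency_ratio) for a pairwise comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalize weights to sum to 1
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr

# Hypothetical 3-criteria comparison (perfectly consistent: C1 = 2*C2 = 4*C3)
w, cr = ahp_weights([[1, 2, 4],
                     [1 / 2, 1, 2],
                     [1 / 4, 1 / 2, 1]])
```

A CR above roughly 0.10 is conventionally taken as a signal that the stakeholders' judgments should be revisited.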
Procedia PDF Downloads 281
3425 Evaluation of E-Government Service Quality
Authors: Nguyen Manh Hien
Abstract:
Service quality is users' highest requirement, especially for services in electronic government, and during the past decades it has become a major area of academic investigation. Many studies have evaluated its dimensions in e-service contexts. This study also identifies the dimensions of service quality, but it focuses on a new conceptual framework and provides a new methodology for developing measurement scales of e-service quality, covering information quality, service quality and organization quality. Finally, the study suggests key factors for better evaluating e-government service quality.
Keywords: dimensionality, e-government, e-service, e-service quality
Procedia PDF Downloads 541
3424 Secondary Charged Fragments Tracking for On-Line Beam Range Monitoring in Particle Therapy
Authors: G. Traini, G. Battistoni, F. Collamati, E. De Lucia, R. Faccini, C. Mancini-Terracciano, M. Marafini, I. Mattei, S. Muraro, A. Sarti, A. Sciubba, E. Solfaroli Camillocci, M. Toppi, S. M. Valle, C. Voena, V. Patera
Abstract:
In Particle Therapy (PT) treatments, a large amount of secondary particles, whose emission points are correlated with the dose released in the crossed tissues, is produced. Measurement of the secondary charged fragment component could be a valid technique for monitoring the beam range during PT treatments, which is still missing in clinical practice. Sub-millimetre precision on the beam range measurement is required to significantly optimise the technique and improve treatment quality. In this contribution, a detector named the Dose Profiler (DP) is presented. It is specifically designed to monitor the beam range on-line by exploiting the secondary charged particles produced in PT carbon-ion treatments. In particular, the DP tracks the secondary fragments emitted at large angles with respect to the beam direction (mainly protons), with the aim of reconstructing the spatial coordinates of the fragment emission point by extrapolating the measured track toward the beam axis. The DP is currently under development within the INSIDE collaboration (Innovative Solutions for In-beam Dosimetry in hadrontherapy). The tracker is made of six layers (20 × 20 cm²) of BCF-12 square scintillating fibres (500 μm) coupled to Silicon Photo-Multipliers, followed by two plastic scintillator layers of 6 mm thickness. A system of front-end boards based on FPGAs arranged around the detector provides the data acquisition. Detector characterization with cosmic rays is currently under way, and a data-taking campaign with protons will take place in May 2017. The DP design and the performance measured with MIPs and proton beams will be reviewed.
Keywords: fragmentation, monitoring, particle therapy, tracking
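The extrapolation step can be illustrated with a straight-line track fit: fit x(z) and y(z) to the fibre hits and take the point where the fitted track passes closest to the beam axis (taken here as the z-axis). This is a simplified geometric sketch of the idea, not the DP reconstruction software; the hit coordinates below are invented.

```python
import numpy as np

def fit_track(hits):
    """Least-squares straight-line fit x = ax*z + bx, y = ay*z + by to hits (N x 3)."""
    hits = np.asarray(hits, dtype=float)
    z = hits[:, 2]
    design = np.vstack([z, np.ones_like(z)]).T
    (ax, bx), _, _, _ = np.linalg.lstsq(design, hits[:, 0], rcond=None)
    (ay, by), _, _, _ = np.linalg.lstsq(design, hits[:, 1], rcond=None)
    return ax, bx, ay, by

def closest_approach_to_beam_axis(ax, bx, ay, by):
    """z of closest approach to the z-axis, and the miss distance there."""
    # minimize (ax*z + bx)^2 + (ay*z + by)^2 over z
    z0 = -(ax * bx + ay * by) / (ax**2 + ay**2)
    miss = np.hypot(ax * z0 + bx, ay * z0 + by)
    return z0, miss

# Invented hits along a track that crosses the beam axis at z = 5
hits = [(1.0, 0.5, 6.0), (2.0, 1.0, 7.0), (3.0, 1.5, 8.0)]
z_emit, miss = closest_approach_to_beam_axis(*fit_track(hits))
```

The estimated emission point is then the track point at z_emit; a small miss distance is one indicator of fit quality.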
Procedia PDF Downloads 233
3423 Emerging Technologies for Learning: In Need of a Pro-Active Educational Strategy
Authors: Pieter De Vries, Renate Klaassen, Maria Ioannides
Abstract:
This paper presents explorative research into the use of emerging technologies for teaching and learning in higher engineering education. The assumption is that these technologies and applications, which are not yet widely adopted, will help to improve education and thereby help address the skills mismatch troubling our industries. Technologies such as 3D printing, the Internet of Things, Virtual Reality, and others are in a dynamic state of development, which makes it difficult to grasp their value for education. Also, the instruments of current educational research seem inappropriate for assessing the value of such technologies. This explorative research aims to foster an approach that better deals with this new complexity. The need to find out is urgent, because these technologies will be dominantly present in the near future in all aspects of life, including education. The methodology used in this research comprised an inventory of emerging technologies and tools that potentially give way to innovation and are used, or about to be used, in technical universities. The inventory was based on a literature review, a review of reports and web resources such as blogs, and a series of interviews with stakeholders in engineering education and at representative industries. In addition, a number of small experiments were executed with the aim of analyzing the requirements for the use of, in this case, Virtual Reality and the Internet of Things, to better understand the opportunities and limitations in the day-to-day learning environment. The major findings indicate that it is rather difficult to decide on the value of these technologies for education due to their dynamic state of change, the resulting unpredictability, and the lack of a coherent policy at the institutions.
Most decisions are made by teachers on an individual basis, who in their micro-environment are not equipped to select, test and ultimately decide on the use of these technologies. Most experience is being gained in industry, where the skills to handle these technologies are in high demand. Industry, though, is worried about the inclination and capability of education to help bridge the skills gap related to the emergence of new technologies. Due to the complexity, diversity, speed of development and decay, education is challenged to develop an approach that can make these technologies work in an integrated fashion. For education to fully profit from the opportunities these technologies offer, it is essential to develop a pro-active strategy and a sustainable approach to frame emerging technologies development.
Keywords: emerging technologies, internet of things, pro-active strategy, virtual reality
Procedia PDF Downloads 190
3422 Groundwater Investigation Using Resistivity Method and Drilling for Irrigation during the Dry Season in Lwantonde District, Uganda
Authors: Tamale Vincent
Abstract:
Groundwater investigation is the study of underground formations to understand the hydrologic cycle and known groundwater occurrences, and to identify the nature and types of aquifers. Among the different groundwater investigation methods, the surface geophysical method, and especially the geoelectrical resistivity Schlumberger configuration, provides valuable information regarding the lateral and vertical successions of subsurface geomaterials in terms of their individual thicknesses and corresponding resistivity values. Besides the surface geophysical method, hydrogeological and geological investigation methods were also incorporated to aid the preliminary groundwater investigation. A groundwater investigation in Lwantonde District has been implemented. The project area lies in the cattle corridor, and the dry season troubles the communities of Lwantonde District, 99% of whose residents are farmers, making agriculture difficult and hampering the local government's provision of social services. The investigation was done using the geoelectrical resistivity Schlumberger configuration method. The measurement points are located in three sub-counties, with a total of 17 measurement points. The study location is at 00°25'S, 31°10'E and covers an area of 160 square kilometers. Based on the geoelectrical data, two types of aquifer were found: open aquifers at depths ranging from six to twenty-two meters and a confined aquifer at depths ranging from forty-five to eighty meters. In addition to the geoelectrical data, drilling was done with heavy equipment at an accessible point in Lwakagura village, Kabura sub-county. At the drilling point, an artesian well was obtained at a depth of eighty meters, whose water can rise to two meters above the soil surface.
The artesian well is now used by residents to meet clean-water needs and for irrigation, considering that most wells in this area have a high iron content.
Keywords: artesian well, geoelectrical, Lwantonde, Schlumberger
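For a Schlumberger array, the apparent resistivity follows from the geometric factor of the electrode spacing: ρ_a = π (L² − l²) / (2l) · ΔV/I, where L is the current-electrode half-spacing (AB/2) and l the potential-electrode half-spacing (MN/2). The sketch below illustrates this standard formula; the spacings and readings are made-up values, not survey data from Lwantonde.

```python
import math

def schlumberger_apparent_resistivity(L_m, l_m, delta_v, current_a):
    """Apparent resistivity (ohm-m) for a Schlumberger array.

    L_m       : current electrode half-spacing AB/2 (m)
    l_m       : potential electrode half-spacing MN/2 (m)
    delta_v   : measured potential difference (V)
    current_a : injected current (A)
    """
    k = math.pi * (L_m**2 - l_m**2) / (2.0 * l_m)   # geometric factor
    return k * delta_v / current_a

# Made-up sounding reading: AB/2 = 10 m, MN/2 = 1 m, 50 mV at 0.1 A
rho_a = schlumberger_apparent_resistivity(10.0, 1.0, 0.05, 0.1)
```

A vertical electrical sounding repeats this reading at increasing AB/2 to probe greater depths, which is how the layered aquifer picture above is built up.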
Procedia PDF Downloads 124
3421 Detection of Resistive Faults in Medium Voltage Overhead Feeders
Authors: Mubarak Suliman, Mohamed Hassan
Abstract:
Detection of downed conductors occurring with high fault resistance (reaching kilo-ohms) has always been a challenge, especially in countries like Saudi Arabia, where earth resistivity is generally very high (exceeding 1000 Ω-meter). Newer approaches for the detection of resistive and high-impedance faults are based on analysis of the fault current waveform; these methods are still under research and development, and they currently lack security and dependability. The other approach is communication-based solutions, which depend on voltage measurement at the ends of overhead line branches and communicate the measured signals to the substation feeder relay or a central control center. However, such a detection method is costly and depends on the availability of a communication medium and infrastructure. The main objective of this research is to utilize the available standard protection schemes to increase the probability of detecting downed conductors occurring with a low magnitude of fault current, while at the same time avoiding unwanted tripping of healthy feeders. By specifying the operating region of the faulty feeder, using the tripping curve for discrimination between faulty and healthy feeders, and properly selecting the core balance current transformer (CBCT) and voltage transformers with small measurement errors, it is possible to set the pick-up of the sensitive earth fault current to minimum values of a few amps (i.e., pick-up settings = 3 A or 4 A, …) for the detection of earth faults with fault resistance of more than 1 to 2 kΩ in a 13.8 kV overhead network and more than 3 to 4 kΩ in a 33 kV overhead network.
By implementing the outcomes of this study, the probability of detecting downed conductors is increased through the utilization of existing schemes (i.e., directional sensitive earth fault protection).
Keywords: sensitive earth fault, zero sequence current, grounded system, resistive fault detection, healthy feeder
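A back-of-the-envelope check shows why pickups of a few amps are needed: neglecting source, line and grounding impedances (small compared with a kilo-ohm fault resistance), the earth fault current is roughly the phase-to-ground voltage divided by the fault resistance. This simplified calculation is an illustration added here, not the paper's analysis, and it ignores every impedance except the fault resistance.

```python
import math

def earth_fault_current(v_line_line, r_fault_ohm):
    """Rough earth fault current (A): phase-to-ground voltage over fault
    resistance, neglecting source, line and grounding impedances."""
    return v_line_line / math.sqrt(3) / r_fault_ohm

# 2 kOhm downed conductor on a 13.8 kV feeder, 4 kOhm on a 33 kV feeder
i_138 = earth_fault_current(13.8e3, 2000.0)
i_33 = earth_fault_current(33e3, 4000.0)
```

Both cases land near 4-5 A, i.e. just above the 3-4 A sensitive earth fault pickups quoted in the abstract, which is why such low settings are required for these fault resistances.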
Procedia PDF Downloads 115
3420 Corruption, Tax Systems and Inclusive Development
Authors: Lawrence Kwaku Amoako, Parrendah Adwoa Kpeli
Abstract:
This paper analyses the implications of corruption and the tax system for inclusive development. We employ a sample of 45 countries between 2007 and 2020 and test two related hypotheses: first, that corruption hinders the smooth mobilisation of revenue through the tax system; second, that a rise in corruption amidst a defective tax system impairs inclusive development. We expect that a rise in the level of corruption in the economy will distort the tax system, thus affecting efficient revenue mobilisation and, subsequently, inclusive development. By extension, these findings have important policy implications for governments in containing corruption and building an effective tax system, as this will help promote inclusive development.
Keywords: corruption, development, tax systems, tax complexity
Procedia PDF Downloads 113
3419 Mentoring of Health Professionals to Ensure Better Child-Birth and Newborn Care in Bihar, India: An Intervention Study
Authors: Aboli Gore, Aritra Das, Sunil Sonthalia, Tanmay Mahapatra, Sridhar Srikantiah, Hemant Shah
Abstract:
AMANAT is an initiative, taken in collaboration with the Government of Bihar, aimed at improving the quality of maternal and neonatal care services at Bihar's public health facilities, namely those offering either Basic Emergency Obstetric and Neonatal care (BEmONC) or Comprehensive Emergency Obstetric and Neonatal care (CEmONC) services. The effectiveness of the program is evaluated by conducting cross-sectional assessments at the concerned facilities prior to the intervention (baseline) and following its completion (endline). The Direct Observation of Delivery (DOD) methodology is employed for the baseline and endline assessments, through which key obstetric and neonatal care practices among the health care providers (especially the nurses) are assessed quantitatively by specially trained nursing professionals. Assessment of vitals prior to delivery improved during all three phases of BEmONC and all four phases of CEmONC training, with statistically significant improvement noted in: i) pulse measurement in BEmONC phases 2 (9% to 68%), 3 (4% to 57%) and 4 (14% to 59%) and CEmONC phases 2 (7% to 72%) and 3 (0% to 64%); ii) blood pressure measurement in BEmONC phases 2 (27% to 84%), 3 (21% to 76%) and 4 (36% to 71%) and CEmONC phases 2 (23% to 76%) and 3 (2% to 70%); iii) fetal heart rate measurement in BEmONC phases 2 (10% to 72%), 3 (11% to 77%) and 4 (13% to 64%) and CEmONC phases 1 (24% to 38%), 2 (14% to 82%) and 3 (1% to 73%); and iv) abdominal examination in BEmONC phases 2 (14% to 59%), 3 (3% to 59%) and 4 (6% to 56%) and CEmONC phases 1 (0% to 24%), 2 (7% to 62%) and 3 (0% to 62%). Regarding infection control, the wearing of apron, mask and cap by the delivery conductors improved significantly in all BEmONC phases. Similarly, the practice of handwashing improved in all BEmONC and CEmONC phases; even on disaggregation, handwashing showed significant improvement in all phases but CEmONC phase 4.
Not only did positive practices related to handwashing improve, but negative practices, such as turning off the tap with bare hands, also declined significantly in the aforementioned phases. A significant decline was also noted in negative maternal care practices such as the application of fundal pressure to hasten the delivery process and the administration of oxytocin prior to delivery. One of the notable achievements of AMANAT is an improvement in active management of the third stage of labor (AMTSL). Overall AMTSL (including administration of oxytocin or another uterotonic in the proper dose, route and time, along with controlled cord traction and uterine massage) improved in all phases of BEmONC and CEmONC mentoring. Another key area of improvement, across phases, was proper cutting/clamping of the umbilical cord. AMANAT mentoring also led to improvement in important immediate newborn care practices such as initiation of skin-to-skin care and timely initiation of breastfeeding. The next phase of the mentoring program seeks to institutionalize mentoring across the state, which could potentially perpetuate improvement with minimal external intervention.
Keywords: capacity building, nurse-mentoring, quality of care, pregnancy, newborn care
Procedia PDF Downloads 161
3418 Estimation of Mobility Parameters and Threshold Voltage of an Organic Thin Film Transistor Using an Asymmetric Capacitive Test Structure
Authors: Rajesh Agarwal
Abstract:
Carrier mobility at the organic/insulator interface is essential to the performance of organic thin film transistors (OTFT). The present work describes the estimation of field-dependent mobility (FDM) parameters and the threshold voltage of an OTFT using a simple, easy-to-fabricate, two-terminal asymmetric capacitive test structure and admittance measurements. Conventionally, transfer characteristics are used to estimate the threshold voltage of an OTFT with field-independent mobility (FIDM). Yet this technique fails to give accurate results for devices with high contact resistance and field-dependent mobility. In this work, a new technique is presented for the characterization of a long channel organic capacitor (LCOC). The proposed technique helps in the accurate estimation of the mobility enhancement factor (γ), the threshold voltage (V_th) and the band mobility (µ₀) using capacitance-voltage (C-V) measurements on an OTFT. It also removes the need to fabricate short-channel OTFTs or metal-insulator-metal (MIM) structures for C-V measurements. To understand the behavior of the devices and ease the analysis, a transmission-line compact model is developed. A 2-D numerical simulation was carried out to illustrate the correctness of the model. Results show that the proposed technique estimates device parameters accurately even in the presence of contact resistance and field-dependent mobility. Pentacene/poly(4-vinyl phenol)-based top-contact bottom-gate OTFTs were fabricated to illustrate the operation and advantages of the proposed technique. A small signal with frequency varying from 1 kHz to 5 kHz and gate potentials ranging from +40 V to -40 V were applied to the devices for measurement.
Keywords: capacitance, mobility, organic, thin film transistor
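Field-dependent mobility is commonly written as a power law of the gate overdrive, μ = μ₀(V_GS − V_th)^γ, so μ₀ and γ follow from a straight-line fit in log-log coordinates once V_th is known. The sketch below illustrates that common power-law form with invented parameter values; it is not the paper's admittance-based extraction.

```python
import numpy as np

def fit_power_law_mobility(v_gs, mu, v_th):
    """Fit mu = mu0 * (v_gs - v_th)**gamma by linear regression on logarithms."""
    x = np.log(v_gs - v_th)
    y = np.log(mu)
    gamma, ln_mu0 = np.polyfit(x, y, 1)   # slope = gamma, intercept = ln(mu0)
    return np.exp(ln_mu0), gamma

# Synthetic data from assumed parameters: mu0 = 0.02 cm^2/Vs, gamma = 0.3, Vth = 2 V
v_gs = np.array([5.0, 10.0, 20.0, 40.0])
mu = 0.02 * (v_gs - 2.0) ** 0.3
mu0_fit, gamma_fit = fit_power_law_mobility(v_gs, mu, v_th=2.0)
```

With noisy measurements and an unknown V_th, V_th would typically be swept or fitted jointly rather than assumed, which is part of what makes the paper's C-V technique attractive.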
Procedia PDF Downloads 165
3417 Performance Analysis of a Planar Membrane Humidifier for PEM Fuel Cell
Authors: Yu-Hsuan Chang, Jian-Hao Su, Chen-Yu Chen, Wei-Mon Yan
Abstract:
In this work, experimental measurements were used to examine the effects of membrane type and flow field design on the performance of a planar membrane humidifier. The performance indexes used to evaluate the humidifier were the dew point approach temperature (DPAT), water recovery ratio (WRR), water flux (J) and pressure loss (P). The experiments comprised three parts. In the first part, a single-membrane humidifier was tested with different flow fields under different dry-inlet temperatures. The measured results show that the dew point approach temperature decreases with increasing flow channel depth at a fixed channel width, while the WRR and J decrease with an increase in the dry-air inlet temperature. The pressure loss tests indicate that pressure loss decreases with increasing hydraulic diameter of the flow channel, owing to reduced Darcy friction. Based on the comparison of humidifier performance and pressure losses, the flow channel of width W=1 and height H=1.5 was selected as the channel design for the multi-membrane humidifier in the second part of the experiment. In the second part, the multi-membrane humidifier was used to evaluate the humidification performance under different relative humidities and flow rates. The measurements indicate that at lower inlet dry-air temperature and relative humidity, the humidifier has a higher DPAT but lower J and WRR. In addition, the counter-flow approach has better mass and heat transfer performance than the parallel-flow approach. Moreover, the effects of dry-air temperature, relative humidity and humidification approach on the pressure loss of the planar membrane humidifier are not significant.
In the third part, different membranes were tested in order to find out which kind of membrane is most appropriate for the humidifier.
Keywords: water management, planar membrane humidifier, heat and mass transfer, pressure loss, PEM fuel cell
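The dew point approach temperature is commonly defined as the difference between the dew point of the wet-side inlet and that of the dry-side outlet, so a perfect humidifier has DPAT = 0 and a larger DPAT means poorer moisture transfer. The sketch below computes dew points with the Magnus approximation; the coefficients are standard textbook values and the operating points are assumptions, not the paper's data.

```python
import math

# Magnus approximation coefficients for water vapor over liquid water
A, B = 17.62, 243.12  # dimensionless, deg C

def dew_point_c(temp_c, rel_humidity_pct):
    """Dew point (deg C) from temperature and relative humidity (Magnus formula)."""
    alpha = math.log(rel_humidity_pct / 100.0) + A * temp_c / (B + temp_c)
    return B * alpha / (A - alpha)

def dpat(wet_in_t, wet_in_rh, dry_out_t, dry_out_rh):
    """Dew point approach temperature: wet-inlet dew point minus dry-outlet dew point."""
    return dew_point_c(wet_in_t, wet_in_rh) - dew_point_c(dry_out_t, dry_out_rh)

# Ideal transfer: the dry outlet reaches the wet inlet's dew point -> DPAT = 0
ideal = dpat(60.0, 100.0, 60.0, 100.0)
# Imperfect transfer: the dry stream leaves at only 50% RH -> DPAT > 0
imperfect = dpat(60.0, 100.0, 60.0, 50.0)
```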
Procedia PDF Downloads 206
3416 Developing a Toolkit of Undergraduate Nursing Students' Desirable Characteristics (TNDC): An Application of Item Response Theory
Authors: Parinyaporn Thanaboonpuang, Siridej Sujiva, Shotiga Pasiphul
Abstract:
Higher education reform has integrated nursing programmes into the higher education system, and learning outcomes represent one of the essential building blocks for transparency within higher education systems and qualifications. The purpose of this study is to develop a toolkit for assessing undergraduate nursing students' desirable characteristics under the Thai Qualifications Framework for Higher Education and to test the instrument's psychometric properties. The toolkit takes the form of a computer multimedia test. Three skills are examined: cognitive skill, responsibility and interpersonal skill, and information technology skill. The study was conducted in 4 phases. In Phase 1, a measurement model and the computer multimedia test were developed. In Phase 2, two rounds of focus groups were conducted to determine the content validity of the measurement model and the toolkit. In Phase 3, 1,156 senior undergraduate nursing students were recruited through multistage random sampling to test the psychometric properties. In Phase 4, data analysis was conducted using descriptive statistics, item analysis, inter-rater reliability, exploratory factor analysis and confirmatory factor analysis. The resulting TNDC consists of 74 items across the following four domains: cognitive skill, interpersonal skill, responsibility and information technology skill. The values of Cronbach's alpha for the four domains were .781, .807, .831, and .865, respectively. The final model in confirmatory factor analysis fit the empirical data quite well. The TNDC was found to be appropriate, both theoretically and statistically. Given these results, it is recommended that the toolkit be used in future studies for nursing programmes in Thailand.
Keywords: toolkit, nursing students' desirable characteristics, Thai qualifications framework
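Cronbach's alpha, used above for the domain reliabilities, is computed from the item variances and the variance of the total score: α = k/(k−1) · (1 − Σσᵢ²/σ_total²). The sketch below is a generic implementation with made-up toy scores, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)          # per-item sample variances
    total_var = x.sum(axis=1).var(ddof=1)      # sample variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Toy data: perfectly consistent items give alpha = 1
alpha_perfect = cronbach_alpha([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]])
# Partly inconsistent items give a lower alpha
alpha_mixed = cronbach_alpha([[1, 2], [2, 1], [3, 3], [4, 4]])
```

Values of .70 or above, like the four domain alphas reported, are conventionally read as acceptable internal consistency.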
Procedia PDF Downloads 535
3415 Unknown Groundwater Pollution Source Characterization in Contaminated Mine Sites Using Optimal Monitoring Network Design
Authors: H. K. Esfahani, B. Datta
Abstract:
Groundwater is one of the most important natural resources in many parts of the world; however, it is widely polluted due to human activities. Currently, effective and reliable groundwater management and remediation strategies are obtained using characterization of groundwater pollution sources, where the measured data at monitoring locations are utilized to estimate the unknown pollutant source location and magnitude. However, accurately identifying the characteristics of contaminant sources is a challenging task due to uncertainties in predicting source flux injection, hydro-geological and geo-chemical parameters, and the concentration field measurement. Reactive transport of chemical species in contaminated groundwater systems, especially with multiple species, is a complex and highly non-linear geochemical process. Although sufficient concentration measurement data are essential to accurately identify source characteristics, available data are often sparse and limited in quantity. Therefore, this inverse problem-solving method for characterizing unknown groundwater pollution sources is often considered ill-posed, complex and non-unique. Different methods have been utilized to identify pollution sources; however, the linked simulation-optimization approach is one effective method to obtain acceptable results under uncertainties in complex real-life scenarios. With this approach, the numerical flow and contaminant transport simulation models are externally linked to an optimization algorithm, with the objective of minimizing the difference between measured concentrations and estimated pollutant concentrations at observation locations. Concentration measurement data are very important to accurately estimate pollution source properties; therefore, optimal design of the monitoring network is essential to gather adequate measured data at desired times and locations.
Due to budget and physical restrictions, an efficient and effective approach for groundwater pollutant source characterization is to design an optimal monitoring network, especially when only inadequate and arbitrary concentration measurement data are initially available. In this approach, preliminary concentration observation data are utilized for preliminary identification of source location, magnitude and duration of source activity, and these results are utilized for monitoring network design. Further, feedback information from the monitoring network is used as input for sequential monitoring network design, to improve the identification of unknown source characteristics. To design an effective monitoring network of observation wells, optimization and interpolation techniques are used. A simulation model should be utilized to accurately describe the aquifer properties in terms of hydro-geochemical parameters and boundary conditions. However, the simulation of the transport processes becomes complex when the pollutants are chemically reactive. A three-dimensional transient flow and reactive contaminant transport process is considered. The proposed methodology uses HYDROGEOCHEM 5.0 (HGCH) as the simulation model for flow and transport processes with multiple chemically reactive species. Adaptive Simulated Annealing (ASA) is used as the optimization algorithm in the linked simulation-optimization methodology to identify the unknown source characteristics. Therefore, the aim of the present study is to develop a methodology to optimally design an effective monitoring network for pollution source characterization with reactive species in polluted aquifers. The performance of the developed methodology will be evaluated for an illustrative polluted aquifer site, for example, an abandoned mine site in Queensland, Australia.Keywords: monitoring network design, source characterization, chemical reactive transport process, contaminated mine site
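The linked simulation-optimization idea can be sketched in miniature. The snippet below is a toy stand-in, not the authors' HYDROGEOCHEM/ASA setup: a one-line exponential-decay "transport model" replaces the flow and reactive transport simulator, and a basic simulated annealing loop recovers a hypothetical source magnitude and location from well observations. All numbers are invented.

```python
import math
import random

random.seed(7)

WELLS = [1.0, 3.0, 5.0, 8.0]  # hypothetical monitoring-well positions along a transect

def forward(mag, loc):
    """Toy 'transport simulator': concentration decays exponentially with
    distance from the source (a stand-in for a real reactive transport model)."""
    return [mag * math.exp(-0.5 * abs(w - loc)) for w in WELLS]

true_mag, true_loc = 40.0, 4.0          # hypothetical unknown source
observed = forward(true_mag, true_loc)  # synthetic, noise-free observations

def misfit(params):
    """Objective: squared difference between simulated and observed concentrations."""
    return sum((s - o) ** 2 for s, o in zip(forward(*params), observed))

# Basic simulated annealing over (magnitude, location).
state = best = (10.0, 1.0)
temp = 10.0
for _ in range(20000):
    cand = (state[0] + random.uniform(-1, 1), state[1] + random.uniform(-0.5, 0.5))
    delta = misfit(cand) - misfit(state)
    if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-12)):
        state = cand
    if misfit(state) < misfit(best):
        best = state
    temp *= 0.999  # geometric cooling schedule

print(best)  # should land near the true source (40.0, 4.0)
```

In the real methodology the forward call is a full 3D reactive transport run, which is why the number of objective evaluations (and hence the optimizer choice) matters so much.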
Procedia PDF Downloads 231
3414 The Generalized Pareto Distribution as a Model for Sequential Order Statistics
Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian
Abstract:
In this article, sequential order statistics (SOS) samples under type II censoring coming from the generalized Pareto distribution are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.Keywords: Bayesian estimation, generalized Pareto distribution, maximum likelihood estimation, sequential order statistics
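For a flavour of the Monte Carlo side, the sketch below draws a sample from a generalized Pareto distribution by inverse-CDF sampling and recovers the parameters by maximizing the log-likelihood over a crude grid. The true parameter values are hypothetical, and the grid search merely stands in for the proper ML derivation and re-parametrization discussed in the paper.

```python
import math
import random

random.seed(3)

xi_true, sigma_true = 0.3, 2.0  # hypothetical shape and scale

# Inverse-CDF sampling: F^{-1}(u) = sigma * ((1 - u)^(-xi) - 1) / xi for xi != 0.
sample = [sigma_true * ((1 - random.random()) ** (-xi_true) - 1) / xi_true
          for _ in range(500)]

def log_lik(xi, sigma, data):
    """GPD log-likelihood for xi > 0, with a support check."""
    ll = 0.0
    for x in data:
        z = 1 + xi * x / sigma
        if z <= 0 or sigma <= 0:
            return -math.inf
        ll += -math.log(sigma) - (1 / xi + 1) * math.log(z)
    return ll

# Crude grid search standing in for a proper numerical ML optimiser.
grid = ((i / 100, j / 10) for i in range(5, 96, 2) for j in range(5, 41))
xi_hat, sigma_hat = max(grid, key=lambda p: log_lik(p[0], p[1], sample))
print(xi_hat, sigma_hat)  # should be near (0.3, 2.0)
```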
Procedia PDF Downloads 509
3413 A Cooperative Space-Time Transmission Scheme Based on Symbol Combinations
Authors: Keunhong Chae, Seokho Yoon
Abstract:
This paper proposes a cooperative Alamouti space-time transmission scheme with low relay complexity for cooperative communication systems. In the proposed scheme, the source node combines the data symbols to construct the Alamouti-coded form at the destination node, while the conventional scheme performs the corresponding operations at the relay nodes. Simulation results show that the proposed scheme achieves second-order cooperative diversity while maintaining the same bit error rate (BER) performance as the conventional scheme.Keywords: space-time transmission, cooperative communication system, MIMO
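The Alamouti combining that arrives at the destination can be illustrated with a noiseless 2x1 example. The channel gains and symbols below are hypothetical, and the noise and relaying aspects of the proposed scheme are omitted; the point is only the encode/combine algebra.

```python
# Noiseless 2x1 Alamouti sketch: two transmit paths with flat-fading
# gains h1, h2, and two data symbols sent per block.
h1, h2 = complex(0.8, 0.3), complex(-0.5, 0.9)  # hypothetical channel gains
s1, s2 = complex(1, 1), complex(-1, 1)          # QPSK-like data symbols

# Slot 1 carries (s1, s2); slot 2 carries (-s2*, s1*): the Alamouti-coded form.
r1 = h1 * s1 + h2 * s2
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()

# Linear combining recovers each symbol scaled by |h1|^2 + |h2|^2,
# which is the source of the second-order diversity gain.
g = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
print(s1_hat, s2_hat)  # recovers (1+1j) and (-1+1j)
```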
Procedia PDF Downloads 352
3412 Competitiveness and Pricing Policy Assessment for Resilience Surface Access System at Airports
Authors: Dimitrios J. Dimitriou
Abstract:
Air transport worldwide is growing very fast, and many changes have taken place in planning, management, and decision-making processes. Given the complexity of airport operation, the best use of existing capacity is the key driver of efficiency and productivity. This paper deals with an evaluation framework for ground access at airports, using a set of mode-choice indicators that provide key messages on an airport's ground access performance. The application presents results for a sample of 12 European airports, illustrating recommendations for defining policy and improving service in the air transport access chain.Keywords: airport ground access, air transport chain, airport access performance, airport policy
Procedia PDF Downloads 370
3411 Relationship of Indoor and Outdoor Levels of Black Carbon in an Urban Environment
Authors: Daria Pashneva, Julija Pauraite, Agne Minderyte, Vadimas Dudoitis, Lina Davuliene, Kristina Plauskaite, Inga Garbariene, Steigvile Bycenkiene
Abstract:
Black carbon (BC) has received particular attention around the world, not only for its impact on regional and global climate change but also for its impact on air quality and public health. In order to study the relationship between indoor and outdoor BC concentrations, studies were carried out in Vilnius, Lithuania. The studies are aimed at determining the relationship between the concentrations and identifying dependencies during the day and week, with a further opportunity to analyze the key factors affecting the indoor concentration of BC. In this context, continuous real-time indoor and outdoor measurements of optical BC-related light absorption by aerosol particles were carried out during the cold season (from October to December 2020). The measurement venue was an office located in an urban background environment. Equivalent black carbon (eBC) mass concentration was measured by an Aethalometer (Magee Scientific, model AE-31). The optical transmission of carbonaceous aerosol particles was measured sequentially at seven wavelengths (λ = 370, 470, 520, 590, 660, 880, and 950 nm), where the eBC mass concentration was derived from the light absorption coefficient (σab) at the 880 nm wavelength. The diurnal indoor eBC mass concentration was found to vary in the range from 0.02 to 0.08 µgm⁻³, while the outdoor eBC mass concentration varied from 0.34 to 0.99 µgm⁻³. Diurnal variations of outdoor vs. indoor eBC mass concentration showed an increased contribution between 10:00 and 12:00 (GMT+2), with the highest indoor eBC mass concentration of 0.14 µgm⁻³. The indoor/outdoor eBC ratio (I/O) was below one throughout the entire measurement period. Weekend levels of eBC mass concentration were lower than weekday levels by 33% indoors and 28% outdoors. Hourly mean mass concentrations of eBC for weekdays and weekends show diurnal cycles, which could be explained by the periodicity of traffic intensity and heating activities.
The results show a moderate influence of outdoor eBC emissions on the indoor eBC level.Keywords: black carbon, climate change, indoor air quality, I/O ratio
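A minimal sketch of the diurnal indoor/outdoor analysis: hourly means and the I/O ratio computed from invented readings. The numbers below are illustrative, chosen only to resemble the ranges reported above; they are not the measured data.

```python
from collections import defaultdict

# Hypothetical (hour, indoor, outdoor) eBC readings in ug/m3 over two days.
readings = [
    (9, 0.03, 0.40), (10, 0.08, 0.70), (11, 0.14, 0.99), (12, 0.06, 0.60),
    (9, 0.02, 0.34), (10, 0.07, 0.80), (11, 0.12, 0.90), (12, 0.05, 0.55),
]

by_hour = defaultdict(list)
for hour, indoor, outdoor in readings:
    by_hour[hour].append((indoor, outdoor))

io_ratio = {}
for hour in sorted(by_hour):
    mean_in = sum(i for i, _ in by_hour[hour]) / len(by_hour[hour])
    mean_out = sum(o for _, o in by_hour[hour]) / len(by_hour[hour])
    io_ratio[hour] = mean_in / mean_out
    print(f"{hour:02d}:00  indoor={mean_in:.3f}  outdoor={mean_out:.3f}  "
          f"I/O={io_ratio[hour]:.2f}")
```

An I/O ratio below one at every hour, as in the study, indicates that outdoor sources dominate and the building envelope attenuates infiltration.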
Procedia PDF Downloads 199
3410 Scoping Review of Biological Age Measurement Composed of Biomarkers
Authors: Diego Alejandro Espíndola-Fernández, Ana María Posada-Cano, Dagnóvar Aristizábal-Ocampo, Jaime Alberto Gallo-Villegas
Abstract:
Background: With the increase in life expectancy, aging has been a frequent subject of research, and multiple strategies have therefore been proposed to quantify the advance of the years based on the known physiology of human senescence. For several decades, attempts have been made to characterize these changes through the concept of biological age, which aims to integrate, in a measure of time, structural or functional variation through biomarkers in comparison with simple chronological age. The objective of this scoping review is to deepen the updated concept of biological age measurement composed of biomarkers in the general population and to summarize recent evidence to identify gaps and priorities for future research. Methods: A scoping review was conducted according to the five-phase methodology developed by Arksey and O'Malley through a search of five bibliographic databases up to February 2021. Original articles were included, with no time or language limit, that described biological age composed of at least two biomarkers in those over 18 years of age. Results: 674 articles were identified, of which 105 were evaluated for eligibility and 65 were included with information on the measurement of biological age composed of biomarkers. Articles from 1974 onwards, of 15 nationalities, were found, mostly observational studies, in which clinical or paraclinical biomarkers were used, and 11 different methods for the calculation of composite biological age were reported. The outcomes reported were the relationship with the same measured biomarkers, specified risk factors, comorbidities, physical or cognitive functionality, and mortality. Conclusions: The concept of biological age composed of biomarkers has evolved since the 1970s, and multiple methods for its quantification have been described through the combination of different clinical and paraclinical variables from observational studies.
Future research should consider population characteristics and the choice of biomarkers against the proposed outcomes to improve the understanding of aging variables and to direct effective strategies for a proper approach.Keywords: biological age, biological aging, aging, senescence, biomarker
Procedia PDF Downloads 186
3409 Structural Analysis and Modelling in an Evolving Iron Ore Operation
Authors: Sameh Shahin, Nannang Arrys
Abstract:
Optimizing pit slope stability and reducing the strip ratio of a mining operation are two key tasks in geotechnical engineering. With a growing demand for minerals and an increasing cost associated with extraction, companies are constantly re-evaluating the viability of mineral deposits and challenging their geological understanding. Within Rio Tinto Iron Ore, the Structural Geology (SG) team investigates and collects critical data, such as point-based orientations, mapping and geological inferences from adjacent pits, to re-model deposits where previous interpretations have failed to account for structurally controlled slope failures. Utilizing innovative data collection methods and data-driven investigation, SG aims to address the root causes of slope instability. Committing to a resource grid drill campaign as the primary source of data collection will often bias data collection to a specific orientation and significantly reduce the capability to identify and qualify complexity. Consequently, these limitations make it difficult to construct a realistic and coherent structural model that identifies adverse structural domains. Without the consideration of complexity and the capability of capturing these structural domains, mining operations run the risk of inadequately designed slopes that may fail and potentially harm people. Regional structural trends have been considered in conjunction with surface and in-pit mapping data to model multi-batter fold structures that were absent from previous iterations of the structural model. The risk is evident in newly identified dip-slope and rock-mass controlled sectors of the geotechnical design, rather than a ubiquitous dip-slope sector across the pit. The reward is two-fold: 1) providing sectors of rock-mass controlled design in previously interpreted structurally controlled domains and 2) the opportunity to optimize the slope angle for mineral recovery and a reduced strip ratio.
Furthermore, the result is a high-confidence model with structures and geometries that can account for historic slope instabilities in structurally controlled domains where design assumptions failed.Keywords: structural geology, geotechnical design, optimization, slope stability, risk mitigation
Procedia PDF Downloads 46
3408 Adaptation of the Design Thinking Method for Production Planning in the Meat Industry Using Machine Learning Algorithms
Authors: Alica Höpken, Hergen Pargmann
Abstract:
The resource-efficient planning of the complex production planning processes in the meat industry and the reduction of food waste are a permanent challenge. The complexity of the production planning process occurs in every part of the supply chain, from agriculture to the end consumer. It arises from long and uncertain planning phases. Uncertainties such as stochastic yields, fluctuations in demand, and resource variability are part of this process. In the meat industry, waste mainly relates to incorrect storage, technical causes in production, or overproduction. The high amount of food waste along the complex supply chain in the meat industry could not be reduced by simple solutions until now. Therefore, resource-efficient production planning by conventional methods is currently only partially feasible. The realization of intelligent, automated production planning is basically possible through the application of machine learning algorithms, such as those of reinforcement learning. By applying the adapted design thinking method, machine learning methods (especially reinforcement learning algorithms) are used for the complex production planning process in the meat industry. This method represents a concretization for the application area. A resource-efficient production planning process is made available by adapting the design thinking method. In addition, the complex processes can be planned efficiently by using this method, since this standardized approach offers new possibilities to address the complexity and the high time consumption. It represents a tool to support efficient production planning in the meat industry. This paper shows an elegant adaptation of the design thinking method to apply the reinforcement learning method for a resource-efficient production planning process in the meat industry. Subsequently, the steps necessary to introduce machine learning algorithms into the production planning of the food industry are determined.
This is achieved based on a case study which is part of the research project 'REIF - Resource Efficient, Economic and Intelligent Food Chain', supported by the German Federal Ministry for Economic Affairs and Climate Action and the German Aerospace Center. Through this structured approach, significantly better planning results are achieved, which would be too complex or very time-consuming to obtain using conventional methods.Keywords: change management, design thinking method, machine learning, meat industry, reinforcement learning, resource-efficient production planning
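A toy reinforcement-learning sketch of the production planning idea: tabular Q-learning chooses a daily production quantity against stochastic demand, penalizing both waste and shortage. The state space, demand model, and reward weights are invented for illustration and are far simpler than the planning problem described above.

```python
import random

random.seed(1)

# Toy problem: each day choose how many units to produce (0-3) given leftover
# stock (0-3); demand is random (0-3). Reward favours sales, penalizes waste
# (leftover units) and unmet demand. All numbers are invented.
ACTIONS = range(4)
Q = {(s, a): 0.0 for s in range(4) for a in ACTIONS}
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2  # learning rate, discount, exploration

def step(stock, produce):
    demand = random.randrange(4)
    available = min(stock + produce, 3)  # storage capacity of 3 units
    sold = min(available, demand)
    leftover = available - sold
    reward = 2 * sold - leftover - 2 * (demand - sold)
    return leftover, reward

state = 0
for _ in range(20000):
    if random.random() < EPS:
        action = random.choice(list(ACTIONS))
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, r = step(state, action)
    Q[(state, action)] += ALPHA * (r + GAMMA * max(Q[(nxt, b)] for b in ACTIONS)
                                   - Q[(state, action)])
    state = nxt

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(4)}
print(policy)  # learned production quantity for each stock level
```

In the REIF context the state would instead encode yields, orders, and inventories along the chain, and the reward would encode the food-waste and resource-efficiency objectives.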
Procedia PDF Downloads 128
3407 Analyzing the Connection between Productive Structure and Communicable Diseases: An Econometric Panel Study
Authors: Julio Silva, Lia Hasenclever, Gilson G. Silva Jr.
Abstract:
The aim of this paper is to check for possible convergence in health measures (age-standardized rates of morbidity and mortality) for communicable diseases between developed and developing countries, conditional on productive structure features. Understanding the interrelations between health patterns and economic development is particularly important in the context of low- and middle-income countries, where economic development comes along with deep social inequality. Developing countries with less diversified productive structures (measured through the complexity index) but highly heterogeneous inter-sectoral labor productivity (using the inter-sectoral coefficient of variation of labor productivity as a proxy) have, on average, low health levels in communicable diseases compared to developed countries with highly diversified productive structures and low labor market heterogeneity. Structural heterogeneity and productive diversification may influence health levels even after accounting for per capita income. We set up a panel data set for 139 countries from 1995 to 2015, combining country-level data on economic development, health, health system coverage, and environmental and socioeconomic aspects. This information was obtained from the World Bank, the International Labour Organization, the Atlas of Economic Complexity, the United Nations (Development Report), and the Institute for Health Metrics and Evaluation database. Evidence from econometric panel models shows that the level of communicable diseases has a positive relationship with structural heterogeneity, even considering other factors such as per capita income. On the other hand, the recent process of convergence in terms of communicable diseases has been motivated by other factors not directly related to productive structure, such as health system coverage and environmental aspects. This evidence suggests a joint dynamic between the unequal distribution of communicable diseases and aspects of countries' productive structures.
This set of evidence is quite important to public policy aimed at meeting the health targets of the Millennium Development Goals. It also highlights the importance of the process of structural change as fundamental to shifting levels of health in terms of communicable diseases, and it can contribute to the debate on the relation between economic development and changes in health patterns.Keywords: economic development, inequality, population health, structural change
Procedia PDF Downloads 144
3406 Approximate Spring Balancing for Swimming Pool Lift Mechanism to Reduce Actuator Torque
Authors: Apurva Patil, Sujatha Srinivasan
Abstract:
Reducing actuator loads is important for applications in which human effort is required for actuation. The potential benefit of applying spring balancing to rehabilitation devices which work against gravity on a nonhorizontal plane is well recognized, but practical applications have been elusive. Although existing methods provide exact spring balance, they require additional masses or auxiliary links, or all the springs used originate from the ground, which makes the resulting device bulky and space-inefficient. This paper uses a method of static balancing of mechanisms with conservative loads such as gravity and spring loads using non-zero-free-length springs and no auxiliary links. Application of this method to a manually operated swimming pool lift mechanism which lowers and raises the physically challenged users into or out of the swimming pool is presented here. Various possible configurations using extension and compression springs as well as gas spring in the mechanism are compared. This work involves approximate spring balancing of the mechanism using minimization of potential energy variance. It uses the approach of flattening the potential energy distribution over the workspace and fuses it with numerical optimization. The results show the considerable reduction in actuator torque requirement with practical spring design and arrangement. Although the method provides only an approximate balancing, it is versatile, flexible in choosing appropriate control variables that are relevant to the design problem and easy to implement. The true potential of this technique lies in the fact that it uses a very simple optimization to find the spring constant, free length of the spring and the optimal attachment points subject to the optimization constraints. Also, it uses physically realizable non-zero-free-length springs directly, thereby reducing the complexity involved in simulating zero-free-length springs from non-zero-free-length springs. 
This method allows springs to be attached inside the mechanism, which makes the implementation of spring balancing practical. Because auxiliary linkages can be avoided, the resultant swimming pool lift mechanism is compact. The cost benefits and reduced complexity can be significant advantages in the development of this user-actuated swimming pool lift for developing countries.Keywords: gas spring, rehabilitation device, spring balancing, swimming pool lift
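The variance-flattening idea can be sketched with a one-degree-of-freedom lift arm: a grid search over spring stiffness and non-zero free length minimizes the variance of total potential energy across the workspace. The geometry, payload mass, and search ranges below are hypothetical, and a grid search stands in for the numerical optimization described above.

```python
import math
from statistics import pvariance

G, M, L = 9.81, 20.0, 0.6  # gravity, payload mass (kg), arm length (m) - hypothetical
A, B = 0.3, 0.25           # spring anchor height above pivot, attachment radius on arm

THETAS = [math.radians(t) for t in range(-60, 61, 5)]  # workspace of the lift arm

def pe_curve(k, l0):
    """Total potential energy over the workspace: gravity term plus a
    non-zero-free-length spring between frame point (0, A) and the arm."""
    curve = []
    for th in THETAS:
        r = math.sqrt(A * A + B * B - 2 * A * B * math.sin(th))  # spring length
        curve.append(M * G * L * math.sin(th) + 0.5 * k * (r - l0) ** 2)
    return curve

unbalanced = pvariance([M * G * L * math.sin(th) for th in THETAS])

# Grid search: flatten the energy landscape by minimising its variance.
candidates = ((k, l0 / 100) for k in range(500, 2501, 50) for l0 in range(1, 11))
k_best, l0_best = min(candidates, key=lambda p: pvariance(pe_curve(*p)))
ratio = pvariance(pe_curve(k_best, l0_best)) / unbalanced
print(k_best, l0_best, ratio)  # a large reduction in energy variance
```

A flat total potential energy means the actuator no longer works against gravity, which is exactly the torque reduction sought for the manually operated lift.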
Procedia PDF Downloads 241
3405 The Impact of Reducing Road Traffic Speed in London on Noise Levels: A Comparative Study of Field Measurement and Theoretical Calculation
Authors: Jessica Cecchinelli, Amer Ali
Abstract:
The continuing growth in road traffic and the resultant impact on pollution levels and safety, especially in urban areas, have led local and national authorities to reduce traffic speed and flow in major towns and cities. Various boroughs of London have recently reduced the in-city speed limit from 30mph to 20mph, mainly to calm traffic, improve safety, and reduce noise and vibration. This paper reports the detailed field measurements, using a noise sensor and analyser, and the corresponding theoretical calculations and analysis of the noise levels on a number of roads in the central London Borough of Camden, where the speed limit was reduced from 30mph to 20mph on all roads except the major routes of Transport for London (TfL). The measurements, which included the key noise levels and scales at residential streets and main roads, were conducted during normal and rush hours on weekdays and weekends. The theoretical calculations were done according to the UK procedure 'Calculation of Road Traffic Noise 1988', with conversion to the European L-day, L-evening, L-night, L-den and other important levels. The current study also includes comparable data and analysis from previously measured noise in the Borough of Camden and other boroughs of central London. Classified traffic flow and speed on the roads concerned were observed and used in the calculation part of the study. Relevant data and a description of the weather conditions are reported. The paper also reports a field survey, in the form of face-to-face interview questionnaires, carried out in parallel with the field measurement of noise, in order to ascertain the opinions and views of local residents and workers in the reduced-speed 20mph zones. The main findings are that the reduction in speed reduced the noise pollution in the studied zones and that the measured and calculated noise levels for each speed zone are closely matched.
Among the other findings, the field survey showed that local residents and workers in the reduced-speed 20mph zones supported the scheme and felt that it had improved the quality of life in their areas, giving a sense of calmness and safety, particularly for families with children and the elderly, and encouraging pedestrians and cyclists. The key conclusions are that lowering the speed limit in built-up areas would not just reduce the number of serious accidents but would also reduce noise pollution and promote clean modes of transport, particularly walking and cycling. The details of the site observations and the corresponding calculations, together with critical comparative analysis and relevant conclusions, are reported in the full version of the paper.Keywords: noise calculation, noise field measurement, road traffic noise, speed limit in London, survey of people satisfaction
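The European L-den conversion mentioned above combines day, evening, and night levels with +5 dB and +10 dB penalties, energy-averaged over 24 hours. A small sketch follows; the input levels are hypothetical, not the Camden measurements.

```python
import math

def l_den(l_day, l_evening, l_night):
    """Day-evening-night level (EU Directive 2002/49/EC): the evening gets a
    +5 dB penalty and the night a +10 dB penalty, energy-averaged over 24 h
    (12 h day, 4 h evening, 8 h night)."""
    return 10 * math.log10((12 * 10 ** (l_day / 10)
                            + 4 * 10 ** ((l_evening + 5) / 10)
                            + 8 * 10 ** ((l_night + 10) / 10)) / 24)

# Hypothetical facade levels before and after a speed reduction (not the Camden data).
before = l_den(68.0, 64.0, 58.0)
after = l_den(65.5, 61.5, 55.5)
print(round(before, 1), round(after, 1))  # 68.2 65.7
```

Because the penalties are applied inside the energy sum, a uniform 2.5 dB drop across all three periods shifts L-den by exactly 2.5 dB.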
Procedia PDF Downloads 424
3404 Cyclostationary Analysis of Polytime Coded Signals for LPI Radars
Authors: Metuku Shyamsunder, Kakarla Subbarao, P. Prasanna
Abstract:
In radar, an electromagnetic waveform is transmitted, and an echo of the same signal is received by the receiver. From this received signal, by extracting various parameters such as round-trip delay and Doppler frequency, it is possible to find distance, speed, altitude, etc. However, as technology advances, intruders can intercept the transmitted signal as it reaches them, extract its characteristics, and attempt to modify them. So there is a need to develop a system whose signal cannot be identified by non-cooperative intercept receivers. That is why LPI radars came into existence. In this paper, a brief discussion of LPI radar is given, and its modulation (polytime code PT1) and detection using cyclostationary techniques (DFSM and FAM) are presented and compared with respect to computational complexity.Keywords: LPI radar, polytime codes, cyclostationary, DFSM, FAM
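The cyclic autocorrelation at the heart of cyclostationary detection can be sketched directly from its definition. The signal below is a toy periodically modulated chip sequence, not a PT1 waveform, and the full DFSM/FAM machinery is not reproduced; the sketch only shows how a hidden cycle frequency stands out.

```python
import cmath
import math
import random

def cyclic_autocorr(x, alpha, tau=0):
    """R_x^alpha(tau) = (1/N) * sum_n x[n] * conj(x[n + tau]) * exp(-j*2*pi*alpha*n)."""
    n_terms = len(x) - abs(tau)
    return sum(x[n] * x[n + tau].conjugate() * cmath.exp(-2j * math.pi * alpha * n)
               for n in range(n_terms)) / n_terms

# Toy cyclostationary signal: random +/-1 chips under a periodic envelope (period 8).
random.seed(5)
P, N = 8, 4096
x = [complex((1 + math.cos(2 * math.pi * n / P)) * random.choice([-1, 1]))
     for n in range(N)]

# The cycle-frequency profile at tau = 0 peaks at alpha = 1/P, exposing the hidden
# periodicity that a plain power-spectrum average would smear out.
for alpha in (0.0, 1 / P, 0.07):
    print(f"alpha={alpha:.4f}  |R|={abs(cyclic_autocorr(x, alpha)):.3f}")
```

DFSM and FAM are, in essence, fast ways of evaluating this kind of correlation over a whole grid of cycle and spectral frequencies, which is where their complexity difference arises.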
Procedia PDF Downloads 476
3403 The Inclusive Human Trafficking Checklist: A Dialectical Measurement Methodology
Authors: Maria C. Almario, Pam Remer, Jeff Resse, Kathy Moran, Linda Theander Adam
Abstract:
The identification of victims of human trafficking and the consequent service provision are characterized by a significant disconnect between the estimated prevalence of this issue and the number of cases identified. This poses a tremendous problem for human rights advocates, as it prevents data collection, information sharing, allocation of resources, and opportunities for international dialogue. The current paper introduces the Inclusive Human Trafficking Checklist (IHTC) as a measurement methodology with theoretical underpinnings derived from dialectic theory. The presence of human trafficking in a person's life is conceptualized as a dynamic and dialectic interaction between vulnerability and exploitation. The current paper explores the operationalization of exploitation and vulnerability, evaluates the metric qualities of the instrument, evaluates whether there are differences in assessment based on the participant's profession, level of knowledge, and training, and assesses whether users of the instrument perceive it as useful. A total of 201 participants were asked to rate three vignettes predetermined by experts as either qualifying as human trafficking cases or not. The participants were placed in three conditions: business as usual, and utilization of the IHTC with and without training. The results revealed a statistically significant level of agreement between the experts' diagnostic and the application of the IHTC, with an improvement of 40% in identification when compared with the business-as-usual condition. While there was an improvement in identification in the group with training, the difference was found to have a small effect size. Participants who utilized the IHTC showed an increased ability to identify elements of identity-based vulnerabilities as well as elements of fraud, which, according to the results, are distinctive variables in cases of human trafficking.
In terms of perceived utility, the results revealed higher mean scores for the groups utilizing the IHTC when compared to the business-as-usual condition. These findings suggest that the IHTC improves appropriate identification of cases and that it is perceived as a useful instrument. The application of the IHTC as a multidisciplinary instrument that can be utilized in legal and human services settings is discussed as a pivotal piece in helping victims restore their sense of dignity and advocate for legal, physical and psychological reparations. It is noteworthy that this study was conducted with a sample in the United States and later re-tested in Colombia. The implications of the instrument for treatment conceptualization and intervention in human trafficking cases are discussed as opportunities for enhancing victim well-being, restoration engagement and activism. With the idea that what is personal is also political, we believe that careful observation and data collection in specific cases can inform new areas of human rights activism.Keywords: exploitation, human trafficking, measurement, vulnerability, screening
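Agreement between expert diagnosis and screener ratings of this kind is commonly summarized with a chance-corrected statistic. As a hedged aside (the abstract does not specify which statistic was used), here is Cohen's kappa computed on invented binary screening labels.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' nominal labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented screenings for ten vignettes: 1 = identified as trafficking, 0 = not.
expert = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
screener = [1, 1, 0, 1, 0, 1, 1, 0, 0, 0]
print(round(cohens_kappa(expert, screener), 2))  # 0.6
```

Raw percent agreement (here 80%) overstates performance when base rates are uneven, which is why a chance-corrected measure is preferred for validating screening instruments.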
Procedia PDF Downloads 330
3402 Data Transformations in Data Envelopment Analysis
Authors: Mansour Mohammadpour
Abstract:
Data transformation refers to the modification of any point in a data set by a mathematical function. When applying transformations, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into the appropriate form, which can serve various functions in the quantitative analysis of the data. This study addresses the investigation of the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they do fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.Keywords: data transformation, data envelopment analysis, undesirable data, negative data
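Two transformations often used to prepare DEA data can be sketched with invented values: translating a variable containing negative data so all entries are strictly positive, and inverting an undesirable output so that larger becomes smaller. The `margin` parameter is a hypothetical choice, and translation can affect DEA results unless the model is translation-invariant, which is part of the interpretive complexity noted above.

```python
def translate_positive(values, margin=1.0):
    """Shift a variable with negative entries so all values are strictly positive."""
    shift = -min(values) + margin if min(values) <= 0 else 0.0
    return [v + shift for v in values]

def invert_undesirable(values, margin=1.0):
    """Turn an undesirable output (e.g. pollution) into a desirable one."""
    top = max(values) + margin
    return [top - v for v in values]

profits = [-5.0, 0.0, 12.0, 3.5]      # a DEA output containing negative data
emissions = [40.0, 25.0, 60.0, 10.0]  # an undesirable output

print(translate_positive(profits))    # [1.0, 6.0, 18.0, 9.5]
print(invert_undesirable(emissions))  # [21.0, 36.0, 1.0, 51.0]
```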
Procedia PDF Downloads 20
3401 Least Support Orthogonal Matching Pursuit (LS-OMP) Recovery Method for Invisible Watermarking Image
Authors: Israa Sh. Tawfic, Sema Koc Kayhan
Abstract:
In this paper, first, we propose the least support orthogonal matching pursuit (LS-OMP) algorithm to improve the performance of the OMP (orthogonal matching pursuit) algorithm. The LS-OMP algorithm adaptively chooses the optimum L (least part of support) at each iteration. This modification helps to reduce the computational complexity significantly and performs better than the OMP algorithm. Second, we give the procedure for invisible image watermarking in the presence of compressive sampling. The image reconstruction based on a set of watermarked measurements is performed using LS-OMP.Keywords: compressed sensing, orthogonal matching pursuit, restricted isometry property, signal reconstruction, least support orthogonal matching pursuit, watermark
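As background for LS-OMP, here is a pure-Python sketch of plain OMP on a tiny dictionary: greedily pick the atom most correlated with the residual, then re-fit all chosen coefficients by least squares. LS-OMP's adaptive choice of L support atoms per iteration is not reproduced, and the dictionary and signal below are hypothetical.

```python
def solve(A, b):
    """Gauss-Jordan elimination for a small square system (the normal equations)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # partial pivoting
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def omp(D, y, k):
    """Orthogonal matching pursuit: grow the support one atom at a time,
    re-fitting all coefficients on the support at each step."""
    m = len(y)
    support, coef, residual = [], [], y[:]
    for _ in range(k):
        # Atom most correlated with the current residual (unit-norm atoms assumed).
        corr = [abs(sum(D[i][j] * residual[i] for i in range(m)))
                for j in range(len(D[0]))]
        support.append(max((j for j in range(len(D[0])) if j not in support),
                           key=lambda j: corr[j]))
        # Least-squares fit on the chosen support via the normal equations.
        cols = [[D[i][j] for i in range(m)] for j in support]
        G = [[sum(u * v for u, v in zip(c1, c2)) for c2 in cols] for c1 in cols]
        rhs = [sum(c[i] * y[i] for i in range(m)) for c in cols]
        coef = solve(G, rhs)
        residual = [y[i] - sum(coef[t] * cols[t][i] for t in range(len(support)))
                    for i in range(m)]
    return dict(zip(support, coef))

# A 2-sparse signal in a 4x6 dictionary with unit-norm atoms (hypothetical numbers).
D = [[1, 0, 0, 0, 0.5, 0.7071],
     [0, 1, 0, 0, 0.5, 0.7071],
     [0, 0, 1, 0, 0.5, 0.0],
     [0, 0, 0, 1, 0.5, 0.0]]
y = [4, 0, 1, 0]  # = 4 * atom0 + 1 * atom2
print(omp(D, y, 2))  # {0: 4.0, 2: 1.0}
```

LS-OMP's reported gains come from selecting several atoms per iteration, which cuts the number of least-squares refits needed for the same support size.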
Procedia PDF Downloads 338