Search results for: validation indexes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1626

936 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria

Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova

Abstract:

Ambient air pollution with fine particulate matter (PM10) is a persistent, systemic problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allows for their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria for a period of 5 years are used. Predictors in the models are seven meteorological variables, time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days ahead of the last date in the modeling procedure and show very accurate results.
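
For illustration, a minimal sketch of a CART regression on lagged PM10 and meteorological predictors with time-series cross-validation; the file name, column names, and lag choices are assumptions, not the authors' exact setup.

```python
# Minimal sketch: CART regression on lagged PM10 and meteorological predictors.
# The CSV file, column names, and hyperparameters are illustrative assumptions.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

df = pd.read_csv("pleven_daily.csv", parse_dates=["date"])  # hypothetical file

# Build lagged predictors (1- and 2-day delays) as described in the abstract.
for lag in (1, 2):
    df[f"pm10_lag{lag}"] = df["pm10"].shift(lag)
    df[f"temp_lag{lag}"] = df["temperature"].shift(lag)
df = df.dropna()

features = ["temperature", "humidity", "wind_speed", "pressure",
            "pm10_lag1", "pm10_lag2", "temp_lag1", "temp_lag2"]
X, y = df[features], df["pm10"]

tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=20, random_state=0)
scores = cross_val_score(tree, X, y, cv=TimeSeriesSplit(n_splits=5),
                         scoring="neg_root_mean_squared_error")
print("CV RMSE:", -scores.mean())

tree.fit(X, y)
# To forecast one or two days ahead, build a feature row from the most recent
# observed lags and pass it to tree.predict().
```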

Keywords: cross-validation, decision tree, lagged variables, short-term forecasting

Procedia PDF Downloads 194
935 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units

Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz

Abstract:

Sepsis is a syndrome that occurs with physiological and biochemical abnormalities induced by severe infection and carries high mortality and morbidity; therefore, the severity of the patient's condition must be assessed quickly. After patient admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information that is collected from patients into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, the use of machine learning techniques and the data of a population that shares a common characteristic could lead to the development of customized mortality prediction scores with better performance. This study presents the development of a score for the one-year mortality prediction of patients who are admitted to an ICU with a sepsis diagnosis. 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics and clinical information from the first 24 hours after ICU admission were used to develop a mortality prediction score. LASSO (least absolute shrinkage and selection operator) and SGB (stochastic gradient boosting) variable importance methodologies were used to select the set of variables that make up the developed score; each of these variables was dichotomized, and a cut-off point that divides the population into two groups with different mean mortalities was found; if the patient is in the group that presents the higher mortality, a one is assigned to the particular variable, otherwise a zero is assigned. These binary variables are used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by each binary variable and summed. The one-year mortality probability was estimated using the score as the only variable in an LR model. The predictive power of the score was evaluated using the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the results obtained with the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS) and Simplified Acute Physiology Score II (SAPS II) on the same validation subset. Observed and predicted mortality rates within deciles of estimated probability were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; it is also observed that the number of events (deaths) increases as the outcome goes from the decile with the lowest probabilities to the decile with the highest probabilities. Sepsis is a syndrome that carries a high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians to quickly and accurately predict a worse prognosis are needed. This work demonstrates the importance of customization of mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
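
A minimal sketch of the score-construction steps described above (dichotomize selected variables, fit a logistic regression on the binary indicators, round coefficients to integers, calibrate the probability on the total score); the variable names and cut-off points are placeholders, not the study's values.

```python
# Sketch of the scoring procedure: dichotomize variables at cut-off points, fit a
# logistic regression on the binary indicators, round coefficients to integer
# point values, and calibrate mortality probability on the resulting score.
# Variable names, cut-offs, and the input file are placeholders only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

data = pd.read_csv("sepsis_first24h.csv")  # hypothetical extract
cutoffs = {"lactate": 2.0, "age": 65, "creatinine": 1.5}  # assumed cut-off points

# Step 1: dichotomize (1 = group with the higher observed mortality).
X_bin = pd.DataFrame({v: (data[v] > c).astype(int) for v, c in cutoffs.items()})
y = data["one_year_mortality"]

# Step 2: logistic regression on the binary variables; round coefficients to integers.
lr = LogisticRegression().fit(X_bin, y)
points = np.rint(lr.coef_[0]).astype(int)
score = X_bin.values @ points  # total integer score per admission

# Step 3: a second logistic regression with the score as the only covariate
# maps the integer score to a one-year mortality probability.
calib = LogisticRegression().fit(score.reshape(-1, 1), y)
print("Point values:", dict(zip(cutoffs, points)))
print("P(death | score = 3):", calib.predict_proba([[3]])[0, 1])
```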

Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting

Procedia PDF Downloads 222
934 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the Internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have provided easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for the fast sharing and distribution of the data over the internet. The characteristics of the required design for the geodatabase have been studied, and a specific methodology is given for the purpose of designing the Web GIS. Finally, this Web-based geodatabase has been validated with two desktop GIS packages and a web map application, and it is discussed how the contribution provides all the desired modules to expedite further research in the area as per the requirements.
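
A minimal sketch of setting up a PostGIS-enabled table in PostgreSQL from Python; the connection details, table name, and columns are illustrative assumptions and not the EKG schema.

```python
# Sketch: create a PostGIS-enabled table and insert one point feature via psycopg2.
# Connection parameters, table and column names are illustrative only.
import psycopg2

conn = psycopg2.connect(dbname="ekg_geodb", user="gis",
                        password="secret", host="localhost")
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS postgis;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS research_sites (
        id   serial PRIMARY KEY,
        name text NOT NULL,
        geom geometry(Point, 4326)   -- WGS84 point geometry
    );
""")
cur.execute(
    "INSERT INTO research_sites (name, geom) "
    "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326));",
    ("Sample station", -61.5, 10.7),
)
conn.commit()
cur.close()
conn.close()
```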

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 340
933 'Low Electronic Noise' Detector Technology in Computed Tomography

Authors: A. Ikhlef

Abstract:

Image noise in computed tomography is mainly caused by statistical (quantum) noise, electronic system noise, and the reconstruction algorithm and its filters. In recent years, low-dose X-ray imaging has become more and more desirable and is seen as a technically differentiating capability among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. Also, in this study, we show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols at radiation levels as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.
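
As a reference point for the argument, a commonly used decomposition of detector noise, assuming the sources are statistically independent (a sketch, not the authors' specific noise model):

```latex
% Independent noise sources add in quadrature; quantum noise variance grows as
% dose is reduced, so the electronic term matters most at low and ultra-low dose.
\begin{equation}
  \sigma_{\mathrm{total}}^{2} \;=\; \sigma_{\mathrm{quantum}}^{2} + \sigma_{\mathrm{electronic}}^{2},
  \qquad
  \sigma_{\mathrm{quantum}}^{2} \propto \frac{1}{\mathrm{dose}} .
\end{equation}
```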

Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector

Procedia PDF Downloads 126
932 Investigation of Martensitic Transformation Zone at the Crack Tip of NiTi under Mode-I Loading Using Microscopic Image Correlation

Authors: Nima Shafaghi, Gunay Anlaş, C. Can Aydiner

Abstract:

A realistic understanding of the martensitic phase transition under complex stress states is key to accurately describing the mechanical behavior of shape memory alloys (SMAs). Particularly regarding the sharply changing stress fields at the tip of a crack, the size, nature and shape of transformed zones are of great interest. There is significant variation among analytical models in their predictions of the size and shape of the transformation zone. As the fully transformed region remains inside a very small boundary at the tip of the crack, experimental validation requires microscopic resolution. Here, the crack-tip vicinity of a NiTi compact tension specimen has been monitored in situ with microscopic image correlation at 20x magnification. With a nominal grain size of 15 micrometers and an optical resolution of 0.2 micrometers per pixel, the strains at the crack tip are mapped with intra-grain detail. The transformation regions are then deduced using an equivalent strain formulation.
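
The abstract does not spell out the equivalent-strain measure; one common choice with planar DIC strain fields is the von Mises equivalent strain, sketched below (the authors' exact formulation may differ):

```latex
% Von Mises equivalent strain computed from the measured strain tensor
% (a sketch of one common formulation, not necessarily the authors' exact choice).
\begin{equation}
  \varepsilon_{\mathrm{eq}}
  = \sqrt{\tfrac{2}{3}\,\boldsymbol{\varepsilon}' : \boldsymbol{\varepsilon}'}\,,
  \qquad
  \boldsymbol{\varepsilon}' = \boldsymbol{\varepsilon}
    - \tfrac{1}{3}\operatorname{tr}(\boldsymbol{\varepsilon})\,\mathbf{I},
\end{equation}
```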

Keywords: digital image correlation, fracture, martensitic phase transition, mode I, NiTi, transformation zone

Procedia PDF Downloads 353
931 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (general matrix multiplication) or SpMM (sparse matrix multiplication) workloads, whether dense or sparse, and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT (design under test) is expected to. The model gives substantial visibility and debug capability into the DUT by exposing the micro-steps of execution.
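
To illustrate what a native BF-16 generator has to provide, a minimal sketch of the bfloat16 format: it is the upper 16 bits of an IEEE-754 float32 (1 sign, 8 exponent, 7 mantissa bits). Truncation is shown here for brevity; production generators typically round to nearest even, and the function names are illustrative.

```python
# Minimal BF-16 sketch: bfloat16 is the upper half of an IEEE-754 float32.
# Truncation is used here; a production generator would round to nearest even.
import struct

def float32_to_bf16_bits(x: float) -> int:
    """Return the 16-bit bfloat16 pattern obtained by truncating a float32."""
    f32_bits = struct.unpack("<I", struct.pack("<f", x))[0]
    return f32_bits >> 16

def bf16_bits_to_float(bits: int) -> float:
    """Expand a 16-bit bfloat16 pattern back to a float32 value."""
    return struct.unpack("<f", struct.pack("<I", bits << 16))[0]

if __name__ == "__main__":
    for v in (1.0, 3.14159, -0.00123, 65504.0):
        b = float32_to_bf16_bits(v)
        print(f"{v:>12.6g} -> 0x{b:04X} -> {bf16_bits_to_float(b):.6g}")
```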

Keywords: ISA, neural network, Brain Float-16, DUT

Procedia PDF Downloads 94
930 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State

Authors: Hamisu Idi

Abstract:

The present study focuses on assignable causes of variation in the quality of cement production. Exploratory research was done on a monthly basis, where data were obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all the records of the mills' downtime, which the process manager checks for validation and refers any fault to the department responsible for maintenance or measurement so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc. met quality requirements, since all the production processes were found to be in control (within preset specifications), with the exception of natural causes of variation, which are normal in the production process and do not affect the outcome of the product. These are reduced to the barest minimum since they cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. This study therefore contributes to the knowledge in this regard, and it is hoped that it will open up more research in this direction.
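
For illustration, a sketch of the kind of control-chart check that separates assignable from common causes: a Shewhart individuals chart with 3-sigma limits estimated from the moving range. The data file and column name are placeholders, not the Ashaka plant records.

```python
# Illustrative Shewhart individuals (X/mR) control chart with 3-sigma limits.
# The data file and column name are placeholders only.
import numpy as np
import pandas as pd

x = pd.read_csv("monthly_mill_output.csv")["measurement"].to_numpy()

moving_range = np.abs(np.diff(x))
sigma_hat = moving_range.mean() / 1.128   # d2 constant for subgroups of size 2

center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
print("Points signalling assignable causes:", out_of_control.tolist())
```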

Keywords: cement, quality, variation, assignable cause, common cause

Procedia PDF Downloads 261
929 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

Bioassay is the measurement of the potency of a chemical substance by its effect on a living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from and housed in multiple databases. Bioassay outcomes are predicted accordingly to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is partitioned into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are normalized between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness with various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
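
A sketch of the four preprocessing steps wired together with scikit-learn; the input file, bin count, number of selected features, and the final classifier are assumptions for illustration, not the paper's configuration.

```python
# Sketch of the four-step preprocessing: (1) instance selection, (2) discretization,
# (3) normalization to [0, 1], (4) feature selection, followed by a classifier.
# The input file, bin count, and k are illustrative assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("bioassay_descriptors.csv")   # hypothetical descriptor table
X, y = df.drop(columns=["active"]), df["active"]

# Step 1: instance selection -- training / validation / test partition.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

model = Pipeline([
    ("discretize", KBinsDiscretizer(n_bins=10, encode="ordinal")),  # Step 2
    ("normalize", MinMaxScaler()),                                  # Step 3
    ("select", SelectKBest(f_classif, k=50)),                       # Step 4
    ("classify", RandomForestClassifier(random_state=0)),
])
model.fit(X_train, y_train)
print("Validation accuracy:", model.score(X_val, y_val))
```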

Keywords: bioassay, machine learning, preprocessing, virtual screen

Procedia PDF Downloads 274
928 Digital Platform of Crops for Smart Agriculture

Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye

Abstract:

In agriculture, estimating crop yields is key to improving productivity and decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application and mobile application) is operational and validates empirical knowledge of agro-climatic parameters, in addition to providing proactive decision-making support. In the experimental results obtained on the agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for the agricultural data. The proposed applications demonstrate that the approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.
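
For illustration, a minimal sketch of comparing CART, KNN and SVM with cross-validation; the CSV file and feature names are placeholders for the agro-climatic dataset.

```python
# Sketch: compare CART, KNN, and SVM yield regressors via 5-fold cross-validation.
# The CSV file and column names are placeholders only.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

df = pd.read_csv("crop_records.csv")
X = df[["rainfall", "temperature", "humidity", "soil_ph"]]
y = df["yield_t_per_ha"]

models = {
    "CART": DecisionTreeRegressor(random_state=0),
    "KNN": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5)),
    "SVM": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: CV RMSE = {rmse:.2f}")
```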

Keywords: prediction, machine learning, artificial intelligence, digital agriculture

Procedia PDF Downloads 80
927 Technology Maps in Energy Applications Based on Patent Trends: A Case Study

Authors: Juan David Sepulveda

Abstract:

This article reflects the current stage of progress of the project "Determining technological trends in energy generation". Initially, the project was oriented towards identifying those trends by employing tools that the scientometrics community has proved and accepted as effective for obtaining reliable results. Because a documented methodological guide for this purpose could not be found, the decision was made to reorient the scope and aim of the project and the emphasis placed on its objectives. Therefore, it was decided to propose and implement a novel guide built from the elements and techniques found in the available literature. This article begins by explaining the elements and considerations taken into account when implementing and applying this methodology, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped recognize the technological leaders in the field of energy and paved the way for a multivariate analysis of the sample, which allowed for a graphical description of mature technologies as well as the detection of emerging technologies. This article ends with a validation of the methodology as applied to the case of fuel cells.

Keywords: energy, technology mapping, patents, univariate analysis

Procedia PDF Downloads 476
926 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings

Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski

Abstract:

The methodology used to measure the reduction of transmitted impact sound by floor coverings situated on a massive floor is described in ISO 10140-3:2010. To carry out such tests, a standardised reverberation room, separated by a standard floor from a second measuring room, is required. The need for a special laboratory results in high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement provided by floor coverings. This method does not require standard rooms or a standard floor. This paper describes the measurement procedure of the proposed engineering method. Further, verification tests were performed. Validation of the proposed method was based on an analytical model, a Statistical Energy Analysis (SEA) model and empirical measurements. The results were compared with corresponding ones obtained from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.
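
For reference, the quantity targeted by such measurements is the reduction of impact sound pressure level produced by the covering; a sketch of the defining relation (notation may differ slightly from the standard's):

```latex
% Reduction of impact sound pressure level by a floor covering (sketch of the
% defining relation; notation may differ slightly from ISO 10140-3).
\begin{equation}
  \Delta L = L_{n,0} - L_{n},
\end{equation}
% where L_{n,0} is the normalized impact sound pressure level of the bare reference
% floor and L_{n} that of the same floor with the covering installed.
```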

Keywords: building acoustics, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound

Procedia PDF Downloads 324
925 Experimental Study of the Behavior of Elongated Non-spherical Particles in Wall-Bounded Turbulent Flows

Authors: Manuel Alejandro Taborda Ceballos, Martin Sommerfeld

Abstract:

Transport phenomena and the dispersion of non-spherical particles in turbulent flows are found everywhere in industrial applications and processes. Powder handling, pollution control, pneumatic transport and particle separation are just some examples in which the particles encountered are not only spherical. These types of multiphase flows are wall-bounded and mostly highly turbulent. The particles found in these processes are rarely spherical but may have various shapes (e.g., fibers and rods). Although research related to the behavior of regular non-spherical particles in turbulent flows has been carried out for many years, it is still necessary to refine models, especially near walls, where the fiber-wall interaction completely changes their behavior. Imaging-based experimental studies on dispersed particle-laden flows have been applied for many decades for detailed experimental analysis. These techniques have the advantage that they provide field information in two or three dimensions, but they have a lower temporal resolution compared to point-wise techniques such as PDA (phase-Doppler anemometry) and derivations therefrom. The imaging techniques applied in dispersed two-phase flows are extensions of classical PIV (particle image velocimetry) and PTV (particle tracking velocimetry), and the main emphasis is the simultaneous measurement of the velocity fields of both phases. In a similar way, such data should also provide adequate information for validating the proposed models. Available experimental studies on the behavior of non-spherical particles are uncommon and mostly based on planar light-sheet measurements. Especially for elongated non-spherical particles, however, three-dimensional measurements are needed to fully describe their motion and to provide sufficient information for validation of numerical computations. To provide further detailed experimental results allowing a validation of numerical calculations of non-spherical particle dispersion in turbulent flows, a test facility was built around a horizontal closed water channel. Into this horizontal main flow, a small cross-jet laden with fiber-like particles was injected, which was solely driven by gravity. The dispersion of the fibers was measured by applying imaging techniques based on an LED array for backlighting and high-speed cameras. For obtaining the fluid velocity fields, nearly neutrally buoyant tracer particles were used. The discrimination between tracer and fibers was based on image size, which was also the basis for determining fiber orientation with respect to the inertial coordinate system. The synchronous measurement of fluid velocity and fiber properties also allows the collection of statistics of fiber orientation, the velocity fields of tracer and fibers, the angular velocity of the fibers and the orientation between the fiber and the instantaneous relative velocity. Consequently, an experimental study of the behavior of elongated non-spherical particles in wall-bounded turbulent flows was achieved. A comprehensive analysis was developed, especially for the near-wall region, where hydrodynamic wall interaction effects (e.g., collision or lubrication) and abrupt changes of particle rotational velocity exist. This allows the behavior of non-spherical particles to be predicted numerically afterwards within the frame of the Euler/Lagrange approach, where the particles are treated as "point particles".

Keywords: crossflow, non-spherical particles, particle tracking velocimetry, PIV

Procedia PDF Downloads 86
924 Enhancing Organizational Performance through Adaptive Learning: A Case Study of ASML

Authors: Ramin Shadani

Abstract:

This study introduces adaptive performance as a key organizational performance dimension and explores the relationship between the dimensions of a learning organization and adaptive performance. A survey was therefore conducted using the Dimensions of the Learning Organization Questionnaire (DLOQ), followed by factor analysis and structural equation modeling, in order to investigate the dynamics between learning organization practices and adaptive performance. Results confirm that adaptive performance is indeed an important dimension of organizational performance. The study also shows that perceived knowledge and adaptive performance mediate the positive relationship between the practices of a learning organization and perceived financial performance. We extend existing DLOQ research by demonstrating that adaptive performance, as a nonfinancial organizational learning outcome, has a significant impact on financial performance. Our study also provides additional validation of the DLOQ's performance measures. Indeed, organizations need to look at how learning and development activities can provide better overall improvement in performance, especially in enhancing adaptive capability. The study provides the requisite empirical support that learning and development activities within organizations allow much-improved intangible performance outcomes, especially through adaptive performance.

Keywords: adaptive performance, continuous learning, financial performance, leadership style, organizational learning, organizational performance

Procedia PDF Downloads 29
923 Performance Analysis on the Smoke Management System of the Weiwuying Center for the Arts Using Hot Smoke Tests

Authors: K. H. Yang, T. C. Yeh, P. S. Lu, F. C. Yang, T. Y. Wu, W. J. Sung

Abstract:

In this study, a series of full-scale hot smoke tests was conducted to validate the performance of the smoke management system in the Weiwuying (WWY) Center for the Arts before its grand opening. A total of 19 scenarios was established and tested with fire sizes ranging from 2 MW to 10 MW. The measured ASET (available safe egress time) data provided by the smoke management system experiments were compared with the computer-simulated RSET (required safe egress time) values for egress obtained during the design phase. The experimental results indicated that this system could successfully provide a safety margin of 200% and ensure a safe evacuation in case of fire in the WWY project, including worst-case and fail-safe scenarios. The methodology developed and results obtained in this project can provide a useful reference for future applications, such as large-scale indoor sports domes and arenas, stadiums, shopping malls, airport terminals, and stations or tunnels for railway and subway systems.
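
The tenability criterion behind the reported 200% margin can be sketched as the ratio of available to required safe egress time; the project-specific acceptance criteria may be phrased differently.

```latex
% Performance criterion underlying the reported safety margin (a sketch).
\begin{equation}
  \text{Safety margin} \;=\; \frac{\mathrm{ASET}}{\mathrm{RSET}} \times 100\% \;\geq\; 200\%,
\end{equation}
% i.e. the measured available safe egress time is at least twice the computed
% required safe egress time for each fire scenario.
```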

Keywords: building hot smoke tests, performance-based smoke management system designs, full-scale experimental validation, tenable condition criteria

Procedia PDF Downloads 445
922 A Statistical Approach to Rationalise the Number of Working Load Test for Quality Control of Pile Installation in Singapore Jurong Formation

Authors: Nuo Xu, Kok Hun Goh, Jeyatharan Kumarasamy

Abstract:

Pile load testing is significant during foundation construction due to its traditional role of design validation and routine quality control of the piling works. In order to verify whether piles can take loadings at specified settlements, piles have to undergo a working load test, in which the test load should normally be up to 150% of the working load of the pile. Selection or sampling of piles for the working load test is done subject to the number specified in the Singapore National Annex to Eurocode 7, SS EN 1997-1:2010. This paper presents an innovative way to rationalise the number of pile load tests by adopting a statistical analysis approach and examining the coefficient of variation of the pile elastic modulus, using a case study at the Singapore Tuas depot. The results are very promising and show that it is possible to reduce the number of working load tests without compromising the reliability of, and confidence in, the pile quality. Moving forward, it is suggested that more load test data from other geological formations be examined and compared with the findings of this paper.

Keywords: elastic modulus of pile under soil interaction, Jurong Formation, kentledge test, pile load test

Procedia PDF Downloads 384
921 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool utilized by exploration companies to determine potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geo-spatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity will jeopardize business decisions and millions of dollars of investment, something every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within the seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation helps improve the reliability and integrity of the sub-surface geological models produced by geoscientists and provides important input to potential hazard assessment, where positional accuracy is crucial. This workflow development initiative is part of a larger geospatial integrity management effort, noting that nearly eighty percent of oil and gas data are location-dependent.

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 222
920 Performance Investigation of Unmanned Aerial Vehicles Attitude Control Based on Modified PI-D and Nonlinear Dynamic Inversion

Authors: Ebrahim H. Kapeel, Ahmed M. Kamel, Hossam Hendy, Yehia Z. Elhalwagy

Abstract:

Interest in autopilot design has risen sharply as a result of recent advancements in unmanned aerial vehicles (UAVs). Due to the enormous number of applications that UAVs can serve, the number of control theories applied to them has increased in recent years. These small fixed-wing UAVs suffer from high non-linearity, sensitivity to disturbances, and coupling effects between their channels. In this work, a nonlinear dynamic inversion (NDI) control law is designed for a nonlinear small fixed-wing UAV model. NDI is preferable for varied operating conditions, since there is no need for a gain-scheduling controller; moreover, it is applicable at high angles of attack. To validate the designed flight controller, a nonlinear modified PI-D controller is also implemented with our model. A comparative study between both controllers is carried out to evaluate the NDI performance. Simulation results and analysis are presented to illustrate the effectiveness of the designed controller based on NDI.
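
For context, a generic statement of the NDI law for an input-affine model; the UAV-specific dynamics f, g and the choice of desired dynamics are not given in the abstract, so the sketch is schematic.

```latex
% Generic nonlinear dynamic inversion for an input-affine attitude model
% \dot{x} = f(x) + g(x)\,u  (schematic; the UAV-specific f, g are not specified here).
\begin{equation}
  u = g(x)^{-1}\bigl(\nu - f(x)\bigr),
  \qquad
  \nu = \dot{x}_{\mathrm{des}} + K\,(x_{\mathrm{des}} - x),
\end{equation}
% so that the closed loop reduces to the linear error dynamics \dot{e} + K e = 0.
```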

Keywords: attitude control, nonlinear PID, dynamic inversion

Procedia PDF Downloads 111
919 Optimal Scheduling of Trains in Complex National Scale Railway Networks

Authors: Sanat Ramesh, Tarun Dutt, Abhilasha Aswal, Anushka Chandrababu, G. N. Srinivasa Prasanna

Abstract:

Optimal schedule generation for a large national railway network operating thousands of passenger trains over tens of thousands of kilometers of track is a grand computational challenge in itself. We present heuristics based on a mixed integer program (MIP) formulation for local optimization. These methods provide flexibility in scheduling new trains with varying speeds and delays and improve the utilization of infrastructure. We propose methods that provide a robust solution, with hundreds of trains being scheduled over a portion of the railway network without significant increases in delay. We also provide techniques to validate the nominal schedules thus generated against global correlated variations in travel times, thereby enabling us to detect conflicts arising due to delays. Our validation, which assumes only the support of the arrival and departure time distributions, takes on the order of a few minutes for a portion of the network and is computationally efficient enough to handle the entire network.
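
An illustrative building block of such a MIP (not the authors' full formulation): a minimum-headway constraint between two trains sharing a block section, with a binary variable selecting their order.

```latex
% Illustrative headway/ordering constraints for trains i and j entering a shared
% block section: h is the minimum headway, y_{ij} a binary order variable, and M
% a sufficiently large constant (a sketch, not the paper's full formulation).
\begin{align}
  t_j - t_i &\;\geq\; h - M\,(1 - y_{ij}),\\
  t_i - t_j &\;\geq\; h - M\,y_{ij}, \qquad y_{ij} \in \{0,1\}.
\end{align}
```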

Keywords: mixed integer programming, optimization, railway network, train scheduling

Procedia PDF Downloads 158
918 Development of Beeswax-Discharge Writing Material for Visually Impaired Persons

Authors: K. Doi, T. Nishimura, H. Fujimoto, T. Tanaka

Abstract:

It is well known that visually impaired persons have problems in obtaining visual information. Therefore, information accessibility for visually impaired persons is very important in the current information society. Application software with read-aloud functions for personal computers and smartphones is becoming more and more popular among visually impaired persons around the world. On the other hand, it is also very important to be able to learn how to read and write characters such as Braille and visual characters. The Braille typewriter has been widely used in learning Braille, and raised-line drawing kits have been used as writing material for decades, especially for persons with acquired visual impairment. However, there are some drawbacks, such as that the drawn lines cannot be erased. Moreover, the visibility of the drawn lines is not good for visually impaired persons with low vision. We received a significant number of requests to develop a new writing material for persons with acquired visual impairment in particular, to replace raised-line drawing kits. To conduct development research on this novel writing material, we received a research grant from the Ministry of Health, Labour and Welfare of the Japanese government. In this research, we developed pen- and pencil-type writing materials based on beeswax discharge instead of conventional raised-line drawing kits. This writing material is equipped with a cartridge heater for melting the beeswax and a heat controller. When the user holds the pen tip down on regular paper such as fine paper, the melted beeswax is discharged from the pen tip through a valve structure. The beeswax is discharged at a holding-down force of 100 gf, based on the results of our previous trial study. The shape of the pen tip is hemispherical to give low friction between the pen tip and the surface of the paper. We conducted a basic experiment to evaluate the influence of the pen-tip curvature on ease of writing. Specifically, the tip radii tested were 0.15, 0.35, 0.50 and 1.00 mm. The following four interval scales were used as indexes of subjective assessment during writing: feeling of smooth pen motion, feeling of comfortable writing, sense of security, and feeling of writing fatigue. Ten subjects were asked to participate in this experiment. The results reveal that subjects could draw easily when the radius of the pen tip was 1.00 mm, and that lines drawn with the beeswax-discharge writing material were easy to perceive.

Keywords: beeswax-discharge writing material, raised-line drawing kits, visually impaired persons, pen tip

Procedia PDF Downloads 308
917 Software Engineering Inspired Cost Estimation for Process Modelling

Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller

Abstract:

Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. In particular, the model development phase is not covered by any cost estimation method or model. Later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes a method to fill this gap by deriving a cost estimation method from available methods in a similar domain, namely software development or software engineering. As we show, software development is closely similar to process modelling. After the method is proposed, different ideas for its further analysis and validation are presented. We derive this method from COCOMO II and Function Points, which are established methods of effort estimation in the domain of software development. For this, we lay out the similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
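
For reference, a sketch of the COCOMO II post-architecture effort equation that such a derivation would start from; the coefficient values used below are approximate nominal calibrations, and how process-model "size" maps onto KSLOC-like units is exactly the gap the article addresses.

```python
# Sketch of the COCOMO II post-architecture effort equation:
#   Effort [person-months] = A * Size^E * product(EM),  E = B + 0.01 * sum(SF)
# A and B are approximate nominal calibration values; the scale-factor ratings
# below are example values, and nominal effort multipliers equal 1.0.
from math import prod

def cocomo2_effort(size_ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """Return estimated effort in person-months."""
    E = B + 0.01 * sum(scale_factors)
    return A * size_ksloc ** E * prod(effort_multipliers)

# Example: a 20 KSLOC-equivalent model, example scale-factor ratings, nominal multipliers.
print(cocomo2_effort(20, scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                     effort_multipliers=[1.0] * 7))
```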

Keywords: COCOMO II, business process modeling, cost estimation method, BPM COCOMO

Procedia PDF Downloads 440
916 Development and Metrological Validation of a Control Strategy in Embedded Island Grids Using Battery-Hybrid-Systems

Authors: L. Wilkening, G. Ackermann, T. T. Do

Abstract:

This article presents an approach for the stand-alone and grid-connected modes of a German low-voltage grid with a high share of photovoltaics. For this purpose, suitable dynamic system models have been developed. These allow the simulation of dynamic events in very small time ranges as well as operation management over longer periods of time. Using these simulations, suitable control parameters could be identified and their effects on the grid analyzed. In order to validate the simulation results, an LV-grid test bench has been implemented at the Hamburg University of Technology. The developed control strategies are to be validated using real inverters, generators and different realistic loads. It is shown that a battery hybrid system installed next to a voltage transformer makes it possible to operate the LV-grid in stand-alone mode without using additional information and communication technology and without intervention in the existing grid units. By simulating critical days of the year, suitable control parameters for stable stand-alone operation are determined, and set-point specifications for different control strategies are defined.

Keywords: battery, e-mobility, photovoltaic, smart grid

Procedia PDF Downloads 143
915 Analytical Modeling of Drain Current for DNA Biomolecule Detection in Double-Gate Tunnel Field-Effect Transistor Biosensor

Authors: Ashwani Kumar

Abstract:

This study presents an analytical modeling approach for analyzing the drain current behavior in tunnel field-effect transistor (TFET) biosensors used for the detection of DNA biomolecules. The proposed model focuses on elucidating the relationship between the drain current and the presence of DNA biomolecules, taking into account the impact of various device parameters and biomolecule characteristics. Through comprehensive analysis, the model offers insights into the underlying mechanisms governing the sensing performance of TFET biosensors, aiding the optimization of device design and operation. A non-local tunneling model is incorporated alongside other essential models to accurately reproduce the simulated and modeled data. An experimental validation of the model is provided, demonstrating its efficacy in accurately predicting the drain current response to DNA biomolecule detection. The sensitivity obtained from the analytical model is compared and contrasted with ongoing research work in this area.
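
The abstract does not spell out the tunneling expression; a Kane-type band-to-band generation rate of the form below is commonly used in TFET drain-current models, so this is a sketch rather than the paper's exact non-local formulation. A and B are material-dependent parameters, F the (path-averaged) electric field, and E_g the band gap.

```latex
% Kane-type band-to-band tunneling generation rate (sketch; the paper's exact
% non-local formulation may differ).
\begin{equation}
  G_{\mathrm{BTBT}} = A\,\frac{F^{2}}{\sqrt{E_g}}\,
  \exp\!\left(-\,\frac{B\,E_g^{3/2}}{F}\right),
  \qquad
  I_{D} \propto q \int_{\mathrm{device}} G_{\mathrm{BTBT}}\,\mathrm{d}V .
\end{equation}
```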

Keywords: biosensor, double-gate TFET, DNA detection, drain current modeling, sensitivity

Procedia PDF Downloads 57
914 HEXAFLY-INT Project: Design of a High Speed Flight Experiment

Authors: S. Di Benedetto, M. P. Di Donato, A. Rispoli, S. Cardone, J. Riehmer, J. Steelant, L. Vecchione

Abstract:

Thanks to coordinated funding by the European Space Agency (ESA) and the European Commission (EC) within the 7th Framework Programme, the High-Speed Experimental Fly Vehicles – International (HEXAFLY-INT) project is aimed at the flight validation of hypersonic technologies enabling future trans-atmospheric flights. The project, which currently involves partners from Europe, the Russian Federation and Australia operating under ESA/ESTEC coordination, will achieve the goal of designing, manufacturing, assembling and flight testing an unpowered high-speed vehicle in a glider configuration by 2018. The main technical challenges of the project are specifically related to the design of the vehicle gliding configuration and to the complexity of integrating breakthrough technologies with standard aeronautical technologies, e.g., the high-temperature protection system and the airframe cold structures. Also, the sonic boom impact, which is one of the environmental challenges of high-speed flight, will be assessed. This paper provides a comprehensive and detailed update on all the project activities carried out to date on both the vehicle and the mission design.

Keywords: design, flight testing, HEXAFLY-INT, hypersonics

Procedia PDF Downloads 468
913 Estimation of Solar Radiation Power Using Reference Evaluation of Solar Transmittance, 2 Bands Model: Case Study of Semarang, Central Java, Indonesia

Authors: Benedictus Asriparusa

Abstract:

Solar radiation is a green, renewable energy source that has the potential to answer current energy needs. Knowing how to estimate solar radiation power may be one solution for sustainable energy development in an integrated manner. Unfortunately, in a fairly extensive area of Indonesia the availability of solar radiation data is still very low. Therefore, a method is needed to estimate solar radiation accurately. In this study, the author used the Reference Evaluation of Solar Transmittance, 2 Bands (REST2) model. Validation of the REST2 model has been performed in Spain, India, Colorado, Saudi Arabia, and several other areas, but it is not widely used in Indonesia. The Indonesian study area is represented by Semarang, Central Java. Solar radiation values estimated using the REST2 model were then verified against field data, giving an average RMSE of 6.53%. Based on this value, it can be concluded that the REST2 model can be used to estimate solar radiation under clear-sky conditions in parts of Indonesia.

Keywords: estimation, solar radiation power, REST 2, solar transmittance

Procedia PDF Downloads 427
912 Anxiety and Self-Perceived L2 Proficiency: A Comparison of Which Can Better Predict L2 Pronunciation Performance

Authors: Jiexuan Lin, Huiyi Chen

Abstract:

The development of L2 pronunciation competence remains understudied in the literature, and it is not clear what may influence learners' development of L2 pronunciation. The present study was an attempt to find out which of two common factors in L2 acquisition, i.e., foreign language anxiety or self-perceived L2 proficiency, can better predict Chinese EFL learners' pronunciation performance. 78 first-year English majors, who had received a three-month pronunciation training course, were asked to 1) fill out a questionnaire on foreign language classroom anxiety, 2) self-report their L2 proficiency in general, in speaking and in pronunciation, and 3) complete an oral and a written test on their L2 pronunciation (the score of the oral part indicates participants' pronunciation proficiency in oral production, and the score of the written part indexes participants' ability to apply pronunciation knowledge in comprehension). Results showed that the pronunciation scores were negatively correlated with the anxiety scores and positively correlated with self-perceived pronunciation proficiency. But only the written scores in the L2 pronunciation test, not the oral scores, were positively correlated with self-perceived general L2 proficiency. Neither the oral nor the written scores in the L2 pronunciation test had a significant correlation with self-perceived speaking proficiency. Given the fairly strong correlations, the anxiety scores and the self-perceived pronunciation proficiency were put in regression models to predict L2 pronunciation performance. The anxiety factor alone accounted for 13.9% of the variance, and the self-perceived pronunciation proficiency alone explained 12.1% of the variance. But when both anxiety scores and self-perceived pronunciation proficiency were put in a stepwise regression model, only the anxiety scores had a significant and unique contribution to L2 pronunciation performance (4.8%). Taken together, the results suggest that learners' anxiety level can better predict their L2 pronunciation performance, compared with their self-perceived proficiency levels. The obtained data have the following pedagogical implications. 1) Given the fairly strong correlation between anxiety and L2 pronunciation performance, instructors who are interested in predicting learners' L2 pronunciation proficiency may measure their anxiety level, instead of their proficiency, as the predicting variable. 2) The correlation of oral scores (in the pronunciation test) with pronunciation proficiency, rather than with speaking proficiency, indicates that a) learners, after receiving some amount of training, are to some extent able to evaluate their own pronunciation ability, implying the feasibility of incorporating self-evaluation and peer comments in course instruction; and b) the 'proficiency' measure used to predict pronunciation performance should be used with caution: the proficiency of specific skills seemingly highly related to pronunciation (i.e., speaking in this case) may not be taken for granted as an effective predictor of pronunciation performance. 3) The correlation between the written scores and general L2 proficiency is also of interest.

Keywords: anxiety, Chinese EFL learners, L2 pronunciation, self-perceived L2 proficiency

Procedia PDF Downloads 362
911 Self-denigration in Doctoral Defense Sessions: Scale Development and Validation

Authors: Alireza Jalilifar, Nadia Mayahi

Abstract:

The dissertation defense as a complicated conflict-prone context entails the adoption of elegant interactional strategies, one of which is self-denigration. This study aimed to develop and validate a self-denigration model that fits the context of doctoral defense sessions in applied linguistics. Two focus group discussions provided the basis for developing this conceptual model, which assumed 10 functions for self-denigration, namely good manners, modesty, affability, altruism, assertiveness, diffidence, coercive self-deprecation, evasion, diplomacy, and flamboyance. These functions were used to design a 40-item questionnaire on the attitudes of applied linguists concerning self-denigration in defense sessions. The confirmatory factor analysis of the questionnaire indicated the predictive ability of the measurement model. The findings of this study suggest that self-denigration in doctoral defense sessions is the social representation of the participants’ values, ideas and practices adopted as a negotiation strategy and a conflict management policy for the purpose of establishing harmony and maintaining resilience. This study has implications for doctoral students and academics and illuminates further research on self-denigration in other contexts.

Keywords: academic discourse, politeness, self-denigration, grounded theory, dissertation defense

Procedia PDF Downloads 137
910 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data

Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone

Abstract:

This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic factors that are contributing to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularization methods such as ridge, lasso, and elastic net penalties were employed. Cross-validation was performed to select the tuning parameters. The proposed methods can automatically identify relevant disease-count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease data set, the study successfully identified key factors, and the results were consistent with previous studies.
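
A minimal sketch of a regularized Poisson regression with a cross-validated penalty strength; scikit-learn's PoissonRegressor covers the ridge-type (L2) penalty, while the lasso and elastic-net fits in the study would need other tooling (e.g. statsmodels). The covariate names are placeholders for the Virginia county-level predictors.

```python
# Sketch: ridge-type Poisson regression for case counts, with alpha chosen by
# 5-fold cross-validation. Covariate names and the input file are placeholders.
import pandas as pd
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline

df = pd.read_csv("virginia_lyme_counts.csv")
X = df[["forest_cover", "median_income", "deer_density", "mean_temperature"]]
y = df["case_count"]

pipe = Pipeline([("scale", StandardScaler()),
                 ("glm", PoissonRegressor(max_iter=1000))])
search = GridSearchCV(pipe, {"glm__alpha": [0.01, 0.1, 1.0, 10.0]},
                      cv=5, scoring="neg_mean_poisson_deviance")
search.fit(X, y)
print("Best alpha:", search.best_params_["glm__alpha"])
print("Coefficients:", dict(zip(X.columns, search.best_estimator_["glm"].coef_)))
```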

Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression

Procedia PDF Downloads 137
909 Strategic Management Model for High Performance Sports Centers

Authors: Jose Ramon Sanabria Navarro, Yahilina Silveira Perez, Valentin Molina Moreno, Digna Dionisia Perez Bravo

Abstract:

The general objective of this research is to conceive a strategic management model for Latin American high-performance sports centers for the improvement of their results. The sample comprises 62 managers, 187 trainers, 2,930 athletes and 62 expert researchers from centers in Cuba, Venezuela, Ecuador, Colombia and Argentina, for a total of 3,241 respondents. The measurement instrument includes 12 key variables in the process of management strategies, which are consolidated with factor analysis and one-way ANOVA through SPSS 24.0. The reliability of the scale yielded an alpha higher than 0.7 in each sample. In this sense, a model is obtained that addresses the deficiencies detected in the diagnosis, based on the needs of the members of these organizations and considering criteria and theories of strategic management for the improvement of organizational results. The validation of the model for the high-performance sports centers of the countries analyzed aims to develop joint strategies to generate synergies in their mode of operation, which leads to enhancing the sports organizations.

Keywords: sports organization, information management, decision making, control

Procedia PDF Downloads 131
908 Artificial Neural Network-Based Short-Term Load Forecasting for Mymensingh Area of Bangladesh

Authors: S. M. Anowarul Haque, Md. Asiful Islam

Abstract:

Electrical load forecasting is considered to be one of the most indispensable parts of a modern-day electrical power system. To ensure a reliable and efficient supply of electric energy, special emphasis should be put on the predictive capability of the electricity supply. Artificial neural network-based approaches have emerged as a significant area of interest for electric load forecasting research. This paper proposes an artificial neural network model based on the particle swarm optimization algorithm for improved electric load forecasting for Mymensingh, Bangladesh. The forecasting model is developed and simulated in the MATLAB environment with a large number of training datasets. The model is trained on eight input parameters, including historical load and weather data. The predicted load data are then compared with an available dataset for validation. The proposed neural network model proves to be more reliable in terms of day-wise load forecasting for Mymensingh, Bangladesh.
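
A compact illustration of the idea of training a small neural network with particle swarm optimization over its flattened weight vector; the synthetic data, the eight-input network size, and the PSO settings are assumptions for illustration, not the paper's MATLAB model.

```python
# Compact sketch: global-best PSO fitting the weights of a one-hidden-layer network
# for load forecasting. Synthetic data, network size, and PSO settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 8))                             # 8 inputs (load history + weather)
y = X @ rng.uniform(size=8) + 0.1 * np.sin(10 * X[:, 0])   # synthetic target load

n_in, n_hid = 8, 6
dim = n_in * n_hid + n_hid + n_hid + 1      # all weights and biases, flattened

def forward(w, X):
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2, b2 = w[-(n_hid + 1):-1], w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

# Standard global-best PSO over the flattened weight vector.
n_particles, iters = 40, 300
pos = rng.normal(scale=0.5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("Training MSE after PSO:", mse(gbest))
```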

Keywords: load forecasting, artificial neural network, particle swarm optimization

Procedia PDF Downloads 171
907 A Fuzzy Structural Equation Model for Development of a Safety Performance Index Assessment Tool in Construction Sites

Authors: Murat Gunduz, Mustafa Ozdemir

Abstract:

In this research, a framework is proposed to model safety performance on construction sites. Determinants of safety performance are to be defined through an extensive literature review, and a multidimensional safety performance model is to be developed. In this context, a questionnaire is to be administered to construction companies with sites. The data collected through the questionnaires, which include linguistic terms, are then to be defuzzified to obtain crisp numbers by using fuzzy set theory, which provides strong and significant instruments for the measurement of ambiguities and the opportunity to meaningfully represent concepts expressed in natural language. The validity of the proposed safety performance model and the relationships between the determinants of safety performance are to be analyzed using structural equation modeling (SEM), a powerful multivariate analysis technique that makes possible the evaluation of latent structures. After validation of the model, a safety performance index assessment tool is to be proposed with the help of software. The proposed safety performance assessment tool will be based on the empirically validated theoretical model.
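
A minimal sketch of turning linguistic questionnaire answers into crisp numbers with triangular fuzzy numbers and centroid defuzzification; the linguistic scale and the (a, b, c) membership parameters are illustrative assumptions, not the study's calibration.

```python
# Sketch: defuzzify linguistic responses using triangular fuzzy numbers and the
# centroid method. The scale and (a, b, c) parameters are illustrative assumptions.
import numpy as np

# Triangular fuzzy numbers (a, b, c) assigned to each linguistic term.
SCALE = {
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def centroid(a, b, c):
    """Centroid (center of gravity) of a triangular fuzzy number."""
    return (a + b + c) / 3.0

def defuzzify(responses):
    """Map a list of linguistic answers to one crisp score in [0, 1]."""
    return float(np.mean([centroid(*SCALE[r]) for r in responses]))

answers = ["high", "medium", "very high", "high"]   # one respondent, one construct
print("Crisp safety-performance input:", defuzzify(answers))
```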

Keywords: fuzzy set theory, safety performance assessment, safety index, structural equation modeling (SEM), construction sites

Procedia PDF Downloads 522