14047 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources, and systems. Part of it comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations show deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP, and AUP, drawing also on object-relational models, spatial data models, and a baseline of data modeling under UML and Big Data. In this way it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account the processes for information analysis, visualization, and data mining, particularly the generation of patterns and models derived from the structured fact objects.
Keywords: data warehouse, data model, big data, fact object, object-relational fact, data warehouse development process
Procedia PDF Downloads 412
14046 Economic Decision Making under Cognitive Load: The Role of Numeracy and Financial Literacy
Authors: Vânia Costa, Nuno De Sá Teixeira, Ana C. Santos, Eduardo Santos
Abstract:
Financial literacy and numeracy have been regarded as paramount for rational household decision making amid the increasing complexity of financial markets. However, financial decisions are often made under sub-optimal circumstances, including cognitive overload. The present study aims to clarify how financial literacy and numeracy, taken as relevant expert knowledge for financial decision making, modulate possible effects of cognitive load. Participants were required to choose between a sure loss and a gamble pertaining to a financial investment, either with or without a competing memory task. Two experiments were conducted, varying only the content of the competing task. In the first, the financial choice task was performed while maintaining in working memory a list of five random letters. In the second, cognitive load was based upon the retention of six random digits. In both experiments, one of the items in the list had to be recalled given its serial position. Outcomes of the first experiment revealed no significant main effect or interactions involving the cognitive load manipulation and numeracy and financial literacy skills, strongly suggesting that retaining a list of random letters did not interfere with the cognitive abilities required for financial decision making. Conversely, in the second experiment, a significant interaction between the competing memory task and level of financial literacy (but not numeracy) was found for the frequency of choice of the gambling option. Overall, in the control condition, both participants with high financial literacy and those with high numeracy were more prone to choose the gambling option. However, when under cognitive load, participants with high financial literacy were as likely as their less literate counterparts to choose the gambling option.
This outcome is interpreted as evidence that financial literacy prevents intuitive risk-averse reasoning only under highly favourable conditions, as is the case when no other task is competing for cognitive resources. In contrast, participants with higher levels of numeracy were consistently more prone to choose the gambling option in both experimental conditions. These results are discussed in the light of the opposition between classical dual-process theories and fuzzy-trace theories of intuitive decision making, suggesting that while some instances of expertise (such as numeracy) readily support easily accessible gist representations, other expert skills (such as financial literacy) depend upon deliberative processes. It is furthermore suggested that this dissociation between types of expert knowledge might depend on the degree to which they generalize across disparate settings. Finally, applied implications of the present study are discussed with a focus on how they inform financial regulators and on the importance and limits of promoting financial literacy and general numeracy.
Keywords: decision making, cognitive load, financial literacy, numeracy
Procedia PDF Downloads 185
14045 Development of National Guidelines for Conducting Research and Development of Herbal Medicine in Thailand According to International Standards
Authors: Patcharaporn Sudchada, Nuntika Prommee
Abstract:
Background: Herbal medicines constitute a vital component of Thailand's healthcare system and possess significant potential for international recognition. However, the absence of standardized clinical research guidelines aligned with international standards, coupled with unique local challenges, has hindered the development and registration of Thai herbal medicines in the global market. Objective: To establish comprehensive research and development guidelines for herbal medicine formulations that comply with international standards, with particular emphasis on enhancing research quality, scientific credibility, and facilitating both domestic registration and international market acceptance. Methods: The research methodology comprised eight sequential phases: (1) systematic collection and review of relevant documentation and regulatory frameworks; (2) development of preliminary content structure and template designs; (3) systematic analysis and synthesis of scientific evidence and regulatory data; (4) creation of detailed research guidelines and accompanying templates; (5) execution of domestic and international consultation meetings and study visits involving nine stakeholder groups; (6) systematic expert review of the draft guidelines; (7) incorporation of feedback from relevant regulatory and research agencies; and (8) finalization and validation of the comprehensive guidelines. Results: The study produced comprehensive research and development guidelines for herbal medicines that meet international standards, encompassing the complete development pathway from initial concept through pre-clinical studies, product development, preparation protocols, clinical trial conduct, and product registration procedures. The guidelines include standardized templates and forms specifically designed for clinical research documentation. 
Conclusion: The established guidelines represent a significant advancement in standardizing clinical research for Thai herbal medicines, enhancing their scientific credibility and potential for international acceptance. Nevertheless, Thailand continues to face specific challenges, including insufficient specialized personnel in herbal research (particularly in clinical trials), challenges in integrating traditional Thai medicine principles with modern scientific methodology, limited research infrastructure, inadequate funding mechanisms, complex registration procedures, and public skepticism toward herbal products. The policy recommendations outlined in this research provide a strategic framework for addressing these challenges and promoting sustainable development of Thai herbal medicines within the national context.
Keywords: herbal medicine, clinical research, international standards, research guidelines, drug development, traditional Thai medicine, regulatory compliance
Procedia PDF Downloads 8
14044 Chairussyuhur Arman, Totti Tjiptosumirat, Muhammad Gunawan, Mastur, Joko Priyono, Baiq Tri Ratna Erawati
Authors: Maria M. Giannakou, Athanasios K. Ziliaskopoulos
Abstract:
Transmission pipelines carrying natural gas are often routed through populated cities, industrial areas, and environmentally sensitive areas. While the need for these networks is unquestionable, there are serious concerns about the risk these lifeline networks pose to people, to their habitat, and to critical infrastructures, especially in view of natural disasters such as earthquakes. This work presents an Integrated Pipeline Risk Management (IPRM) methodology for assessing the hazard associated with a natural gas pipeline failure due to natural or manmade disasters. IPRM aims to optimize the allocation of the available resources to countermeasures in order to minimize the impacts of a pipeline failure on humans, the environment, the infrastructure, and economic activity. A knapsack mathematical programming formulation is introduced that optimally selects the proper mitigation policies based on the estimated cost-benefit ratios. The proposed model is demonstrated with a small numerical example. The vulnerability analysis of these pipelines and the quantification of consequences from such failures can be useful to natural gas industries in deciding which mitigation measures to implement on existing pipeline networks at minimum cost while keeping the hazard at an acceptable level.
Keywords: cost-benefit analysis, knapsack problem, natural gas distribution network, risk management, risk mitigation
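The knapsack selection of countermeasures can be sketched with a standard dynamic program; the measure names, costs, and benefit scores below are illustrative placeholders, not values from the paper:

```python
def select_mitigations(measures, budget):
    """0/1 knapsack: choose mitigation measures maximizing total
    risk-reduction benefit without exceeding the budget (integer costs)."""
    # best[b] = (benefit, chosen measures) achievable with budget b
    best = [(0.0, []) for _ in range(budget + 1)]
    for name, cost, benefit in measures:
        for b in range(budget, cost - 1, -1):  # reverse: each measure used once
            cand = best[b - cost][0] + benefit
            if cand > best[b][0]:
                best[b] = (cand, best[b - cost][1] + [name])
    return best[budget]

measures = [  # (name, cost, estimated risk-reduction benefit) -- illustrative
    ("pipe-wall reinforcement", 4, 9.0),
    ("automatic shutoff valves", 3, 7.5),
    ("route monitoring sensors", 2, 4.0),
    ("public awareness program", 1, 1.5),
]
benefit, chosen = select_mitigations(measures, budget=6)
```

In practice the benefit scores would come from the vulnerability analysis and consequence quantification the abstract describes.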
Procedia PDF Downloads 296
14043 Method of Synthesis of Controlled Generators Balanced a Strictly Avalanche Criteria-Functions
Authors: Ali Khwaldeh, Nimer Adwan
Abstract:
In this paper, a method for constructing a controlled, balanced Boolean function satisfying the Strict Avalanche Criterion (SAC) is proposed. The proposed method is based on the use of three orthogonal nonlinear components, unlike high-order SAC functions. The generator synthesized by the proposed method therefore has separate sets of control and information inputs. The method is simple and readily implementable, and it allows synthesizing a SAC function generator with fixed control and information inputs. This ensures greater efficiency of the built-in oscillator compared to high-order SAC functions used as generators. Accordingly, the method is completely formalized and implemented as a software product.
Keywords: boolean function, controlled balanced boolean function, strictly avalanche criteria, orthogonal nonlinear components
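The SAC property itself is easy to verify exhaustively for small n: flipping any single input bit must change the output on exactly half of all inputs. A minimal checker follows; the majority-style quadratic function used here is a textbook SAC example, not the paper's construction:

```python
from itertools import product

def satisfies_sac(f, n):
    """Strict Avalanche Criterion: for every input bit i, flipping bit i
    changes f's output on exactly half of the 2^n inputs."""
    half = 2 ** (n - 1)
    for i in range(n):
        flips = 0
        for x in product((0, 1), repeat=n):
            y = list(x)
            y[i] ^= 1
            flips += f(x) != f(tuple(y))
        if flips != half:
            return False
    return True

def is_balanced(f, n):
    """Balanced: f outputs 1 on exactly half of its inputs."""
    return sum(f(x) for x in product((0, 1), repeat=n)) == 2 ** (n - 1)

quad = lambda x: (x[0] & x[1]) ^ (x[1] & x[2]) ^ (x[0] & x[2])  # balanced and SAC
linear = lambda x: x[0] ^ x[1] ^ x[2]  # balanced, but every flip always propagates
```

The linear function fails SAC because each single-bit flip changes the output on all inputs rather than half of them.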
Procedia PDF Downloads 157
14042 A Simple Autonomous Hovering and Operating Control of Multicopter Using Only Web Camera
Authors: Kazuya Sato, Toru Kasahara, Junji Kuroda
Abstract:
In this paper, an autonomous hovering control method for a multicopter using only a Web camera is proposed. Recently, various control methods for the autonomous flight of multicopters have been proposed. In these previously proposed methods, however, a motion capture system (e.g., OptiTrack) or a laser range finder is often used to measure the position and posture of the multicopter. To achieve autonomous flight control of a multicopter with simple equipment, we propose an autonomous flight control method using an AR marker and a Web camera. The AR marker allows the multicopter's position to be measured in three-dimensional Cartesian coordinates, and this position is then linked to the aileron, elevator, and throttle operations. A simple PID control method is applied to each operation, and the controller gains are adjusted. Experimental results are given to show the effectiveness of the proposed method. Moreover, another simple operation method for autonomous multicopter flight control is also proposed.
Keywords: autonomous hovering control, multicopter, Web camera, operation
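The per-axis PID loop described above can be sketched as follows; the gains, the 1 m altitude setpoint, and the toy one-integrator plant are illustrative assumptions, not the paper's tuned values:

```python
class PID:
    """Textbook discrete PID controller, one instance per controlled axis."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Altitude channel: marker-derived height (m) -> throttle correction.
altitude_pid = PID(kp=1.2, ki=0.1, kd=0.4, dt=0.05)
z = 0.8                                   # current height from the AR-marker pose
for _ in range(400):
    thrust = altitude_pid.update(1.0, z)  # hold 1.0 m
    z += 0.5 * thrust * 0.05              # toy plant: thrust nudges altitude
```

In the paper's setup, analogous loops on the aileron and elevator channels would take the marker's x and y coordinates as their measured inputs.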
Procedia PDF Downloads 563
14041 Wireless Battery Charger with Adaptive Rapid-Charging Algorithm
Authors: Byoung-Hee Lee
Abstract:
A wireless battery charger with an adaptive rapid-charging algorithm is proposed. The proposed wireless charger adopts a voltage-regulation technique to reduce the number of power conversion steps. Moreover, based on battery models, an adaptive rapid-charging algorithm for Li-ion batteries is obtained. The rapid-charging performance of the proposed wireless battery charger and rapid-charging algorithm has been experimentally verified, showing more than a 70% reduction in charging time compared to conventional constant-current constant-voltage (CC-CV) charging, without degradation of battery lifetime.
Keywords: wireless, battery charger, adaptive, rapid-charging
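For reference, the CC-CV baseline the authors compare against can be simulated with a toy battery model; the OCV curve, internal resistance, and taper threshold below are illustrative assumptions:

```python
def cc_cv_charge(capacity_ah, i_cc, v_max, r_int=0.05, dt_s=1.0):
    """Toy constant-current / constant-voltage charge: ideal OCV source
    (linear in state of charge) behind an internal resistance."""
    soc = 0.1                               # start at 10% state of charge
    log = []
    while soc < 0.999:
        ocv = 3.0 + 1.2 * soc               # simplistic open-circuit voltage
        i = i_cc                            # CC phase
        if ocv + i * r_int > v_max:         # CV phase: cap terminal voltage
            i = max((v_max - ocv) / r_int, 0.0)
        if i < 0.02 * i_cc:                 # stop at the taper current
            break
        soc += i * dt_s / (capacity_ah * 3600.0)
        log.append((soc, i))
    return log

log = cc_cv_charge(capacity_ah=2.0, i_cc=2.0, v_max=4.2)
```

The long CV taper visible at the end of the log is exactly the phase an adaptive algorithm aims to shorten.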
Procedia PDF Downloads 381
14040 An Optimal Perspective on Research in Translation Studies
Authors: Andrea Musumeci
Abstract:
The general theory of translation has suffered from the lack of a homogeneous academic dialect: a holistic methodology to account for the diversity of factors involved in the discipline. An underlying pattern among theories of translation belonging to different periods and schools has been identified. Such a pattern, which is linguistics-oriented, could play a role toward unified academic and professional environments, both in terms of research and as a professional category. The implementation of this approach has also led to a critique of the concept of equivalence as not the best way to account for translating phenomena.
Keywords: optimal, translating, research translation theory, methodology, descriptive analysis
Procedia PDF Downloads 621
14039 Utilizing the Analytic Hierarchy Process in Improving Performances of Blind Judo
Authors: Hyun Chul Cho, Hyunkyoung Oh, Hyun Yoon, Jooyeon Jin, Jae Won Lee
Abstract:
Identifying, structuring, and ranking the most important factors related to improving athletes’ performances could pave the way for an improved training system. The purpose of this study was to identify the factors most important to improving the performance of judo athletes with visual impairments, including blindness, by using the Analytic Hierarchy Process (AHP). After reviewing the literature, factors affecting performance in blind judo were selected. A group of experts reviewed the first draft of the questionnaires, and the finally selected performance factors were classified into the major categories of technique, physical fitness, and psychology. Later, a pre-selected expert group was asked to review the final version of the questionnaire and confirm the priorities of the performance factors. The order of priority was determined by performing pairwise comparisons using Expert Choice 2000. Results indicated that “grappling” (.303) and “throwing” (.234) were the most important lower-hierarchy factors for blind judo skills. In addition, the most important physical factor affecting performance was “muscular strength and endurance” (.238). Further, among the psychological factors, “competitive anxiety” (.393) was the most important factor affecting performance. It is important to offer psychological skills training to reduce the anxiety of judo athletes with visual impairments and blindness, so they can compete in their optimal states. These findings offer insights into what should be considered when determining factors to improve the performance of judo athletes with visual impairments and blindness.
Keywords: analytic hierarchy process, blind athlete, judo, sport performance
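AHP priority weights like the .303 and .234 reported above come from the principal eigenvector of a pairwise comparison matrix; a minimal sketch using power iteration (the three factors and the Saaty-scale judgments are hypothetical, not the study's matrix):

```python
def ahp_priorities(M, iters=100):
    """Return AHP priority weights: the normalized principal eigenvector
    of a reciprocal pairwise comparison matrix, via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgments over three skill factors (grappling, throwing, pinning):
# grappling is twice as important as throwing and four times as important as pinning.
M = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = ahp_priorities(M)
```

Tools such as Expert Choice also report a consistency ratio over the same matrix; that check is omitted here for brevity.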
Procedia PDF Downloads 219
14038 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games
Authors: Micael Sousa
Abstract:
Board games (BGs) are thriving as new designs emerge from the hobby community and reach broader audiences all around the world. Although digital games gather most of the attention in game studies and serious games research, the post-digital movement helps explain why, in a world dominated by digital technologies, analog experiences remain unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and unique game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established SG frameworks, which treat SGs as fun games implemented to achieve predefined goals, need further development, especially when considering modern BGs. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy to adapt and to grasp by non-expert players in experimental approaches, with the possibility of straightforward adaptation to players’ profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field lacking theoretical development and systematization of experimental practices. Using BGs as case studies promises to help develop these frameworks.
Starting from the Design, Experience, and Play (DPE) framework and considering Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for the adaptation and development of modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done through the systematization of the DPE and CDST approaches applied in two case studies, where two different sequences of adapted BGs were employed to establish a collaborative DT process. The two sessions took place with different participants, in different contexts, and with different sequences of games for the same DT approach. The first session took place at the Faculty of Economics of the University of Coimbra, in a training session on serious games for project development. The second session took place at Casa do Impacto during The Great Village Design Jam light. Both sessions had the same duration and were designed to progressively achieve DT goals, using BGs as SGs in a collaborative process. The results from the sessions show that a sequence of BGs, when properly adapted to the DPET framework, can generate a viable and innovative process of collaborative DT that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions could be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.
Keywords: board games, design thinking, methodology, serious games
Procedia PDF Downloads 114
14037 Bayesian Variable Selection in Quantile Regression with Application to the Health and Retirement Study
Authors: Priya Kedia, Kiranmoy Das
Abstract:
There is a rich literature on variable selection in the regression setting. However, most of these methods assume normality of the response variable for implementing the methodology and establishing the statistical properties of the estimates. In many real applications, the distribution of the response variable may be non-Gaussian, and one might be interested in finding the best subset of covariates at some predetermined quantile level. We develop a dynamic Bayesian approach for variable selection in the quantile regression framework. We use a zero-inflated mixture prior for the regression coefficients and consider the asymmetric Laplace distribution for the response variable to model different quantiles of its distribution. An efficient Gibbs sampler is developed for the computation. Our proposed approach is assessed through extensive simulation studies, and a real application is also illustrated. We consider data from the Health and Retirement Study conducted by the University of Michigan, and select the important predictors when the outcome of interest is out-of-pocket medical cost, an important measure of financial risk. Our analysis finds important predictors at different quantiles of the outcome, and thus enhances our understanding of the effects of different predictors on out-of-pocket medical cost.
Keywords: variable selection, quantile regression, Gibbs sampler, asymmetric Laplace distribution
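Maximizing an asymmetric Laplace likelihood is equivalent to minimizing the pinball (check) loss, which is why that distribution models quantiles. A minimal illustration of the loss with a crude grid-search median fit (a stand-in for the paper's Gibbs sampler; the data are synthetic):

```python
import random

def pinball_loss(beta, data, tau):
    """Check loss: its minimizer over beta is the tau-th conditional quantile
    of y given x under the linear model y = beta0 + beta1 * x."""
    total = 0.0
    for x, y in data:
        r = y - (beta[0] + beta[1] * x)
        total += r * (tau - (1 if r < 0 else 0))
    return total

# Synthetic data around y = 2 + 3x; recover the median (tau = 0.5) line.
random.seed(0)
data = [(x / 50, 2 + 3 * (x / 50) + random.gauss(0, 0.1)) for x in range(200)]
best = min(
    ((b0 / 10, b1 / 10) for b0 in range(41) for b1 in range(41)),
    key=lambda beta: pinball_loss(beta, data, tau=0.5),
)
```

Setting tau to 0.1 or 0.9 would instead recover the lower or upper conditional quantile line of the same data.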
Procedia PDF Downloads 156
14036 A Physically-Based Analytical Model for Reduced Surface Field Laterally Double Diffused MOSFETs
Authors: M. Abouelatta, A. Shaker, M. El-Banna, G. T. Sayah, C. Gontrand, A. Zekry
Abstract:
In this paper, a methodology for physically modeling the intrinsic MOS part and the drift region of the n-channel laterally double-diffused MOSFET (LDMOS) is presented. Basic physical effects such as velocity saturation, mobility reduction, and nonuniform impurity concentration in the channel are taken into consideration. The analytical model is implemented in MATLAB. A comparison between simulations from technology computer-aided design (TCAD) and the proposed analytical model, at room temperature, shows satisfactory accuracy, with discrepancy below 5% over the whole voltage domain.
Keywords: LDMOS, MATLAB, RESURF, modeling, TCAD
Procedia PDF Downloads 200
14035 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to clean and format all the data. First, a basic spelling and grammar check was applied, along with a Python script for normalized formatting, which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review; words frequently mistranscribed were recorded and replaced throughout all other documents. Then, to remove banter and side comments, the transcripts were split into paragraphs (separated by change in speaker), and all paragraphs with fewer than 300 characters were removed. Second, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview. Every proper noun was put into a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph containing any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours' worth of data to 20. The data was further processed through light manual observation: any summaries which fit the criteria of the proposed deliverable were selected, along with their locations within the documents. This narrowed the data down to about 5 hours' worth of processing.
The qualitative researchers were then able to find 8 more connections in addition to the previous 4, exceeding the minimum quota of 3 required to satisfy the grant. Major findings of the study and subsequent curation of this methodology raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence, there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. This methodology proposes a solution to the trade-off. The data is never altered beyond grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Second, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can review the raw data instead of highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
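The splitting and filtering stages can be sketched as below; the speaker-line regex and the capitalized-word heuristic are simplifying assumptions (the study used an actual keyword extractor), and the transcript text is invented:

```python
import re

def clean_paragraphs(transcript):
    """Split a transcript on speaker changes and drop short banter
    (paragraphs under 300 characters), mirroring the filtering step."""
    paras = re.split(r"\n(?=[A-Z][a-z]+:)", transcript)  # "Name:" starts a turn
    return [p.strip() for p in paras if len(p.strip()) >= 300]

def proper_nouns(paragraph):
    """Naive proper-noun proxy: capitalized words not at sentence start.
    A real keyword extractor would replace this heuristic."""
    nouns = set()
    for sentence in re.split(r"[.!?]\s+", paragraph):
        for word in sentence.split()[1:]:
            if word[:1].isupper() and word[1:].islower():
                nouns.add(word.strip(",.;"))
    return nouns

transcript = (
    "Alice: " + "We talked at length about how the Whirlwind project at MIT "
    "shaped interactive computing, and how its display ideas fed into art "
    "installations decades later. " * 3 + "\n"
    "Bob: Thanks, that was great."
)
paras = clean_paragraphs(transcript)
```

Paragraphs surviving the filter and containing extracted nouns would then be handed to the summarization model.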
Procedia PDF Downloads 31
14034 Analysis of Two Methods to Estimation Stochastic Demand in the Vehicle Routing Problem
Authors: Fatemeh Torfi
Abstract:
Estimation of stochastic demand in physical distribution in general, and efficient transport route management in particular, is emerging as a crucial factor in the urban planning domain. It is particularly important in municipalities such as Tehran, where sound demand management calls for a realistic analysis of the routing system. The methodology involved critically investigating a fuzzy least-squares linear regression approach (FLLR) to estimate stochastic demands in the vehicle routing problem (VRP), bearing in mind the customers' preference order. An FLLR method is proposed for solving the VRP with stochastic demands. The approximate-distance fuzzy least-squares (ADFL) estimator is applied to original data taken from a case study. The SSR values of the ADFL estimator and the real demand are obtained and then compared to the SSR values of the nominal demand and the real demand. Empirical results showed that the proposed methods are viable for solving problems with vague and imprecise performance ratings. The results further proved that the ADFL is a realistic and efficient estimator for facing stochastic demand challenges in vehicle routing system management and solving the relevant problems.
Keywords: fuzzy least-squares, stochastic, location, routing problems
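The SSR comparison at the heart of the evaluation can be illustrated with ordinary least squares standing in for the fuzzy estimator (the abstract does not specify its membership functions); the demand figures and the flat nominal forecast are invented:

```python
def fit_least_squares(xs, ys):
    """Closed-form ordinary least-squares line fit: a crisp stand-in for
    the fuzzy least-squares estimator described in the abstract."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    a = my - b * mx
    return a, b

def ssr(xs, ys, a, b):
    """Sum of squared residuals between fitted and observed demand."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Toy demand data: real demand vs a flat "nominal" forecast of 10 units.
xs = [1, 2, 3, 4, 5]
real = [8.1, 9.9, 12.2, 13.8, 16.1]
a, b = fit_least_squares(xs, real)
ssr_fit = ssr(xs, real, a, b)
ssr_nominal = sum((y - 10.0) ** 2 for y in real)
```

The fitted estimator's far smaller SSR against the real demand is the kind of evidence the abstract reports for the ADFL estimator.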
Procedia PDF Downloads 435
14033 Optimal Price Points in Differential Pricing
Authors: Katerina Kormusheva
Abstract:
Pricing plays a pivotal role in the marketing discipline, as it directly influences consumer perceptions, purchase decisions, and the overall market positioning of a product or service. This paper seeks to expand current knowledge in the area of discriminatory and differential pricing, a main area of marketing research. The methodology includes developing a framework and a model for determining how many price points to implement in differential pricing. We focus on choosing the levels of differentiation, derive a functional form of the proposed model framework, and, lastly, test it empirically with data from a large-scale pricing experiment on telecommunications services.
Keywords: marketing, differential pricing, price points, optimization
Procedia PDF Downloads 93
14032 Evaluating the Performance of 28 EU Member Countries on Health2020: A Data Envelopment Analysis Evaluation of the Successful Implementation of Policies
Authors: Elias K. Maragos, Petros E. Maravelakis, Apostolos I. Linardis
Abstract:
Health2020 is a promising framework of policies provided by the World Health Organization (WHO), aiming to diminish health and well-being inequalities among the citizens of European Union (EU) countries. Major demographic, social, and environmental changes, in addition to the recent economic crisis, prevent the unobstructed and successful implementation of this framework. Unemployment rates and the percentage of people at risk of poverty have increased among the citizens of EU countries. At the same time, the adopted fiscal and economic policies do not help governments serve their social role and mitigate social and health inequalities. In these circumstances, there is strong pressure to organize all health system resources efficiently and wisely. In order to provide a unified and value-based framework of valuation, we propose an evaluation framework using data envelopment analysis (DEA) and dynamic DEA. We believe that the adopted methodology provides a robust tool which can capture the degree of success with which policies have been implemented and can determine which countries developed the required policies efficiently and which countries have lagged. Using the proposed methodology, we evaluated the performance of the 28 EU member countries against the Health2020 peripheral targets. We adopted several versions of the evaluation, measuring the effectiveness and efficiency of EU countries from 2011 to 2016. Our results showed stability in technological changes and revealed a group of countries which, in most years, served as benchmarks for the inefficient countries.
Keywords: DEA, Health2020, health inequalities, Malmquist index, policy evaluation, well-being
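In the simplest case of one input and one output, input-oriented CCR efficiency reduces to each unit's productivity ratio relative to the best performer; a toy sketch (the study's multi-indicator model requires solving a linear program per country, and the figures below are invented):

```python
def ccr_efficiency_single(inputs, outputs):
    """Input-oriented CCR efficiency for one input and one output:
    each decision-making unit's output/input ratio relative to the best.
    With multiple indicators this becomes one linear program per unit."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Illustrative data: health spending (input) vs a well-being score (output)
spending = [5.0, 8.0, 6.0, 10.0]
wellbeing = [70.0, 96.0, 90.0, 100.0]
eff = ccr_efficiency_single(spending, wellbeing)
```

A score of 1.0 marks a benchmark unit; scores below 1.0 show how far the other units fall from the efficient frontier.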
Procedia PDF Downloads 144
14031 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data
Authors: Benjamin Leiby, Darryl Ahner
Abstract:
This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the need to impute missing values at rates well above the common 20% missingness threshold. The methodology capitalizes on correlation while using model residuals to represent the uncertainty in estimating unknown values. Examination of the methodology provides insight into choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and a comparison of known values to the replaced values created through imputation. Overall, the country conflict dataset shows promise when modeling first-order interactions, while indicating a need for further refinement that mimics predictive mean matching.
Keywords: correlation, country conflict, imputation, stochastic regression
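Stochastic regression imputation of this kind can be sketched in a few lines: fit on the complete cases, predict the missing value, and add a randomly drawn observed residual so imputed values carry the model's uncertainty. The toy dataset is invented:

```python
import random

def stochastic_impute(pairs):
    """Stochastic regression imputation: fit y ~ x on complete cases, then
    fill each missing y with the prediction plus a randomly drawn observed
    residual, preserving the spread of the data."""
    complete = [(x, y) for x, y in pairs if y is not None]
    n = len(complete)
    mx = sum(x for x, _ in complete) / n
    my = sum(y for _, y in complete) / n
    b = sum((x - mx) * (y - my) for x, y in complete) / sum(
        (x - mx) ** 2 for x, _ in complete
    )
    a = my - b * mx
    residuals = [y - (a + b * x) for x, y in complete]
    filled = []
    for x, y in pairs:
        if y is None:
            y = a + b * x + random.choice(residuals)  # draw keeps the noise
        filled.append((x, y))
    return filled

random.seed(1)
data = [(1, 2.1), (2, 3.9), (3, None), (4, 8.1), (5, 9.9)]
filled = stochastic_impute(data)
```

Deterministic imputation would always fill the predicted mean; the residual draw is what distinguishes the stochastic variant the study relies on.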
Procedia PDF Downloads 120
14030 CDM-Based Controller Design for High-Frequency Induction Heating System with LLC Tank
Authors: M. Helaimi, R. Taleb, D. Benyoucef, B. Belmadani
Abstract:
This paper presents the design of a polynomial controller using the coefficient diagram method (CDM). The controller regulates the output power of a high-frequency resonant inverter with an LLC tank. One of the most important problems associated with the proposed inverter is achieving ZVS operation during the induction heating process. To overcome this problem, an asymmetrical voltage cancellation (AVC) control technique is proposed. A phase-locked loop (PLL) is used to track the natural frequency of the system. The small-signal model of the system with the proposed control is obtained using the extended describing function method (EDM). The validity of the proposed control is verified by simulation results.
Keywords: induction heating, AVC control, CDM, PLL, resonant inverter
Procedia PDF Downloads 664
14029 Developing Primary Care Datasets for a National Asthma Audit
Authors: Rachael Andrews, Viktoria McMillan, Shuaib Nasser, Christopher M. Roberts
Abstract:
Background and objective: The National Review of Asthma Deaths (NRAD) found that asthma management and care was inadequate in 26% of the cases reviewed. Major shortfalls were identified in adherence to national guidelines and standards and, particularly, in the organisation of care, including supervision and monitoring in primary care, with 70% of cases reviewed having at least one avoidable factor in this area. 5.4 million people in the UK are diagnosed with and actively treated for asthma, and approximately 60,000 are admitted to hospital with acute exacerbations each year. The majority of people with asthma receive management and treatment solely in primary care. This has created concern that many people within the UK are receiving sub-optimal asthma care, resulting in unnecessary morbidity and risk of adverse outcomes. NRAD concluded that a national asthma audit programme should be established to measure and improve the processes, organisation, and outcomes of asthma care. Objective: To develop a primary care dataset enabling extraction of information from GP practices in Wales and providing robust data from which results and lessons could be drawn to drive service development and improvement. Methods: A multidisciplinary group of experts, including general practitioners, primary care organisation representatives, and asthma patients, was formed and used as a source of governance and guidance. A review of asthma literature, guidance, and standards took place and was used to identify areas of asthma care which, if improved, would lead to better patient outcomes. Modified Delphi methodology was used to gain consensus from the expert group on which of the areas identified were to be prioritised, and an asthma patient and carer focus group was held to seek views and feedback on areas of asthma care that were important to them.
Areas of asthma care identified by both groups were mapped to asthma guidelines and standards to inform and develop primary and secondary care datasets covering both adult and paediatric care. Dataset development consisted of expert review and a targeted consultation process to seek broad stakeholder views and feedback. Results: The areas of asthma care identified as requiring prioritisation by the National Asthma Audit were: (i) prescribing, (ii) asthma diagnosis, (iii) asthma reviews, (iv) Personalised Asthma Action Plans (PAAPs), and (v) primary care follow-up after discharge from hospital. Methodologies and primary care queries were developed to cover each of the areas of poor and variable asthma care identified, with the queries designed to extract information directly from patients’ electronic records. Conclusion: This paper describes the methodological approach followed to develop primary care datasets for a National Asthma Audit. It sets out the principles behind the establishment of a National Asthma Audit programme in response to a national asthma mortality review and describes the development activities undertaken. Key process elements included: (i) mapping identified areas of poor and variable asthma care to national guidelines and standards, (ii) early engagement of experts, including clinicians and patients, in the process, and (iii) targeted consultation on the queries to provide further insight into measures that were collectable, reproducible and relevant.
Keywords: asthma, primary care, general practice, dataset development
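The modified Delphi prioritisation described above can be sketched as a simple consensus-scoring routine. This is a hypothetical illustration only: the candidate area names, the 5-point rating scale, and the 75% agreement threshold are assumptions, not values taken from the audit.

```python
# Hypothetical sketch of one modified-Delphi prioritisation round:
# each expert rates each candidate area of asthma care on a 1-5
# Likert scale, and areas reaching a consensus threshold are retained.

def delphi_round(ratings, threshold=0.75, cutoff=4):
    """ratings: {area: [1-5 scores, one per expert]}.
    Retain areas where the proportion of experts scoring >= cutoff
    meets or exceeds the consensus threshold."""
    agreement = {
        area: sum(s >= cutoff for s in scores) / len(scores)
        for area, scores in ratings.items()
    }
    return {a: g for a, g in agreement.items() if g >= threshold}

ratings = {
    "Prescribing": [5, 5, 4, 5, 3],        # 4/5 experts agree -> 0.8
    "Asthma diagnosis": [4, 5, 4, 4, 5],   # 5/5 experts agree -> 1.0
    "Inhaler colour": [2, 3, 1, 2, 2],     # no consensus -> dropped
}
print(delphi_round(ratings))  # -> {'Prescribing': 0.8, 'Asthma diagnosis': 1.0}
```

In a full Delphi process, the agreement scores would be fed back to the panel and the rating repeated over further rounds until the retained list stabilises.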
Procedia PDF Downloads 176
14028 A Virtual Reality Cybersecurity Training Knowledge-Based Ontology
Authors: Shaila Rana, Wasim Alhamdani
Abstract:
Effective cybersecurity learning relies on engaging, interactive, and entertaining activities that foster positive learning outcomes, and VR cybersecurity training may promote these variables. However, to the authors’ best knowledge, a methodological approach and framework have not yet been created to allow trainers and educators to employ VR cybersecurity training methods that promote positive learning outcomes. Thus, this paper aims to create an approach that cybersecurity trainers can follow to create a VR cybersecurity training module. The methodology draws on concepts from other cybersecurity training frameworks, such as NICE and CyTrONE; however, existing frameworks do not incorporate the use of VR, and VR training poses unique challenges that they cannot address. Accordingly, the proposed ontology incorporates concepts unique to developing VR training to form a relevant methodology for creating VR cybersecurity training modules. The outcome of this research is a methodology that is relevant and useful for designing VR cybersecurity training modules.
Keywords: virtual reality cybersecurity training, VR cybersecurity training, traditional cybersecurity training, ontology
Procedia PDF Downloads 289
14027 Robust Single/Multi bit Memristor Based Memory
Authors: Ahmed Emara, Maged Ghoneima, Mohamed Dessouky
Abstract:
Demand for fast, low-power memories is increasing with the growth in IC complexity. In this paper, we introduce a proposal for a compact SRAM based on memristor devices. The compact size of the proposed cell (1T2M, compared to 6T in traditional SRAMs) allows denser memories on the same area. We discuss the proposed memristor memory cell for single- and multi-bit data storage configurations, along with the writing and reading operations. Stored-data stability across successive read operations is illustrated, and operational simulation results and a comparison of our proposed design with conventional SRAM and previously proposed memristor cells are provided.
Keywords: memristor, multi-bit, single-bit, circuits, systems
Procedia PDF Downloads 374
14026 Multi-Scale Damage Modelling for Microstructure Dependent Short Fiber Reinforced Composite Structure Design
Authors: Joseph Fitoussi, Mohammadali Shirinbayan, Abbas Tcharkhtchi
Abstract:
Due to material flow during processing, short fiber reinforced composite (SFRC) structures obtained by injection or compression molding generally present strong spatial microstructure variation. On the other hand, the quasi-static, dynamic, and fatigue behavior of these materials is highly dependent on microstructure parameters such as the fiber orientation distribution. Indeed, because of complex damage mechanisms, SFRC structure design is a key challenge for safety and reliability. In this paper, we propose a micromechanical model allowing prediction of the damage behavior of real structures as a function of the spatial microstructure distribution. To this aim, a statistical damage criterion including strain rate and fatigue effects at the local scale is introduced into a Mori-Tanaka model. A critical local damage state is identified, allowing fatigue life prediction. Moreover, the multi-scale model is coupled with an experimentally established link between damage under monotonic loading and fatigue life in order to build an abacus giving Tsai-Wu failure criterion parameters as a function of microstructure and targeted fatigue life. On the other hand, the micromechanical damage model gives access to the evolution of the anisotropic stiffness tensor of SFRCs submitted to complex thermomechanical loading, including quasi-static, dynamic, and cyclic loading with temperature and amplitude variations. The latter is then used to fill out microstructure-dependent material cards in finite element analysis for design optimization in the case of complex loading histories. The proposed methodology is illustrated in the case of a real automotive component made of sheet molding compound (the PSA 3008 tailgate).
The obtained results emphasize how the proposed micromechanical methodology opens a new path for the automotive industry to lighten vehicle bodies and thereby save energy and reduce gas emissions.
Keywords: short fiber reinforced composite, structural design, damage, micromechanical modelling, fatigue, strain rate effect
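As a pointer to the homogenisation step mentioned above, one standard form of the Mori-Tanaka estimate of the effective stiffness of a two-phase composite reads as follows. The notation is ours, not the paper's: f is the fiber volume fraction, C_m and C_f the matrix and fiber stiffness tensors, S the Eshelby tensor, and I the fourth-order identity.

```latex
% Dilute strain-concentration tensor from Eshelby's solution
A^{\mathrm{dil}} = \left[\, I + S\,C_m^{-1}\left(C_f - C_m\right) \right]^{-1}

% Mori-Tanaka effective stiffness of the two-phase composite
C^{\mathrm{MT}} = C_m + f\left(C_f - C_m\right) A^{\mathrm{dil}}
                  \left[\, (1-f)\,I + f\,A^{\mathrm{dil}} \right]^{-1}
```

In fiber-orientation-dependent models such as the one described, these aligned-inclusion expressions are additionally averaged over the measured fiber orientation distribution, which is what makes the predicted stiffness and damage microstructure-dependent.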
Procedia PDF Downloads 109
14025 Estimation of Rare and Clustered Population Mean Using Two Auxiliary Variables in Adaptive Cluster Sampling
Authors: Muhammad Nouman Qureshi, Muhammad Hanif
Abstract:
Adaptive cluster sampling (ACS) was specifically developed for the estimation of highly clumped populations and has been applied to a wide range of situations, such as rare and endangered animal species, unevenly distributed minerals, and HIV patients and drug users. In this paper, we propose a generalized semi-exponential estimator with two auxiliary variables under the framework of the ACS design. Expressions for the approximate bias and mean square error (MSE) of the proposed estimator are derived. Theoretical comparisons of the proposed estimator are made with existing estimators. A numerical study is conducted on real and artificial populations to demonstrate and compare the efficiencies of the proposed estimator. The results indicate that the proposed generalized semi-exponential estimator performs considerably better than all the adaptive and non-adaptive estimators considered in this paper.
Keywords: auxiliary information, adaptive cluster sampling, clustered populations, Hansen-Hurwitz estimation
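The paper's semi-exponential estimator is not given in the abstract, but the baseline it builds on, the modified Hansen-Hurwitz estimator named in the keywords, can be sketched briefly. In ACS, each initially sampled unit that meets the condition of interest triggers the sampling of its whole network, and the unit's observation is replaced by its network mean. The toy data below are illustrative assumptions.

```python
# Sketch of the modified Hansen-Hurwitz estimator used as a baseline
# in adaptive cluster sampling (Thompson-style): each initially
# sampled unit is replaced by the mean of the network it triggers.

def hansen_hurwitz_acs(initial_networks):
    """initial_networks: one entry per initially sampled unit; each
    entry is the list of y-values of all units in the network
    containing that unit (edge units excluded). Returns the
    estimated population mean per unit."""
    network_means = [sum(net) / len(net) for net in initial_networks]
    return sum(network_means) / len(network_means)

# Toy example: 4 initial units; two fall in empty regions, one hits
# a cluster of rare-species counts, one finds an isolated individual.
networks = [[0], [0], [10, 12, 8], [2]]
print(hansen_hurwitz_acs(networks))  # -> 3.0
```

Estimators with auxiliary variables, like the one proposed, then adjust this baseline using the corresponding network means of the auxiliary variables.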
Procedia PDF Downloads 239
14024 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments
Authors: Ana Londral, Burcu Demiray, Marcus Cheetham
Abstract:
Speech recording is a methodology used in many studies of cognition and behaviour. Modern advances in digital equipment have made it possible to record hours of speech continuously in naturalistic environments and to build rich sets of sound files. Speech analysis can then extract multiple features from these files for different research questions in language and communication. However, tools for analysing large sets of sound files and automatically extracting relevant features are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, at a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e., detecting and labelling the segments that contain speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step in the analysis of a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a folder structure containing a large number of sound files. We validated our methodology on a set of 5,000 sound files collected in the daily life of a group of voluntary participants aged over 65. A smartphone was used to collect sound with the Electronically Activated Recorder (EAR), an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than manual analysis performed by two independent coders. Furthermore, the methodology presented allows manual adjustment of voiced segments with visualisation of the sound signal, and the automatic extraction of quantitative information on speech.
In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation
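The core detection step such a pipeline performs can be illustrated with a minimal energy-based segmenter. This is only a sketch of the kind of silence/speech decision a voice activity algorithm makes; the frame length and threshold below are illustrative assumptions, not Praat's parameters.

```python
# Minimal energy-based voice segmentation sketch: frames whose mean
# squared amplitude exceeds a threshold are marked as voiced, and
# adjacent voiced frames are merged into one segment.

def segment_voice(samples, frame_len=400, threshold=0.01):
    """Return (start, end) sample indices of voiced segments."""
    segments = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        if energy > threshold:
            if segments and segments[-1][1] == start:
                # extend the previous segment instead of starting a new one
                segments[-1] = (segments[-1][0], start + frame_len)
            else:
                segments.append((start, start + frame_len))
    return segments

# Toy signal: 400 samples of silence, 800 of a loud burst, 400 of silence.
signal = [0.0] * 400 + [0.5] * 800 + [0.0] * 400
print(segment_voice(signal))  # -> [(400, 1200)]
```

In practice, tools like Praat additionally use pitch and spectral cues, and the resulting segments are written to annotation tiers that a researcher can inspect and adjust manually, as the methodology above describes.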
Procedia PDF Downloads 282
14023 Measuring E-Learning Effectiveness Using a Three-Way Comparison
Authors: Matthew Montebello
Abstract:
E-learning effectiveness has typically been measured within academic settings by comparing the e-learning medium to the traditional face-to-face teaching methodology. In this paper, a simple yet innovative comparison methodology is introduced, whereby the effectiveness of next-generation e-learning systems is assessed in contrast not only to the face-to-face mode but also to the classical e-learning modality. Ethical and logistical issues are also discussed, as this three-way approach to comparing teaching methodologies was applied and documented in a real empirical study within a higher education institution.
Keywords: e-learning effectiveness, higher education, teaching modality comparison
Procedia PDF Downloads 387
14022 Multi-Objective Optimization for the Green Vehicle Routing Problem: Approach to Case Study of the Newspaper Distribution Problem
Authors: Julio C. Ferreira, Maria T. A. Steiner
Abstract:
The aim of this work is to present a solution procedure, referred to here as Multi-objective Optimization for the Green Vehicle Routing Problem (MOOGVRP), and to apply it to a case study. The proposed methodology consists of three stages to resolve Scenario A. Stage 1 consists of the 'treatment' of the data; Stage 2 applies the mathematical models of the p-Median Capacitated Problem (with the objectives of minimizing distances and homogenizing demands between groups) and the Asymmetric Traveling Salesman Problem (with the objectives of minimizing distances and minimizing time). The weighted-sum method was used as the multi-objective procedure. In Stage 3, an analysis of the results is conducted, taking into consideration the environmental aspects of the case study, more specifically fuel consumption and air pollutant emissions. This methodology was applied to a (partial) database addressing newspaper distribution in the municipality of Curitiba, Paraná State, Brazil. The preliminary findings for Scenario A showed that it was possible to improve the distribution of the load and to reduce the mileage and greenhouse gas emissions by 17.32% and the journey time by 22.58% in comparison with the current scenario. The intention for future work is to use other multi-objective techniques and an expanded version of the database, and to explore the triple bottom line of sustainability.
Keywords: Asymmetric Traveling Salesman Problem, Green Vehicle Routing Problem, Multi-objective Optimization, p-Median Capacitated Problem
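The weighted-sum scalarisation mentioned above combines the two objectives (distance and time) into a single cost to minimise. A minimal sketch follows; the weights, normalisation scales, and toy routes are illustrative assumptions, not values from the study.

```python
# Sketch of weighted-sum scalarisation for a bi-objective routing
# choice: normalise each objective by a reference scale, then take
# a weighted combination and pick the route with the lowest cost.

def weighted_cost(distance_km, time_min, w_dist=0.5, w_time=0.5,
                  dist_scale=100.0, time_scale=60.0):
    """Lower is better; scales make the two objectives comparable."""
    return (w_dist * distance_km / dist_scale
            + w_time * time_min / time_scale)

routes = {
    "A": (80.0, 50.0),   # shorter but slower
    "B": (95.0, 40.0),   # longer but faster
}
best = min(routes, key=lambda r: weighted_cost(*routes[r]))
print(best)  # -> B
```

Sweeping the weights (w_dist, w_time) over [0, 1] with w_dist + w_time = 1 traces out a set of trade-off solutions, which is how the weighted method approximates the Pareto front in bi-objective problems like this one.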
Procedia PDF Downloads 113
14021 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks
Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas
Abstract:
This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery, as well as to detect possible anomalies on the road surface such as potholes or speed bumps/humps. Our work is based on an artificial-intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of a camera integrated into the front part of the vehicle. A novel classification CNN is utilized to distinguish between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, at a distance of 5 to 25 meters. The overall methodology is illustrated within an integrated application (or system) that can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) providing a full range of functionalities. The proposed techniques achieve state-of-the-art detection and classification results with real-time performance on AI accelerator devices such as Intel’s Myriad 2/X Vision Processing Unit (VPU).
Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems
Procedia PDF Downloads 135
14020 Pilot Induced Oscillations Adaptive Suppression in Fly-By-Wire Systems
Authors: Herlandson C. Moura, Jorge H. Bidinotto, Eduardo M. Belo
Abstract:
The present work proposes the development of an adaptive control system that enables the suppression of Pilot Induced Oscillations (PIO) in Digital Fly-By-Wire (DFBW) aircraft. The proposed system consists of a Modified Model Reference Adaptive Control (M-MRAC) scheme integrated with the Gain Scheduling technique. PIO oscillations are detected using a Real Time Oscillation Verifier (ROVER) algorithm, which then enables the system to switch between two reference models: one for the PIO condition, with low proneness to the phenomenon, and another for the normal condition, with high (or medium) proneness. The reference models are defined in a closed-loop condition using the Linear Quadratic Regulator (LQR) control methodology for Multiple-Input-Multiple-Output (MIMO) systems. The implemented algorithms are simulated in software, with state-space models and commercial flight simulators as the controlled elements and with pilot dynamics models. A sequence of pitch angles, named the Synthetic Task (Syntask), is used as the reference signal to be tracked by the pilot models. The initial outcomes show that the proposed system can detect and suppress (or mitigate) PIO oscillations in real time before they reach high amplitudes.
Keywords: adaptive control, digital Fly-By-Wire, oscillations suppression, PIO
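The ROVER-style detection step can be sketched as a check on whether a signal is oscillating within a frequency band at an amplitude above a threshold. This is a simplified illustration only: the thresholds, the frequency band, and the zero-crossing frequency estimate below are assumptions, not the actual ROVER parameters.

```python
import math

# Sketch of a ROVER-style oscillation check on a pitch signal:
# estimate frequency from zero crossings, amplitude from the peak,
# and flag PIO when both fall inside configured bounds.

def detect_pio(signal, dt, amp_thresh=2.0, f_low=0.2, f_high=1.5):
    """signal: pitch samples (deg); dt: sample period (s)."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    duration = dt * (len(signal) - 1)
    freq = crossings / (2.0 * duration)      # two crossings per cycle
    amplitude = max(abs(s) for s in signal)
    return f_low <= freq <= f_high and amplitude >= amp_thresh

# 10 s of a 0.5 Hz, 3-degree oscillation sampled at 100 Hz: flagged.
t = [i * 0.01 for i in range(1000)]
pio_like = [3.0 * math.sin(2 * math.pi * 0.5 * x) for x in t]
print(detect_pio(pio_like, 0.01))  # -> True
```

In the system described above, a positive detection of this kind is what triggers the switch between the two LQR-defined reference models of the M-MRAC scheme.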
Procedia PDF Downloads 134
14019 An Energy Integration Study While Utilizing Heat of Flue Gas: Sponge Iron Process
Authors: Venkata Ramanaiah, Shabina Khanam
Abstract:
Enormous potential for saving energy is available in coal-based sponge iron plants, as these are associated with a high percentage of energy wastage per unit of sponge iron produced. In the present paper, an energy integration option is proposed for a coal-based sponge iron plant with a production capacity of 100 tonnes per day, operated in India using the SL/RN (Stelco-Lurgi/Republic Steel-National Lead) process. Its principal equipment comprises a rotary kiln, rotary cooler, dust settling chamber, after-burning chamber, evaporating cooler, electrostatic precipitator (ESP), wet scrapper, and chimney. Principles of process integration are used in the proposed option: kiln inlet streams, such as kiln feed and slinger coal, are preheated up to 170°C using the waste gas exiting the ESP, and the kiln outlet stream is cooled from 1020°C to 110°C using kiln air. The working areas in the plant where energy is being lost and can be conserved are identified. Detailed material and energy balances are carried out around the sponge iron plant, and a modified model is developed to find the coal requirement of the proposed option, based on hot utility, heats of reaction, kiln feed and air preheating, radiation losses, dolomite decomposition, the heat required to vaporize the coal volatiles, etc. As coal is used both as utility and as process stream, an iterative approach is used in the solution methodology to compute coal consumption. Further, the water consumption, operating cost, capital investment, waste gas generation, profit, and payback period of the modification are computed, and the operational aspects of the proposed design are discussed. To recover and integrate the waste heat available in the plant, three gas-solid heat exchangers and four insulated ducts, each with one FD fan, are installed additionally. Thus, the proposed option requires a total capital investment of $0.84 million.
Preheating the kiln feed, slinger coal, and kiln air streams reduces coal consumption by 24.63%, which in turn reduces waste gas generation by 25.2% in comparison with the existing process. Moreover, a 96% reduction in water consumption is also observed, which is an added advantage of the modification. Consequently, the total profit is found to be $2.06 million/year, with a payback period of only 4.97 months. The energy efficient factor (EEF), the percentage of the maximum energy that can be saved through the design, is found to be 56.7%. The results of the proposed option are also compared with the literature and found to be in good agreement.
Keywords: coal consumption, energy conservation, process integration, sponge iron plant
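The reported economics can be checked with a back-of-the-envelope simple payback calculation from the two headline figures in the abstract. The small gap between this crude estimate and the quoted 4.97 months presumably reflects the more detailed cash-flow model used in the paper.

```python
# Simple payback from the figures quoted above: investment divided
# by annual profit, expressed in months.

investment = 0.84e6          # USD, additional heat-recovery equipment
profit_per_year = 2.06e6     # USD/year, total profit of the modification

payback_months = investment / profit_per_year * 12
print(round(payback_months, 2))  # -> 4.89 (paper reports 4.97)
```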
Procedia PDF Downloads 144
14018 Impact Analysis of a School-Based Oral Health Program in Brazil
Authors: Fabio L. Vieira, Micaelle F. C. Lemos, Luciano C. Lemos, Rafaela S. Oliveira, Ian A. Cunha
Abstract:
Brazil faces several challenges related to population oral health, most of them associated with the need to expand promotion and prevention activities to the local level, to offer equal access to services, and to promote changes in the lifestyle of the population. The program implemented an oral health initiative in public schools in the city of Salvador, Bahia. Its mission was to improve oral health among students in primary and secondary education, from 2 to 15 years old, using the school as a pathway to increase access to healthcare. The main actions consisted of team visits to the schools, with educational sessions on dental cavity prevention and individual assessments. The program incorporated a clinical surveillance component through a dental evaluation of every student, searching for dental disease and caries; standardization of the dentists’ team to reach uniform classification in the assessments; and the use of an online platform to register data directly from the schools. Subsequently, the students with caries were referred for free clinical treatment at the program’s Health Centre. The primary purpose of this study was to analyze the effects and outcomes of this school-based oral health program. The study sample was composed of data from a period of 3 years (2015 to 2017) from 13 public schools on the outskirts of the city of Salvador, with a total of 9,278 assessments in this period. From the data collected, the prevalence of children with decay in permanent teeth was chosen as the most reliable indicator. The prevalence was calculated for each of the 13 schools as the number of children with one or more dental caries in permanent teeth divided by the total number of students assessed in that school each year. The percentage change per year was then calculated for each school.
Some schools presented higher variation in the total number of assessments in one of the three years, so for these the percentage change was calculated using the two years with the least variation. The results show that 10 of the 13 schools presented significant improvements in the indicator of caries in permanent teeth. Across the 13 schools, the mean percentage reduction in the number of students with caries was 26.8%, and the median was 32.2%. The highest improvement was a 65.6% decrease in the indicator. Three schools presented a rise in caries prevalence (increases of 8.9%, 18.9%, and 37.2%) that, on an initial analysis, seems to be explained by the rotation of student cohorts among schools, as well as absenteeism from treatment. In conclusion, the program shows a relevant impact on the reduction of caries in permanent teeth among students, and points to the need for the continuity and expansion of this integrated healthcare approach. The significance of the articulation between the health and educational systems has also been evident, representing a fundamental approach to improving healthcare access for children, especially in scenarios such as that presented in Brazil.
Keywords: primary care, public health, oral health, school-based oral health, data management
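The indicator computation described above, prevalence per school-year and percentage change between years, can be sketched directly. The school counts in the example are hypothetical, not the study's data.

```python
# Sketch of the study's indicator: caries prevalence per school-year
# (children with >= 1 carious permanent tooth / students assessed)
# and the percentage change between two years.

def caries_prevalence(with_caries, assessed):
    return with_caries / assessed

def percent_change(prev_start, prev_end):
    """Negative values indicate a reduction in prevalence."""
    return (prev_end - prev_start) / prev_start * 100.0

# Hypothetical school: 120 of 400 students with caries in 2015,
# 81 of 405 in 2017.
p2015 = caries_prevalence(120, 400)   # 0.30
p2017 = caries_prevalence(81, 405)    # 0.20
print(round(percent_change(p2015, p2017), 1))  # -> -33.3
```

Using prevalence rather than raw counts is what lets the study compare schools whose number of assessed students varied from year to year.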
Procedia PDF Downloads 136