Search results for: operational data
24426 The Case of ESPRIT (Higher School of Engineering)
Authors: Amira Potter
Abstract:
For the past three years, ESPRIT has adopted project-based learning across its curricula. The philosophy behind this reform is to prepare its future engineers to become operational as soon as they enter the workplace, allowing them to acquire the skills their future employers expect of them. This learner-centered method helps the students take responsibility for their own learning, solve real-world problems and carry out multi-faceted projects. The teacher, who used to be considered the holder of knowledge, has therefore become more of a facilitator and a coach, supporting the students’ learning process. This innovative approach to English teaching has enabled the students to learn the English language differently: the target language is learnt cooperatively through group work, presentations, debating and many other communicative activities. Speaking remains by far the most challenging English skill for Tunisian students, whose educational background is based on Arabic as a first language and French as a second language. The students’ initial reluctance to speak English in front of their classmates, contrasted with how they ultimately perform their work, shows the real progress they manage to achieve through the PBL approach. The article will focus on the positive impact PBL has had on oral fluency among ESPRIT engineering students and how it has been achieved. It will also describe how the speaking skill is taught and assessed at ESPRIT.
Keywords: cooperative, engineer, innovative, project-based learning
Procedia PDF Downloads 317
24425 Data-Focused Digital Transformation for Smart Net-Zero Cities: A Systems Thinking Approach
Authors: Farzaneh Mohammadi Jouzdani, Vahid Javidroozi, Monica Mateo Garcia, Hanifa Shah
Abstract:
The emergence of smart net-zero cities in recent years has attracted significant attention and interest from communities and scholars worldwide as a potential solution to the critical requirement for urban sustainability. This research-in-progress paper investigates the development of smart net-zero cities in order to propose a digital transformation roadmap for them, with a primary focus on data. Employing systems thinking as an underpinning theory, the study advocates a holistic strategy for understanding the complex interdependencies and interrelationships that characterise urban systems. The proposed methodology involves an in-depth investigation of current data-driven approaches in the smart net-zero city, followed by predictive analysis to evaluate the holistic impact of these approaches on the move toward a smart net-zero city. The study is expected to yield a systemic intervention followed by a data-focused and systemic digital transformation roadmap for smart net-zero cities, contributing to a more holistic understanding of urban sustainability.
Keywords: smart city, net-zero city, digital transformation, systems thinking, data integration, data-driven approach
Procedia PDF Downloads 21
24424 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation
Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim
Abstract:
In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest-rate, equity, property, credit and FX factors, as well as concentration on counterparties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate as an optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR to reduce non-invested capital but also to ensure stability of the SCR. Some optimizations have already been performed in the literature by simplifying the standard formula into a quadratic function, but to our knowledge, this is the first time that the standard formula of the market SCR is used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equal-risk-contribution, minimum-volatility and minimum-value-at-risk portfolios) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It is shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the value of a portfolio construction approach that can incorporate such features. The results are further explained by the market SCR modelling.
Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement
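For context, the standard-formula aggregation that serves here as the optimization criterion combines the sub-module charges through a fixed regulatory correlation matrix (a sketch of the well-known aggregation rule, not the paper's notation):

```latex
\mathrm{SCR}_{\text{mkt}} = \sqrt{\sum_{i,j} \rho_{ij}\,\mathrm{SCR}_i\,\mathrm{SCR}_j}
```

Here i and j run over the interest-rate, equity, property, credit/spread, FX and concentration sub-modules, each SCR_i is itself the output of a stress test, and the correlations rho_ij are prescribed by the regulation. The max/stress structure inside each sub-module is what makes this criterion non-convex and non-differentiable.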
Procedia PDF Downloads 116
24423 Monitoring the Vegetation Cover Dynamics of the African Great Green Wall in Yobe State Nigeria
Authors: Isa Muhammad Zumo
Abstract:
The African Great Green Wall (GGW) is a significant initiative in northern Nigeria because it promotes land restoration and conservation utilizing both commercial and forest tree species, while also helping to mitigate desertification and the hazards from sand dunes and the shifting Sahara desert. Conflicts and weather, however, pose a significant danger to the achievement of these goals. Despite the African Development Bank (ADB)'s help in funding the project and the integration of GGW initiatives into the state's development plans, scientific monitoring of the vegetation dynamics since the project's inception has not received the required attention. This study will monitor the changes in the vegetation cover of the Great Green Wall within Yobe State, Nigeria, from 2014 to 2023. The vegetation dynamics will be monitored using Landsat 8 Operational Land Imager (OLI) imagery for six years at two-year intervals. The result will show the fluctuations in vegetation cover density within the period of study. This will guide the design and implementation of GGW policies in pursuit of its objectives. The result can also contribute to the realization of Sustainable Development Goal (SDG) Target 13.2: integrate climate change measures into national policies, strategies, and planning.
Keywords: monitoring, green wall, Landsat 8, Nigeria
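Vegetation-cover change from Landsat 8 OLI is commonly tracked through NDVI; a minimal sketch of one epoch, assuming rasterio is available and with hypothetical band file names (the abstract does not specify the index or toolchain used):

```python
import numpy as np
import rasterio

# Landsat 8 OLI: band 4 = red, band 5 = near-infrared (NIR).
# File names are hypothetical placeholders for one acquisition date.
with rasterio.open("LC08_B4.TIF") as red_src, rasterio.open("LC08_B5.TIF") as nir_src:
    red = red_src.read(1).astype("float64")
    nir = nir_src.read(1).astype("float64")

# NDVI = (NIR - Red) / (NIR + Red); mask zero denominators.
denom = nir + red
ndvi = np.where(denom == 0, np.nan, (nir - red) / denom)

# A simple density summary per epoch: fraction of valid pixels above a
# (assumed) vegetation threshold of 0.3.
valid = ~np.isnan(ndvi)
veg_fraction = np.mean(ndvi[valid] > 0.3)
print(f"vegetated fraction: {veg_fraction:.3f}")
```

Comparing this per-epoch fraction (or the full NDVI maps) across the two-year intervals gives the cover-dynamics series the study describes.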
Procedia PDF Downloads 81
24422 Analysis of an Alternative Data Base for the Estimation of Solar Radiation
Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag
Abstract:
The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To design thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has only a small number of meteorological stations providing such measurements, which makes reanalysis systems a quite significant alternative to an observed-data network. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2). The system includes a 4-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km, on 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed-data network. The results are analyzed for a study region to assess reanalysis data as an alternative for estimating the energy potential of a given region.
Keywords: energy potential, reanalyses, renewable energy, solar radiation
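A comparison of this kind typically reduces to a few agreement statistics between the reanalysis and the observed series; a minimal sketch with invented monthly irradiation values (not the study's data):

```python
import numpy as np

# Illustrative monthly-mean irradiation (kWh/m^2/day): observed atlas values
# and ERA-Interim values at the same grid point. Both series are invented.
observed = np.array([4.1, 4.5, 4.9, 5.4, 5.8, 6.0, 5.9, 5.6, 5.1, 4.6, 4.2, 4.0])
era      = np.array([4.3, 4.6, 4.8, 5.2, 5.9, 6.2, 5.8, 5.5, 5.0, 4.7, 4.4, 4.1])

bias = np.mean(era - observed)                      # systematic offset
rmse = np.sqrt(np.mean((era - observed) ** 2))      # typical error magnitude
r = np.corrcoef(era, observed)[0, 1]                # agreement in variability
print(f"bias={bias:+.3f}, RMSE={rmse:.3f}, r={r:.3f}")
```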
Procedia PDF Downloads 162
24421 Big Data Analytics and Public Policy: A Study in Rural India
Authors: Vasantha Gouri Prathapagiri
Abstract:
Innovations in the ICT sector facilitate a better quality of life for citizens across the globe. Countries that facilitate the usage of new ICT techniques, such as big data analytics, find it easier to fulfil the needs of their citizens. Big data is characterised by its volume, variety, and speed. Analytics involves processing these data in a cost-effective way in order to draw conclusions for their useful application. Big data also draws on machine learning and artificial intelligence, all leading to accuracy in data presentation that is useful for public policy making. Hence, using data analytics in public policy making is a proper way to march towards the all-round development of any country. Data-driven insights can help the government take important strategic decisions with regard to the socio-economic development of the country. Developed nations like the UK and the USA are already far ahead on the path of digitization with the support of big data analytics. India is a huge country and is currently on the path of massive digitization, being realised through the Digital India Mission. Internet connection per household is on the rise every year. This translates into a massive data set that has the potential to turn the public services delivery system into an effective service mechanism for Indian citizens. In fact, when compared to developed nations, this capacity is underutilized in India. This is particularly true of the administrative system in rural areas. The present paper focuses on the need to adopt big data analytics in Indian rural administration and its contribution towards faster development of the country. The results of the research point to the need for increasing awareness and serious capacity building, with regard to big data analytics and its utility for national development, among government personnel working in rural development. Multiple public policies are framed and implemented for rural development, yet the results are not as effective as they should be. Big data has a major role to play in this context, as it can assist in improving both policy making and implementation aiming at the all-round development of the country.
Keywords: Digital India Mission, public service delivery system, public policy, Indian administration
Procedia PDF Downloads 159
24420 A Simulation of Patient Queuing System on Radiology Department at Tertiary Specialized Referral Hospital in Indonesia
Authors: Yonathan Audhitya Suthihono, Ratih Dyah Kusumastuti
Abstract:
The radiology department in a tertiary referral hospital faces service operation challenges such as large and varied patient arrivals, which can increase the probability of patient queuing. During the COVID-19 pandemic, it is mandatory to apply social distancing protocols in the radiology department, so a strategy to prevent the accumulation of patients at one spot is required. The aim of this study is to identify an alternative solution that can reduce patients' waiting time in the radiology department. Discrete event simulation (DES) is used for this study by constructing several improvement scenarios with Arena simulation software. Statistical analysis is used to test the validity of the base-case scenario model and to investigate the performance of the improvement scenarios. The result of this study shows that the selected scenario is able to reduce patient waiting time significantly, which leads to more efficient services in the radiology department, more effective patient care, and thus higher patient satisfaction. The result of the simulation can be used by the hospital management to improve the operational performance of the radiology department.
Keywords: discrete event simulation, hospital management, patient queuing model, radiology department services
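The core DES logic is straightforward to sketch. The study used Arena, so the SimPy model below is a substitute illustration only, and the arrival rate, service rate and scanner count are invented:

```python
import random
import simpy

random.seed(0)

env = simpy.Environment()
scanners = simpy.Resource(env, capacity=2)   # assumed two radiology stations
waits = []

def patient(env, scanners):
    arrival = env.now
    with scanners.request() as req:
        yield req                            # queue until a scanner is free
        waits.append(env.now - arrival)      # record this patient's wait
        yield env.timeout(random.expovariate(1 / 12))   # ~12 min exam (assumed)

def arrivals(env, scanners):
    while True:
        yield env.timeout(random.expovariate(1 / 8))    # ~8 min between arrivals
        env.process(patient(env, scanners))

env.process(arrivals(env, scanners))
env.run(until=8 * 60)                        # simulate one 8-hour shift
print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} patients")
```

Improvement scenarios then amount to rerunning the model with changed capacity, routing or scheduling and comparing the resulting waiting-time statistics.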
Procedia PDF Downloads 118
24419 4G LTE Dynamic Pricing: The Drivers, Benefits, and Challenges
Authors: Ahmed Rashad Harb Riad Ismail
Abstract:
The purpose of this research is to study the potential of dynamic pricing if deployed by mobile operators and to analyse its effects from both the operator and consumer sides. The study concludes by presenting recommended conditions for successful dynamic pricing deployment, recommended factors identifying the types of market where dynamic pricing can be effective, and a proposed dynamic pricing stakeholder framework. Currently, the mobile telecommunications industry is witnessing dramatic growth in data consumption, fostered mainly by higher-data-speed technologies such as 4G LTE and by smart device penetration rates. However, operators' revenue from data services lags behind and is decoupled from this data consumption growth. Pricing strategy is a key factor affecting this ecosystem. Since the introduction of 4G LTE technology will multiply the pace of data growth, the revenue and usage gap will grow wider if pricing strategies remain constant, risking the sustainability of the ecosystem. Therefore, this research study focuses on dynamic pricing for 4G LTE data services, researching the drivers, benefits and challenges of 4G LTE dynamic pricing and the feasibility of its deployment in practice from different perspectives, including the points of view of operators, regulators, consumers, and telecommunications equipment manufacturers.
Keywords: LTE, dynamic pricing, EPC, research
Procedia PDF Downloads 330
24418 Prediction of Wind Speed by Artificial Neural Networks for Energy Application
Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui
Abstract:
In this work, the variation of wind speed with altitude is modelled using artificial neural networks. Measured wind speed and direction, temperature and humidity at 10 m are used as input data, and the wind speed at 50 m above sea level is used as the target. A comparison between the predicted wind speeds and those extrapolated to 50 m above sea level is performed. The results show that prediction by the artificial neural network method is very accurate.
Keywords: MATLAB, neural network, power law, vertical extrapolation, wind energy, wind speed
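The extrapolation baseline named in the keywords is the power law, v2 = v1 (z2/z1)^alpha. A minimal sketch comparing it with an MLP regressor follows; all measurements are synthetic placeholders, the shear exponent is an assumed open-terrain value, and the power-law values stand in for the measured 50 m targets the study actually used:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic records: [speed at 10 m (m/s), direction (deg), temperature (C),
# relative humidity (%)] -- placeholders for the measured inputs.
X = np.column_stack([
    rng.uniform(2, 12, 500),
    rng.uniform(0, 360, 500),
    rng.uniform(5, 35, 500),
    rng.uniform(20, 90, 500),
])

# Power-law vertical extrapolation: v2 = v1 * (z2 / z1) ** alpha.
alpha = 1 / 7                                # assumed shear exponent
v50_power_law = X[:, 0] * (50 / 10) ** alpha

# Train an MLP to map the 10 m measurements to the 50 m speed; here the
# power-law values stand in for measured 50 m targets.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, v50_power_law)
print(model.predict(X[:3]))
```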
Procedia PDF Downloads 691
24417 A Case Study at PT Bank XYZ on The Role of Compensation, Career Development, and Employee Engagement towards Employee Performance
Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari
Abstract:
This study aims to examine, analyze and explain the impacts of compensation, career development and employee engagement on employee performance, both partially and simultaneously (a case study at PT Bank XYZ). The research design is quantitative descriptive causal research involving 30 respondents. Data come from primary and secondary sources: primary data were obtained through questionnaire distribution, and secondary data from journals and books. Data analysis used model testing in the SmartPLS 3 application, consisting of outer-model and inner-model tests. The results show that compensation, career development and employee engagement each have a positive impact on employee performance partially, and a positive and significant impact simultaneously. The independent variable with the greatest impact is employee engagement.
Keywords: compensation, career development, employee engagement, employee performance
Procedia PDF Downloads 150
24416 Spectral Anomaly Detection and Clustering in Radiological Search
Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk
Abstract:
Radiological search and mapping depends on the successful recognition of anomalies in large data sets which contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection which is resistant to common detector imperfections, avoids the limitations of a source template library and provides immediate, and easily interpretable, user feedback. This algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap Manifold Learning to perform dimensional reduction which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets.
Keywords: radiological search, radiological mapping, radioactivity, radiation protection
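The clustering step rests on Laplacian Eigenmaps. A minimal sketch of that embedding on synthetic spectra, using scikit-learn's SpectralEmbedding, follows; the authors' exact pipeline, including the CWT variance-reduction stage, is not reproduced:

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(1)

# Synthetic gamma spectra: two background clusters plus a few outliers
# carrying a photopeak-like excess in channels 60-63.
background_a = rng.poisson(50, size=(100, 128))
background_b = rng.poisson(80, size=(100, 128))
outliers = rng.poisson(50, size=(5, 128))
outliers[:, 60:64] += 400
spectra = np.vstack([background_a, background_b, outliers]).astype(float)

# Laplacian Eigenmaps (spectral embedding): dimensional reduction that
# preserves local "closeness" while keeping outlying spectra separated.
embedding = SpectralEmbedding(n_components=2, affinity="nearest_neighbors",
                              n_neighbors=10, random_state=0)
coords = embedding.fit_transform(spectra)
print(coords.shape)   # (205, 2): one 2-D point per spectrum, ready to plot
```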
Procedia PDF Downloads 691
24415 Knowledge Engineering Based Smart Healthcare Solution
Authors: Rhaed Khiati, Muhammad Hanif
Abstract:
In the past decade, smart healthcare systems have been on the rise, especially with the evolution of hospitals and their increasing reliance on bioinformatics and software specializing in healthcare. Doctors have become more reliant on technology than ever, something that in the past would have been looked down upon, as technology has become imperative in reducing overall costs and improving the quality of patient care. With patient-doctor interactions becoming more necessary and more complicated than ever, systems must be developed while taking into account costs, patient comfort, and patient data, among other things. In this work, we propose a smart hospital bed that combines the complexity and big-data usage of traditional healthcare systems with the comfort found in soft beds, while taking concerns such as data confidentiality, security, and the maintenance of SLA agreements into account. This research work potentially provides users, namely patients and doctors, with seamless interaction with their respective nurses, as well as faster access to up-to-date personal data, including prescriptions and the severity of the condition, in contrast to previous research in the area, which lacks consideration of such provisions.
Keywords: big data, smart healthcare, distributed systems, bioinformatics
Procedia PDF Downloads 196
24414 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland
Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi
Abstract:
Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive healthcare services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications for theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, the potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from a case study and the economic properties of the healthcare industry in Finland.
Keywords: ecosystem, business model, personal data, preventive healthcare
Procedia PDF Downloads 249
24413 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software
Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman
Abstract:
An engine test-bed system is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital data acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analyzed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time-consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to a vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens, but also records the data for further processing and archiving.
Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation
Procedia PDF Downloads 121
24412 Economic Evaluation of Bowland Shale Gas Wells Development in the UK
Authors: Elijah Acquah-Andoh
Abstract:
The UK has had its fair share of the shale gas revolutionary wave blowing across the global oil and gas industry at present. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament, which recognized it as a genuine energy source and granted licenses to industry to search for and extract the resource. Although this is significant progress by industry, there remains another test the UK fracking resource must pass in order to render shale gas extraction feasible: it must be economically extractable, and sustainably so. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional upon drilling horizontal wells and hydraulic fracturing, techniques which increase CAPEX. Meanwhile, investment in shale gas development projects is sensitive to gas price and to technical and geological risks. Using a two-factor model, the economics of the Bowland shale wells were analyzed, and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility in Opex spending; hence, Opex does not pose much threat to the fracking industry in the UK. However, we find that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, an Opex of no more than $2/Mcf, and a Capex of no more than $14.95M are required to create value within the present petroleum tax regime in the UK fracking industry.
Keywords: capex, economical, investment, profitability, shale gas development, sustainable
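The value test behind such thresholds is a discounted cash-flow calculation. In the minimal per-well sketch below, only the quoted thresholds ($12/MMBtu, $2/Mcf, $14.95M) come from the abstract; the production volume, decline rate, discount rate and the rough Mcf-to-MMBtu equivalence are invented assumptions:

```python
def well_npv(gas_price_mmbtu, opex_mcf, capex, first_year_mcf, rate, years):
    """NPV of one well: upfront Capex, then declining annual production."""
    npv = -capex
    production = first_year_mcf
    for t in range(1, years + 1):
        # 1 Mcf is roughly 1.04 MMBtu; treated as ~1 here for brevity.
        cash_flow = production * (gas_price_mmbtu - opex_mcf)
        npv += cash_flow / (1 + rate) ** t
        production *= 0.65   # assumed steep early decline typical of shale wells
    return npv

print(well_npv(gas_price_mmbtu=12, opex_mcf=2, capex=14.95e6,
               first_year_mcf=1.2e6, rate=0.10, years=15))
```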
Procedia PDF Downloads 577
24411 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network
Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang
Abstract:
‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics, applying a range of mathematical, statistical and rule-based approaches, can be used to reveal important information on demand from data provided at second, minute or hourly intervals. This study aims to combine these two concepts to improve on the current water end-use disaggregation problem by applying a wide range of the most advanced pattern recognition techniques to the analysis of concurrent high-resolution water-energy consumption data. The obtained results show that recognition accuracies for all end uses increased significantly, especially for mechanised categories, including clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
Keywords: deep learning network, smart metering, water end use, water-energy data
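Event-level disaggregation of this kind is, at its core, supervised classification of event feature vectors, and the contemporaneous energy channel is what separates the mechanised end uses. A minimal sketch with invented features and a toy labelling rule (the study's actual deep network and meter data are not reproduced):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)

# Invented event features: [volume (L), duration (s), peak flow (L/min),
# concurrent energy draw (W)] -- stand-ins for paired smart-meter events.
n = 600
X = np.column_stack([
    rng.uniform(1, 150, n), rng.uniform(10, 3600, n),
    rng.uniform(1, 25, n), rng.uniform(0, 2500, n),
])

# Toy rule: mechanised end uses (1) draw energy during the water event.
y = (X[:, 3] > 800).astype(int)   # 1 = washer/dishwasher, 0 = tap/shower

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"event recognition accuracy: {clf.score(X_te, y_te):.2f}")
```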
Procedia PDF Downloads 304
24410 Risk Management Approach for Lean, Agile, Resilient and Green Supply Chain
Authors: Benmoussa Rachid, Deguio Roland, Dubois Sebastien, Rasovska Ivana
Abstract:
Implementation of LARG (Lean, Agile, Resilient, Green) practices in supply chain management is a complex task, mainly because ecological, economic and operational goals are usually in conflict. To implement these LARG practices successfully, companies need relevant decision-making tools that allow process performance control and give visibility to improvement strategies. To contribute to this issue, this work tries to answer the following research question: how can performance be mastered and problems anticipated in the implementation of supply chain LARG practices? To answer this question, a risk management approach (RMA) is adopted. The proposed RMA aims basically to assess the ability of a supply chain, guided by 'Lean, Green and Achievement' performance goals, to face 'agility and resilience' risk factors. To prove its relevance, a logistics academic case study based on simulation is used to illustrate all its stages. It shows in particular how to build the 'LARG risk map', which is the main output of this approach.
Keywords: agile supply chain, lean supply chain, green supply chain, resilient supply chain, risk approach
Procedia PDF Downloads 310
24409 Autonomic Management for Mobile Robot Battery Degradation
Authors: Martin Doran, Roy Sterritt, George Wilkie
Abstract:
The majority of today's mobile robots are heavily dependent on battery power. Mobile robots can operate untethered for a number of hours, but eventually they will need to recharge their batteries in order to continue to function. While computer processing and sensors have become cheaper and more powerful each year, battery development has progressed very little. Batteries are slow to recharge, inefficient, and lag behind the general progression of robotic development we see today. However, batteries are relatively cheap and, when fully charged, can supply the high power output necessary for operating heavy mobile robots. As there are no cheap alternatives to batteries, we need to find efficient ways to manage the power that batteries provide during their operational lifetime. This paper proposes the use of the autonomic principles of self-adaptation to address the behavioural changes a battery experiences as it gets older. In life, as we get older, we cannot perform tasks in the same way as we did in our youth; these tasks generally take longer to perform and require more of our energy to complete. Batteries also suffer from a form of degradation: as a battery gets older, it loses the ability to retain the charge capacity it had when brand new. This paper investigates how a mobile robot's task requirements can be adapted to the battery's current state of charge and cycle count.
Keywords: autonomic, self-adaptive, self-optimising, degradation
Procedia PDF Downloads 383
24408 Short Review on Models to Estimate the Risk in the Financial Area
Authors: Tiberiu Socaciu, Tudor Colomeischi, Eugenia Iancu
Abstract:
Business failure affects, in varying proportions, shareholders, managers, lenders (banks), suppliers, customers, the financial community, government and society as a whole. In the era of telecommunications networks and interdependent markets, the effect of a company's failure is felt almost instantly. Effectively managing risk exposure thus requires sophisticated support systems, backed by analytical tools to measure, monitor, manage and control the operational risks that may arise. As we know, bankruptcy is a phenomenon that managers do not want, no matter what stage of life the company they lead is in. In our analysis, given the nature of the economic models reviewed (Altman, Conan-Holder, etc.), estimating a company's bankruptcy risk corresponds to some extent to tracing its own business cycle. Various models for predicting bankruptcy take into account direct and indirect aspects such as market position, company growth trend, competition structure, customer characteristics and retention, organization and distribution, location, etc. From the perspective of our research, we review the economic models known in theory and practice for estimating the risk of bankruptcy; such models are based on indicators drawn from major accounting firms.
Keywords: Anglo-Saxon models, continental models, national models, statistical models
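Among the models reviewed, the original Altman (1968) Z-score is the best known; a minimal sketch of that public-manufacturing variant, with invented balance-sheet figures:

```python
def altman_z(wc, re, ebit, mve, sales, ta, tl):
    """Original Altman (1968) Z-score for public manufacturing firms."""
    x1 = wc / ta      # working capital / total assets
    x2 = re / ta      # retained earnings / total assets
    x3 = ebit / ta    # EBIT / total assets
    x4 = mve / tl     # market value of equity / total liabilities
    x5 = sales / ta   # sales / total assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Illustrative figures (in millions): Z < 1.81 signals distress, Z > 2.99 safety.
z = altman_z(wc=25, re=40, ebit=18, mve=120, sales=210, ta=150, tl=80)
print(f"Z = {z:.2f}")
```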
Procedia PDF Downloads 405
24407 Evaluating Multiple Diagnostic Tests: An Application to Cervical Intraepithelial Neoplasia
Authors: Areti Angeliki Veroniki, Sofia Tsokani, Evangelos Paraskevaidis, Dimitris Mavridis
Abstract:
The plethora of diagnostic test accuracy (DTA) studies has led to the increased use of systematic reviews and meta-analyses of DTA studies. Clinicians and healthcare professionals often consult DTA meta-analyses to make informed decisions regarding the optimum test to choose and use in a given setting. For example, the human papilloma virus (HPV) DNA test, the HPV mRNA test, and cytology can all be used for the diagnosis of cervical intraepithelial neoplasia grade 2+ (CIN2+). But which test is the most accurate? Studies directly comparing test accuracy are not always available, and comparisons between multiple tests create a network of DTA studies that can be synthesized through a network meta-analysis of diagnostic tests (DTA-NMA). The aim is to summarize the DTA-NMA methods for at least three index tests presented in the methodological literature. We illustrate the application of the methods using a real data set for the comparative accuracy of HPV DNA, HPV mRNA, and cytology tests for cervical cancer. A search was conducted in PubMed, Web of Science, and Scopus from inception until the end of July 2019 to identify full-text research articles that describe a DTA-NMA method for three or more index tests. Since the joint classification of the results of one index test against those of another, amongst those with the target condition and amongst those without it, is rarely reported in DTA studies, only methods requiring the 2x2 tables of the results of each index test against the reference standard were included. Studies of any design published in English were eligible for inclusion. Relevant unpublished material was also included. Ten relevant studies were finally included, and their methodology was evaluated. The DTA-NMA methods presented in the literature are described, together with their advantages and disadvantages. In addition, using 37 studies for cervical cancer obtained from a published Cochrane review as a case study, an application of the identified DTA-NMA methods to determine the most promising test (in terms of sensitivity and specificity) for use as the best screening test to detect CIN2+ is presented. In conclusion, different approaches to the comparative DTA meta-analysis of multiple tests may lead to different results and hence may influence decision-making. Acknowledgment: This research is co-financed by Greece and the European Union (European Social Fund - ESF) through the Operational Programme «Human Resources Development, Education and Lifelong Learning 2014-2020» in the context of the project "Extension of Network Meta-Analysis for the Comparison of Diagnostic Tests" (MIS 5047640).
Keywords: colposcopy, diagnostic test, HPV, network meta-analysis
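The per-study inputs such methods require are the 2x2 counts of each index test against the reference standard; a minimal sketch with invented counts (not the Cochrane review's data):

```python
# For each index test: (TP, FP, FN, TN) against the reference standard.
def sens_spec(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # P(test positive | CIN2+ present)
    specificity = tn / (tn + fp)   # P(test negative | CIN2+ absent)
    return sensitivity, specificity

counts = {"HPV DNA": (90, 30, 10, 170),
          "HPV mRNA": (84, 18, 16, 182),
          "cytology": (70, 12, 30, 188)}

for test, c in counts.items():
    se, sp = sens_spec(*c)
    print(f"{test}: sensitivity={se:.2f}, specificity={sp:.2f}")
```

A DTA-NMA then pools these per-study pairs across the whole network of test comparisons rather than one pairwise comparison at a time.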
Procedia PDF Downloads 138
24406 Strategy Management of Soybean (Glycine max L.) for Dealing with Extreme Climate through the Use of Cropsyst Model
Authors: Aminah Muchdar, Nuraeni, Eddy
Abstract:
The aims of the research are: (1) to verify the CropSyst crop model against experimental field data on soybean, and (2) to predict the planting time and potential yield of the soybean crop using the CropSyst model. The research is divided into several stages: (1) a calibration stage, conducted in the field from June until September 2015; and (2) a model application stage, in which the data obtained from the field calibration are fed into the CropSyst model. The model requires climate data, soil data, and crop genetic data. The agreement between the field results and the CropSyst simulation is indicated by an efficiency index (EF) of 0.939, showing that the model is well suited for use. The calculated RRMSE of 1.922% shows that the prediction error of the simulation relative to the field results is about 1.92%. It is concluded that the CropSyst-based prediction of soybean planting time is valid for use, and that the appropriate planting time, mainly on rain-fed land, is at the end of the rainy season; in this study, the first planting time (June 2, 2015) gave the highest production, because some rain still fell at that time. The Tanggamus variety is more tolerant of delayed planting, as its percentage yield decrease per decade of delay is lower than the average of all varieties.
Keywords: soybean, CropSyst, calibration, efficiency index, RRMSE
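The two agreement statistics quoted can be computed as follows; a minimal sketch with invented yields, assuming EF is the Nash-Sutcliffe-style efficiency and RRMSE is the RMSE relative to the observed mean (the abstract does not give the paper's exact definitions):

```python
import numpy as np

def efficiency_index(observed, predicted):
    """Nash-Sutcliffe-style efficiency index: 1 means a perfect fit."""
    o, p = np.asarray(observed), np.asarray(predicted)
    return 1 - np.sum((o - p) ** 2) / np.sum((o - o.mean()) ** 2)

def rrmse(observed, predicted):
    """Relative RMSE, expressed as a percentage of the observed mean."""
    o, p = np.asarray(observed), np.asarray(predicted)
    return 100 * np.sqrt(np.mean((o - p) ** 2)) / o.mean()

# Illustrative yields (t/ha); not the study's data.
obs = [2.10, 2.35, 1.95, 2.20, 2.05]
sim = [2.12, 2.30, 1.98, 2.25, 2.02]
print(f"EF = {efficiency_index(obs, sim):.3f}, RRMSE = {rrmse(obs, sim):.3f}%")
```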
Procedia PDF Downloads 178
24405 Impact of Marketing towards Behavior Intention
Authors: Sathyamangalam Rangasamy Guru Prasath
Abstract:
Due to the increasing homogeneity in product offerings, the attendant services provided are emerging as a key differentiator in the mind of the consumer. Services marketing is a subfield of marketing which covers the marketing of both goods and services. Service marketing differs from product marketing due to the fact that services are intangible and typically require personal interaction with the customer. Relationships are a key factor when it comes to the marketing of services. The role of interpersonal relationships distinguishes service and product marketing in strategic vision and organizational considerations. This paper explores some of the trends in service marketing as they relate to strategic vision, operational and organizational changes, and marketing tactics. The presence of the customer in the service facility means that capacity management becomes an important driver of the firm's profitability. Service marketing is a process from the organization's point of view, but an experience from the customer's perspective. The quality of the experience is a function of the careful design of customer service processes, the adoption of standardized procedures, rigorous management of service quality, high standards of training, and automation. Services marketing helps to ensure that these processes are designed from the customer's perspective. Services marketing includes customer loyalty, managing relationships, complaint handling, improving service quality and the productivity of service operations, and how to become a service leader in your industry.
Keywords: customer perspective, product marketing, service marketing, rigorous management
Procedia PDF Downloads 368
24404 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction
Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan
Abstract:
Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful in reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-sectioned data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of a patient with a specific disease. Health interventions or changes in lifestyle can therefore be conducted based on these models to improve the health conditions of individuals at risk.
Keywords: decision trees, neural network, myocardial infarction, data mining
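The reported head-to-head comparison amounts to fitting both model families and reading sensitivity and specificity off the confusion matrix; a minimal sketch on synthetic stand-in data (the hospital records are obviously not reproduced):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 350-record MI dataset with five risk factors.
X, y = make_classification(n_samples=350, n_features=5, n_informative=4,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = [("decision tree", DecisionTreeClassifier(random_state=0)),
          ("neural network", MLPClassifier(hidden_layer_sizes=(16,),
                                           max_iter=2000, random_state=0))]
for name, clf in models:
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    print(f"{name}: sensitivity={tp / (tp + fn):.2f}, "
          f"specificity={tn / (tn + fp):.2f}")
```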
Procedia PDF Downloads 429
24403 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning
Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul
Abstract:
In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes, or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracted features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and notch filtered to remove the 60 Hz power line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
Keywords: electrocardiogram, dictionary learning, sparse coding, classification
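A minimal sketch of the data-driven dictionary idea using scikit-learn, with synthetic beat segments standing in for QT-database records (the paper's training algorithm and settings are assumptions here):

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(3)

# Synthetic stand-ins for fixed-length ECG beat segments: 200 segments
# of 64 samples each, with varying morphology plus noise.
t = np.linspace(0, 1, 64)
beats = (np.sin(2 * np.pi * rng.uniform(1, 3, (200, 1)) * t)
         + 0.1 * rng.standard_normal((200, 64)))

# Learn a complete dictionary (n_atoms == segment length) directly from data,
# then encode each beat sparsely with orthogonal matching pursuit.
dico = MiniBatchDictionaryLearning(n_components=64, transform_algorithm="omp",
                                   transform_n_nonzero_coefs=5, random_state=0)
codes = dico.fit(beats).transform(beats)   # sparse codes, one row per beat

# The sparse codes then serve as features for a downstream classifier.
print(codes.shape, (codes != 0).sum(axis=1).mean())  # ~5 nonzeros per beat
```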
Procedia PDF Downloads 380
24402 Uptake of Off-Site Construction: Benefit and Future Application
Authors: Faisal Alazzaz, Andrew Whyte
Abstract:
Off-site construction methods have played an important role in the construction sector over the past few decades. Off-site construction is increasingly becoming a major alternative technique and strategic direction compared to the traditional in-situ method, and it produces a significant amount of value for the construction industry and the economy more generally. To date, an impressive number of studies have been launched on the perception of off-site construction; however, a quantification of its benefits is still lacking. Therefore, this paper examines the recent research literature on the benefits of off-site construction and provides future direction. The paper begins with a brief history and the current value of off-site construction, followed by a detailed discussion of its benefits. These benefits include, but are not limited to, time savings, quality improvement, relief of skills shortages, cost reduction and productivity improvement. Toward this end, off-site construction should learn from other productive industries, such as services or manufacturing, by applying operations management tools and techniques; an extensive focus on employee empowerment will shed light on the future uptake of off-site construction. This study is of value in giving scholars a clear picture of the perceived benefits of off-site construction research, and it provides opportunities for the future uptake of the off-site method.
Keywords: building projects, employee empowerment, off-site construction benefits, productivity
Procedia PDF Downloads 434
24401 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data
Authors: Qiuxiao Chen, Yan Hou, Ning Wu
Abstract:
As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the former is greater than or equal to the latter, all remaining nodes are retained and the curve's compression process is finished. Otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbours are updated, and the same loop is repeated on the compressed curve until termination. Through several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experimental results show that DCA outperforms DPA in both compression accuracy and execution efficiency.
Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost
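The described loop is easy to sketch. In the version below, the deletion cost is assumed to be a node's perpendicular distance to the segment joining its two neighbours (the abstract does not define the cost), and all costs are recomputed on each pass for clarity rather than updating only the two neighbours as the algorithm specifies:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from node p to the segment joining its neighbours a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def dca_compress(curve, threshold):
    """Repeatedly delete the middle node with the minimal deletion cost."""
    nodes = list(curve)
    while len(nodes) > 2:
        costs = [point_segment_distance(nodes[i], nodes[i - 1], nodes[i + 1])
                 for i in range(1, len(nodes) - 1)]
        i_min = min(range(len(costs)), key=costs.__getitem__)
        if costs[i_min] >= threshold:   # cheapest deletion exceeds threshold: stop
            break
        del nodes[i_min + 1]            # endpoints are never deleted
    return nodes

curve = [(0, 0), (1, 0.1), (2, -0.05), (3, 1.5), (4, 1.6), (5, 1.4), (6, 0)]
print(dca_compress(curve, threshold=0.2))
```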
Procedia PDF Downloads 249
24400 Multimedia Container for Autonomous Car
Authors: Janusz Bobulski, Mariusz Kubanek
Abstract:
The main goal of the research is to develop a multimedia container structure containing three types of images: RGB, lidar and infrared, properly calibrated to each other. An additional goal is to develop program libraries for creating, saving and restoring this type of file. It will also be necessary to develop a method of synchronizing the data from the lidar and the RGB and infrared cameras. This type of file could be used in autonomous vehicles, which would certainly facilitate data processing by the intelligent autonomous vehicle management system. Autonomous cars are increasingly entering our consciousness, and no one seems to doubt that self-driving cars are the future of motoring. Manufacturers promise that the first of them will reach showrooms within the next few years. Many experts believe that creating a network of communicating autonomous cars will be able to completely eliminate accidents. However, to make this possible, it is necessary to develop effective methods for detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone; in such situations, detection should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that the individual types of devices return, and, in addition to these differences, the need to synchronize and format these data. The goal of the project is therefore to develop a file structure that can contain the different types of data. Such a file is called a multimedia container: a container that holds many data streams, which allows complete multimedia material to be stored in one file. Among the data streams in such a container should be streams of images, video, sound and subtitles, as well as additional information, i.e., metadata. As preliminary studies have shown, combining RGB and infrared images with lidar data allows for easier data analysis. Thanks to this combination, it will be possible, for example, to display the distance to an object in a colour photo. Such information can be very useful for drivers and for the systems in autonomous cars.
Keywords: autonomous car, image processing, lidar, obstacle detection
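As a minimal sketch of the container idea, one synchronized RGB/infrared/lidar frame triple with a shared timestamp can be packed into a single archive. The format, array shapes and field names below are invented for illustration, not the authors' specification:

```python
import numpy as np

rgb = np.zeros((480, 640, 3), dtype=np.uint8)      # RGB camera frame
infrared = np.zeros((480, 640), dtype=np.uint16)   # infrared frame
lidar = np.zeros((10000, 4), dtype=np.float32)     # x, y, z, intensity points

def save_container(path, rgb, infrared, lidar, timestamp):
    """Pack the three calibrated streams and their common timestamp."""
    np.savez_compressed(path, rgb=rgb, infrared=infrared, lidar=lidar,
                        timestamp=np.float64(timestamp))

def load_container(path):
    """Restore the container as a dict of named arrays."""
    with np.load(path) as data:
        return {name: data[name] for name in data.files}

save_container("frame_000001.npz", rgb, infrared, lidar, timestamp=1715000000.123)
frame = load_container("frame_000001.npz")
print(sorted(frame))   # ['infrared', 'lidar', 'rgb', 'timestamp']
```

A real container would additionally carry per-stream timestamps and calibration metadata so the consumer can verify synchronization.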
Procedia PDF Downloads 223
24399 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm
Authors: Monojit Manna, Arpan Adhikary
Abstract:
Mobile crowdsensing is an emerging paradigm, adopted across the globe, in which users are selected to carry out sensing tasks. In today's urban cities, mobile vehicles are well suited to data sensing and data collection because of their ubiquity and mobility. In this work, we focus on optimally selecting the mobile nodes that can collect the maximum amount of data from urban areas and fulfil future data requirements within a couple of minutes. We map the vehicle requirements onto a budget-constrained data-maximization problem. The implementation is set up to generalize a realistic online platform in which vehicles move continuously in real time, and the data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. To look ahead in time, the work proposes a deep learning-based offline algorithm to predict mobility. A greedy online algorithm is then applied to select a subset of vehicles for this NP-complete problem under a limited budget. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The experimental results not only demonstrate the efficiency of the proposed solution but also prove the validity of DLMV, showing an improvement in the quantity of sensing data collected compared with other algorithms.
Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection
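The budgeted greedy selection step can be sketched independently of the mobility predictor. Vehicle names, costs and coverage sets below are invented, with the DLMV predictor assumed to supply each vehicle's predicted coverage:

```python
def greedy_select(vehicles, budget):
    """Pick vehicles maximizing marginal covered cells per unit cost."""
    chosen, covered, spent = [], set(), 0.0
    remaining = dict(vehicles)   # name -> (cost, set of covered cells)

    while remaining:
        def gain_per_cost(item):
            _, (cost, cells) = item
            return len(cells - covered) / cost

        name, (cost, cells) = max(remaining.items(), key=gain_per_cost)
        if spent + cost > budget or not (cells - covered):
            break                # budget exhausted or no marginal coverage left
        chosen.append(name)
        covered |= cells
        spent += cost
        del remaining[name]
    return chosen, covered, spent

vehicles = {"v1": (3.0, {1, 2, 3, 4}), "v2": (2.0, {3, 4, 5}),
            "v3": (1.0, {6}), "v4": (2.5, {1, 5, 6, 7})}
print(greedy_select(vehicles, budget=5.0))
```

Greedy marginal-gain selection is the usual heuristic for such NP-complete budgeted coverage problems, trading optimality for an immediate online decision.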
Procedia PDF Downloads 76
24398 Risk Based Maintenance Planning for Loading Equipment in Underground Hard Rock Mine: Case Study
Authors: Sidharth Talan, Devendra Kumar Yadav, Yuvraj Singh Rajput, Subhajit Bhattacharjee
Abstract:
The mining industry is known for its appetite for spending sizeable capital on mine equipment. In the current scenario, however, the industry is challenged by the daunting factors of non-uniform geological conditions, uneven ore grades, uncontrollable and volatile mineral commodity prices, and the ever-increasing quest to optimize capital and operational costs. Thus, equipment reliability and maintenance planning play a significant role in augmenting equipment availability for the operation and, in turn, boosting mine productivity. This paper presents the Risk Based Maintenance (RBM) planning conducted on mine loading equipment, namely Load Haul Dumpers (LHDs), at Sindesar Khurd Mines, an underground zinc and lead mine situated in Dariba, Rajasthan, India, operated by Hindustan Zinc Limited, a subsidiary of Vedanta Resources Ltd. The mining equipment at the location is maintained by the Original Equipment Manufacturers (OEMs), namely Sandvik and Atlas Copco, who carry out the maintenance and inspection operations. Based on the downtime data extracted for the equipment fleet over the six months from 1st January 2017 until 30th June 2017, Pareto analysis revealed that three downtime issues, related to the engine, hydraulics, and transmission, contributed significantly and were common across the loading fleet. Further scrutiny of these factors through bubble matrix analysis revealed the major influence of selected factors, namely overheating, No Load Taken (NTL) issues, gear-changing issues, and hose puncture and leakage issues. Utilizing the equipment-wise analysis of all the downtime factors, the spares consumed, and the alarm logs extracted from the machines, technical design changes to the equipment and a pre-shift critical alarms checklist were proposed for equipment maintenance. This analysis helps OEMs and mine management focus on the critical issues hampering the reliability of mine equipment and design the maintenance strategies necessary to mitigate them.
Keywords: bubble matrix analysis, LHDs, OEMs, Pareto chart analysis, spares consumption matrix, critical alarms checklist
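The Pareto step reduces to ranking downtime causes and accumulating their shares; a minimal sketch with invented downtime hours (not the mine's records):

```python
# Rank downtime causes by hours lost; the causes making up roughly the
# first 80% of cumulative downtime are the "vital few" to target first.
downtime_hours = {"Engine": 420, "Hydraulics": 350, "Transmission": 290,
                  "Electrical": 90, "Tyres": 60, "Brakes": 40}

total = sum(downtime_hours.values())
cumulative = 0
for cause, hours in sorted(downtime_hours.items(), key=lambda kv: -kv[1]):
    cumulative += hours
    share = 100 * cumulative / total
    print(f"{cause:12s} {hours:4d} h   cumulative share {share:5.1f}%")
```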
Procedia PDF Downloads 152
24397 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations
Authors: Ramon Santana
Abstract:
The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed; however, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; furthermore, this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of previously proposed models, basing its security on the infeasibility of polynomial reconstruction.
Keywords: fingerprint, template protection, bio-cryptography, minutiae protection
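As a generic illustration of the two levels named, and emphatically not the author's actual scheme: pairwise distances give rotation- and translation-invariant data, and a key-derived polynomial evaluated at those invariants yields a protected representation:

```python
import hashlib
import math

# Invented minutiae coordinates (x, y); a real template also carries angles.
minutiae = [(112, 80), (150, 95), (131, 140), (90, 120)]

# Level 1 (illustrative): pairwise distances are invariant to rotation
# and translation of the whole print.
invariants = sorted(
    math.hypot(x1 - x2, y1 - y2)
    for i, (x1, y1) in enumerate(minutiae)
    for (x2, y2) in minutiae[i + 1:]
)

# Level 2 (illustrative): evaluate a polynomial whose coefficients are
# derived from a secret key; only the evaluations are stored.
key = hashlib.sha256(b"user-secret").digest()
coeffs = [b + 1 for b in key[:4]]    # toy key-derived coefficients
protected = [sum(c * d ** k for k, c in enumerate(coeffs)) for d in invariants]
print(protected[:3])
```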
Procedia PDF Downloads 168