Search results for: data sensitivity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26513

24833 Analysis of Business Intelligence Tools in Healthcare

Authors: Avishkar Gawade, Omkar Bansode, Ketan Bhambure, Bhargav Deore

Abstract:

In recent years, a wide range of business intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from data stores. BI tools are usually used in the public health field for financial and administrative purposes. BI uses a dashboard in the presentation stage to deliver information to end users. In this paper, we intend to analyze some open-source BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. A pervasive BI platform was developed using a real case in order to prove the tool's viability. The analysis of the various BI tools is done with the help of several parameters, such as data security, data integration, data quality, reporting and analytics, performance, scalability and cost effectiveness.

Keywords: CDSS, EHR, business intelligence, tools

Procedia PDF Downloads 137
24832 Comparative Analysis of Different Land Use Land Cover (LULC) Maps in WRF Modelling over the Indian Region

Authors: Sen Tanmoy, Jain Sarika, Panda Jagabandhu

Abstract:

Studies regarding the impact of urbanization using the WRF-ARW model rely heavily on the static geographical information selected, including the domain configuration and land use land cover (LULC) data. Accurate representation of LULC data provides essential information for understanding urban growth and simulating meteorological parameters such as temperature, precipitation, etc. Researchers use different LULC data as per availability and their requirements. As far as India is concerned, we have very limited resources and data availability, so it is important to understand how we can optimize our results using limited LULC data. In this review article, we explore how a LULC map is generated from different sources in the Indian context and what its significance is in WRF-ARW modeling for studying urbanization, climate change or other meteorological parameters. Bibliometric analyses were also performed in this review article based on countries of study and indexed keywords. Finally, some key points are marked out for selecting the most suitable LULC map for any urbanization-related study.

Keywords: LULC, LULC mapping, LANDSAT, WRF-ARW, ISRO, bibliometric analysis

Procedia PDF Downloads 27
24831 Prevalence, Antimicrobial Susceptibility Pattern and Public Health Significance of Staphylococcus aureus Isolated from Raw Red Meat at Butchery and Abattoir Houses in Mekelle, Northern Ethiopia

Authors: Haftay Abraha Tadesse

Abstract:

Background: Staphylococcus is a genus of bacteria distributed worldwide and correlated with infections at several sites in humans and animals. They are among the most important causes of infection associated with the consumption of contaminated food. Objective: The objective of this study was to determine the prevalence, antimicrobial susceptibility patterns and public health significance of Staphylococcus aureus in raw meat from butchery and abattoir houses of Mekelle, Northern Ethiopia. Methodology: A cross-sectional study was conducted from April to October 2019. Sociodemographic and public health data were collected using a predesigned questionnaire. Raw meat samples were collected aseptically at the butchery and abattoir houses and transported in an ice box to Mekelle University, College of Veterinary Sciences, for the isolation and identification of Staphylococcus aureus. Antimicrobial susceptibility was determined by the disc diffusion method. Data obtained were cleaned and entered into STATA 22.0, and a logistic regression model with odds ratios was used to assess the association of risk factors with bacterial contamination. A p-value < 0.05 was considered statistically significant. Results: In the present study, 88 out of 250 samples (35.2%) were found to be contaminated with Staphylococcus aureus. The positivity rates of Staphylococcus aureus were 37.6% (n=47) and 32.8% (n=41) for butchery and abattoir houses, respectively. Among the assessed risk factors, not using gloves (AOR=0.222; 95% CI: 0.104-0.473), strict separation between clean and dirty areas (AOR=1.37; 95% CI: 0.66-2.86) and poor hand-washing habits (AOR=1.08; 95% CI: 0.35-3.35) were found to be statistically significant and associated with Staphylococcus aureus contamination. All thirty-seven Staphylococcus aureus isolates checked from butchery houses were sensitive (100%) to doxycycline, trimethoprim, gentamicin, sulphamethoxazole, amikacin, CN, co-trimoxazole and nitrofurantoin, whereas they showed resistance to cefotaxime (100%), ampicillin (87.5%), penicillin (75%), B (75%), and nalidixic acid (50%). On the other hand, all Staphylococcus aureus isolates from abattoir houses (100%, n=10) were sensitive to chloramphenicol, gentamicin and nitrofurantoin, whereas they showed 100% resistance to penicillin, B, AMX, ceftriaxone, ampicillin and cefotaxime. The overall multidrug resistance pattern for Staphylococcus aureus was 90% and 100% at butchery and abattoir houses, respectively. Conclusion: Staphylococcus aureus was recovered from 35.2% of the raw meat samples collected from the butchery and abattoir houses. More has to be done to develop hand-washing behavior and to ensure the availability of safe water in butchery houses in order to reduce the burden of bacterial contamination. The present findings highlight the need to implement protective measures against food contamination and to consider alternative drug options. The development of antimicrobial resistance is nearly always a result of repeated therapeutic and/or indiscriminate use of antimicrobials. Regular antimicrobial sensitivity testing helps to select effective antibiotics and to reduce the problem of drug resistance developing towards commonly used antibiotics.
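The adjusted odds ratios above come from a logistic regression fit in STATA; as a rough illustration of the underlying arithmetic, the sketch below computes a crude (unadjusted) odds ratio with a Woolf 95% confidence interval from a 2x2 table. The counts and the `odds_ratio` helper are hypothetical, not taken from the study.

```python
import math

def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Crude odds ratio and Woolf 95% CI from a 2x2 contingency table."""
    or_ = (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1/exposed_pos + 1/exposed_neg +
                       1/unexposed_pos + 1/unexposed_neg)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: contamination among samples handled with vs. without gloves
or_value, ci = odds_ratio(20, 80, 60, 90)
print(round(or_value, 3), tuple(round(x, 3) for x in ci))
```

An OR below 1 with a CI excluding 1, as in the glove-use result above, indicates a protective association.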

Keywords: abattoir houses, antimicrobial resistance, butchery houses, Ethiopia, Staphylococcus aureus, MDR

Procedia PDF Downloads 74
24830 Evaluation of P16, Human Papillomavirus Capsid Protein L1 and Ki67 in Cervical Intraepithelial Lesions: Potential Utility in Diagnosis and Prognosis

Authors: Hanan Alsaeid Alshenawy

Abstract:

Background: Cervical dysplasia, which is potentially precancerous, has increased in young women. Detection of cervical precancerous lesions is important for reducing morbidity and mortality from cervical cancer. This study analyzes the immunohistochemical expression of p16, HPV L1 capsid protein and Ki67 in cervical intraepithelial lesions and correlates them with lesion grade in order to develop a set of markers for diagnosis and for determining the prognosis of cervical cancer precursors. Methods: 75 specimens were analyzed, including 15 cases of CIN 1, 28 of CIN 2, 20 of CIN 3, and 12 of cervical squamous carcinoma, besides 10 normal cervical tissues. They were stained for p16, HPV L1 and Ki-67. Sensitivity, specificity, predictive values and accuracy were evaluated for each marker. Results: p16 expression increased during the progression from CIN 1 to carcinoma. HPV L1 positivity was detected in CIN 2 and decreased gradually as the CIN grade increased, disappearing in carcinoma. Strong Ki-67 expression was observed in high-grade CIN and carcinoma. p16, HPV L1 and Ki67 were sensitive but had variable specificity in detecting CIN lesions. Conclusions: p16, HPV L1 and Ki67 are a useful set of markers for establishing the risk of high-grade CIN. They complement each other to reach an accurate diagnosis and prognosis.
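The sensitivity, specificity, predictive values and accuracy evaluated for each marker follow directly from the counts of true/false positives and negatives. A minimal sketch, with hypothetical counts rather than the study's actual figures:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-test metrics from 2x2 confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical scoring of one marker across 75 lesions + 10 normal tissues
m = diagnostic_metrics(tp=70, fp=2, tn=8, fn=5)
print({k: round(v, 3) for k, v in m.items()})
```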

Keywords: p16, HPV L1, Ki67, CIN, cervical carcinoma

Procedia PDF Downloads 341
24829 Data Projects for “Social Good”: Challenges and Opportunities

Authors: Mikel Niño, Roberto V. Zicari, Todor Ivanov, Kim Hee, Naveed Mushtaq, Marten Rosselli, Concha Sánchez-Ocaña, Karsten Tolle, José Miguel Blanco, Arantza Illarramendi, Jörg Besier, Harry Underwood

Abstract:

One of the application fields for data analysis techniques and technologies gaining momentum is the area of social good or “common good”, covering cases related to humanitarian crises, global health care, or ecology and environmental issues, among others. The promotion of data-driven projects in this field aims at increasing the efficacy and efficiency of social initiatives, improving the way these actions help humanity in general and people in need in particular. This application field, however, poses its own barriers and challenges when developing data-driven projects, lagging behind in comparison with other scenarios. These challenges derive from aspects such as the scope and scale of the social issue to solve, cultural and political barriers, the skills of main stakeholders and the technological resources available, the motivation to be engaged in such projects, or the ethical and legal issues related to sensitive data. This paper analyzes the application of data projects in the field of social good, reviewing its current state and noteworthy initiatives, and presenting a framework covering the key aspects to analyze in such projects. The goal is to provide guidelines to understand the main challenges and opportunities for this type of data project, as well as identifying the main differential issues compared to “classical” data projects in general. A case study is presented on the initial steps and stakeholder analysis of a data project for the inclusion of refugees in the city of Frankfurt, Germany, in order to empirically confront the framework with a real example.

Keywords: data-driven projects, humanitarian operations, personal and sensitive data, social good, stakeholders analysis

Procedia PDF Downloads 327
24828 Slugging Frequency Correlation for High Viscosity Oil-Gas Flow in Horizontal Pipeline

Authors: B. Y. Danjuma, A. Archibong-Eso, Aliyu M. Aliyu, H. Yeung

Abstract:

In this experimental investigation, new data on slugging frequency for high-viscosity oil-gas flow are reported. Scale experiments were carried out using a mixture of air and mineral oil as the liquid phase in a 17 m long horizontal pipe with a 0.0762 m internal diameter. The data set was acquired using two high-speed gamma densitometers at a data acquisition frequency of 250 Hz over a time interval of 30 seconds. For the range of flow conditions investigated, an increase in liquid oil viscosity was observed to strongly influence the slug frequency. A comparison of the present data with prediction models available in the literature revealed large discrepancies. A new correlation incorporating the effect of viscosity on slug frequency has been proposed for horizontal flow, which represents the main contribution of this work.

Keywords: gamma densitometer, flow pattern, pressure gradient, slug frequency

Procedia PDF Downloads 412
24827 Transferring Data from Glucometer to Mobile Device via Bluetooth with Arduino Technology

Authors: Tolga Hayit, Ucman Ergun, Ugur Fidan

Abstract:

Being healthy is undoubtedly an indispensable necessity of human life. With technological improvements, various health monitoring and imaging systems have been developed in the literature to satisfy health needs. In this context, monitoring and recording individual health data via wireless technology is also part of these studies. Mobile devices, which are found in almost every house, have become indispensable to our lives and carry a wireless technology infrastructure, play an important role in enabling health follow-up everywhere and at any time, because such devices are used in health monitoring systems. In this study, Arduino, an open-source microcontroller card, was used, to which a sample sugar-measuring device was connected in series. In this way, the glucose data (glucose ratio, time) obtained with the glucometer are transferred over Bluetooth to a mobile device based on the Android operating system. A mobile application was developed using the Apache Cordova framework for listing data, presenting them graphically, and reading data over Arduino. HTML, JavaScript and CSS are used with Apache Cordova in the coding section. The data received from the glucometer are stored in the local database of the mobile device. The intention is that people can transfer their measurements to their mobile devices using wireless technology and access graphical representations of their data. In this context, the aim of the study is to enable health monitoring using the different wireless technologies that present-day mobile devices support, thus contributing to other work done in this area.

Keywords: Arduino, Bluetooth, glucose measurement, mobile health monitoring

Procedia PDF Downloads 322
24826 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes to be introduced by the GDPR, the European legislator envisaged a two-year transitional period: member states and companies had to prepare for the new regulation by 25 May 2018. The idea that represents a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures; the Polish implementing rules, for example, indicate even how long a password should be. According to the new approach, from May 2018 controllers and processors are obliged to apply security measures adequate to the level of risk associated with the specific data processing. Risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. The GDPR does not indicate which security measures should be applied; the recitals give only examples, such as anonymisation or encryption. It is the controller's decision what type of security measures to consider sufficient, and the controller will be responsible if these measures are not sufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 mentions risk and high risk, but some lawyers argue that there is one more category: low risk/no risk. Low-risk/no-risk data processing is a situation that is unlikely to result in a risk to the rights and freedoms of natural persons.
The GDPR also mentions types of data processing for which a controller does not have to evaluate the level of risk because they have been classified as "high risk" processing, e.g. large-scale processing of special categories of data, or processing using new technologies. The methodology includes an analysis of legal regulations, e.g. the GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.
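The three risk tiers discussed above are often operationalised in practice as a likelihood-times-severity matrix. The sketch below is purely illustrative: the GDPR prescribes no numeric scale, so the 1-5 scores and the tier thresholds are assumptions, not regulatory requirements.

```python
def classify_risk(likelihood, severity):
    """Map likelihood and severity scores (1-5 each) to GDPR-style risk tiers.

    The tiers mirror the low/no risk, risk, and high risk categories
    discussed above; the numeric thresholds are illustrative only.
    """
    score = likelihood * severity
    if score <= 4:
        return "low/no risk"
    if score <= 14:
        return "risk"
    return "high risk"  # e.g. large-scale special-category processing

print(classify_risk(1, 2))  # unlikely breach, minor impact
print(classify_risk(3, 4))
print(classify_risk(5, 5))
```

The point of the example is the structural one made in the abstract: the controller, not the regulator, chooses the scale and thresholds, and bears the responsibility if they are wrong.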

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 252
24825 UAV’s Enhanced Data Collection for Heterogeneous Wireless Sensor Networks

Authors: Kamel Barka, Lyamine Guezouli, Assem Rezki

Abstract:

In this article, we propose a protocol called DataGA-DRF (a protocol for Data collection using a Genetic Algorithm through Dynamic Reference Points) that collects data from heterogeneous wireless sensor networks. This protocol is based on DGA (Destination selection according to a Genetic Algorithm) to control the movement of the UAV (unmanned aerial vehicle) between dynamic reference points that virtually represent the sensor node deployment. The dynamics of these points ensure an even distribution of energy consumption among the sensors and also improve network performance. To determine the best points, DataGA-DRF uses a classification algorithm such as K-Means.
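As a sketch of the K-Means step that places the reference points, the snippet below clusters hypothetical sensor coordinates and returns the cluster centroids, which could serve as the UAV's reference points. This is a plain textbook K-Means, not the DataGA-DRF implementation, and the node coordinates are invented.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on 2-D points: returns k centroids (reference points)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                  (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # Move each centroid to the mean of its cluster
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

# Hypothetical sensor deployment: two spatial groups of nodes
nodes = [(1, 1), (2, 1), (1, 2), (9, 9), (10, 9), (9, 10)]
print(sorted(kmeans(nodes, 2)))
```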

Keywords: heterogeneous wireless networks, unmanned aerial vehicles, reference points, data collection, genetic algorithm

Procedia PDF Downloads 82
24824 Mesoporous Carbon Ceramic SiO2/C Prepared by Sol-Gel Method and Modified with Cobalt Phthalocyanine and Used as an Electrochemical Sensor for Nitrite

Authors: Abdur Rahim, Lauro Tatsuo Kubota, Yoshitaka Gushikem

Abstract:

Carbon ceramic mesoporous SiO2/50wt%C (SBET = 170 m2 g-1), where C is graphite, was prepared by the sol-gel method. Scanning electron microscopy images and the respective element mapping showed that, within the magnification used, no phase segregation was detectable. The material presented an electric conductivity of 0.49 S cm-1. It was used to support cobalt phthalocyanine, prepared in situ, to ensure a homogeneous dispersion of the electroactive complex in the pores of the matrix. The surface density of cobalt phthalocyanine on the matrix surface was 0.015 mol cm-2. A pressed disk made with SiO2/50wt%C/CoPc was used to fabricate an electrode, which was tested as a sensor for nitrite determination by an electrochemical technique. A linear response range between 0.039 and 0.42 mmol l−1 and a correlation coefficient of r = 0.9996 were obtained. The electrode was chemically very stable and presented very high sensitivity for this analyte, with a limit of detection, LOD = 1.087 x 10-6 mol L-1.
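Figures of merit like the linear range, correlation coefficient, and LOD quoted above come from a calibration line. A minimal sketch of one common convention, LOD = 3·sigma_blank/slope, using invented calibration data (the concentrations, currents and blank noise below are not the paper's):

```python
def linear_fit(x, y):
    """Least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
             sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def limit_of_detection(blank_sd, slope):
    """Common convention: LOD = 3 * sigma of blank signal / sensitivity."""
    return 3 * blank_sd / slope

# Hypothetical calibration: sensor current (uA) vs nitrite conc. (mmol/L)
conc = [0.05, 0.10, 0.20, 0.30, 0.40]
current = [0.52, 1.01, 2.03, 2.98, 4.01]
slope, intercept = linear_fit(conc, current)
print(round(slope, 2), limit_of_detection(0.004, slope))
```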

Keywords: SiO2/C/CoPc, sol-gel method, electrochemical sensor, nitrite oxidation, carbon ceramic material, cobalt phthalocyanine

Procedia PDF Downloads 317
24823 HPPDFIM-HD: Transaction Distortion and Connected Perturbation Approach for Hierarchical Privacy Preserving Distributed Frequent Itemset Mining over Horizontally-Partitioned Dataset

Authors: Fuad Ali Mohammed Al-Yarimi

Abstract:

Many algorithms have been proposed to provide privacy preservation in data mining. These protocols are based on two main approaches: the perturbation approach and the cryptographic approach. The first is based on perturbation of the valuable information, while the second uses cryptographic techniques. The perturbation approach is much more efficient but with reduced accuracy, while the cryptographic approach can provide solutions with perfect accuracy. However, the cryptographic approach is a much slower method and requires considerable computation and communication overhead. In this paper, a new scalable protocol is proposed which combines the advantages of perturbation and distortion with the cryptographic approach to perform privacy-preserving distributed frequent itemset mining on horizontally distributed data. Both the privacy and the performance characteristics of the proposed protocol are studied empirically.
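The Gaussian perturbation side of such a hybrid scheme can be sketched as adding zero-mean noise to itemset support counts before they leave a site. This toy `perturb_supports` helper and its counts are illustrative only, not the paper's protocol.

```python
import random

def perturb_supports(supports, sigma, seed=42):
    """Add zero-mean Gaussian noise to itemset support counts before sharing.

    Illustrates the perturbation half of a hybrid privacy scheme; counts are
    clipped at zero so the distorted values remain plausible.
    """
    rng = random.Random(seed)
    return [max(0.0, s + rng.gauss(0.0, sigma)) for s in supports]

# Hypothetical local support counts for four frequent itemsets
true_supports = [120, 75, 30, 5]
noisy = perturb_supports(true_supports, sigma=4.0)
print([round(v, 1) for v in noisy])
```

The efficiency/accuracy trade-off named in the abstract is visible here: the noise is cheap to add, but downstream support estimates are only approximate.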

Keywords: anonymity data, data mining, distributed frequent itemset mining, gaussian perturbation, perturbation approach, privacy preserving data mining

Procedia PDF Downloads 505
24822 Implementation of Data Science in the Field of Homologation

Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande

Abstract:

For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in a lot of countries. The final deliverables of product homologation are certificates. In the world of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data from it, such as the expiry date, approval date, etc. It is most important to get accurate data from the certificate, as inaccuracy may lead to missed re-homologation of certificates, which would result in a non-compliance situation. There is scope for automating the reading of certificate data in the field of homologation, and we use deep learning as a tool for this automation. We first trained a model by providing all countries' basic data. The model was trained only once, by feeding in PDF and JPG files through an ETL process; the trained model will then give more accurate results later. As an outcome, we get the expiry date and approval date of a certificate with a single click. This will eventually help to implement automation features at a broader level in the database where certificates are stored, reducing human error to almost negligible levels.
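Once certificate text has been recognised, pulling out candidate approval and expiry dates is largely a pattern-matching step. A minimal sketch, assuming just two date formats (real certificates vary widely by country and language, and the patterns and sample text below are invented):

```python
import re
from datetime import datetime

# Hypothetical patterns; real certificates vary by country and language
DATE_PATTERNS = [
    (r"\d{2}\.\d{2}\.\d{4}", "%d.%m.%Y"),   # e.g. German-style 12.03.2021
    (r"\d{4}-\d{2}-\d{2}", "%Y-%m-%d"),     # ISO 8601
]

def extract_dates(text):
    """Pull recognisable dates out of certificate text, oldest first."""
    found = []
    for pattern, fmt in DATE_PATTERNS:
        for match in re.findall(pattern, text):
            found.append(datetime.strptime(match, fmt))
    return sorted(found)

sample = "Approval date: 12.03.2021 ... Expiry: 2026-03-11"
print([d.date().isoformat() for d in extract_dates(sample)])
```

In the pipeline described above, a step like this would sit downstream of the trained model, turning recognised text into structured fields for the certificate database.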

Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)

Procedia PDF Downloads 163
24821 Additive Weibull Model Using Warranty Claims and Finite Element Fatigue Analysis

Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate

Abstract:

This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models to forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, and most of the time they cover only the infant mortality and useful-life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period in less time and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
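The combination step can be sketched from standard Weibull algebra: in an additive model the two hazard rates sum, so the cumulative hazards add in the exponent of the reliability function. The shape/scale parameters below are hypothetical stand-ins for the warranty-fitted and fatigue-fitted Weibulls, not values from the paper.

```python
import math

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_weibull_reliability(t, b1, e1, b2, e2):
    """R(t) for the additive model: hazards sum, so cumulative
    hazards add in the exponent: R(t) = exp(-(t/e1)^b1 - (t/e2)^b2)."""
    return math.exp(-((t / e1) ** b1 + (t / e2) ** b2))

# Hypothetical parameters: warranty fit (early life, beta < 1) plus
# FEA-fatigue fit (wear-out, beta > 1)
b1, e1 = 0.8, 5000.0   # from warranty claims (hours)
b2, e2 = 3.5, 9000.0   # from S-N curve / fatigue life (hours)
for t in (1000.0, 6000.0, 12000.0):
    h = weibull_hazard(t, b1, e1) + weibull_hazard(t, b2, e2)
    print(t, round(additive_weibull_reliability(t, b1, e1, b2, e2), 4),
          f"{h:.2e}")
```

With beta < 1 for one term and beta > 1 for the other, the summed hazard reproduces the decreasing-then-increasing shape of the bathtub curve described above.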

Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull

Procedia PDF Downloads 73
24820 Mechanistic Modelling to De-risk Process Scale-up

Authors: Edwin Cartledge, Jack Clark, Mazaher Molaei-Chalchooghi

Abstract:

The mixing in the crystallization step of active pharmaceutical ingredient manufacture was studied via advanced modeling tools to enable a successful scale-up. A virtual representation of the vessel was created, and computational fluid dynamics was used to simulate multiphase flow and, thus, the mixing environment within this vessel. The study identified a significant dead zone in the vessel underneath the impeller and found that increasing the impeller speed and power did not improve the mixing. A series of sensitivity analyses found that to improve mixing, the vessel had to be redesigned, and that optimal mixing could be obtained by adding two extra cylindrical baffles. The same two baffles from the simulated environment were then constructed and added to the process vessel. By identifying these potential issues before starting manufacture and modifying the vessel to ensure good mixing, this study mitigated a failed crystallization and potential batch disposal, which could have resulted in a significant loss of high-value material.

Keywords: active pharmaceutical ingredient, baffles, computational fluid dynamics, mixing, modelling

Procedia PDF Downloads 97
24819 Data-Focused Digital Transformation for Smart Net-Zero Cities: A Systems Thinking Approach

Authors: Farzaneh Mohammadi Jouzdani, Vahid Javidroozi, Monica Mateo Garcia, Hanifa Shah

Abstract:

The emergence of smart net-zero cities in recent years has attracted significant attention and interest from worldwide communities and scholars as a potential solution to the critical requirement for urban sustainability. This research-in-progress paper investigates the development of smart net-zero cities in order to propose a digital transformation roadmap for them, with a primary focus on data. Employing systems thinking as an underpinning theory, the study advocates the necessity of a holistic strategy for understanding the complex interdependencies and interrelationships that characterise urban systems. The proposed methodology involves an in-depth investigation of current data-driven approaches in the smart net-zero city, followed by predictive analysis methods to evaluate the holistic impact of these approaches on moving toward a smart net-zero city. The expected outcome is a systemic intervention followed by a data-focused and systemic digital transformation roadmap for smart net-zero cities, contributing to a more holistic understanding of urban sustainability.

Keywords: smart city, net-zero city, digital transformation, systems thinking, data integration, data-driven approach

Procedia PDF Downloads 23
24818 Chinese College Students’ Intercultural Competence and Culture Learning Through Telecollaboration

Authors: Li Yuqing

Abstract:

Fostering the development of intercultural (communicative) competence (IC) is one way to equip our students with the linguistic and cultural skills to communicate effectively with people from diverse backgrounds, particularly English majors who are most likely to encounter multicultural work environments in the future. The purpose of this study is to compare the English majors' intercultural competence in terms of cognitive, affective, and behavioral aspects before and after a ten-week telecollaboration program between 23 English majors at a Chinese university and 23 American students enrolled in a Chinese class at an American university, and analyze their development during the program. The results indicate that subjects' cognitive, affective, and behavioral perceptions of IC improved significantly over time. In addition, the program had significant effects on the participants' “Interaction Confidence,” “Interaction Engagement,” and “Interaction Enjoyment” - three components of intercultural sensitivity - as well as their overall intercultural effectiveness (except for “Message Skills”). With the widespread use of the internet, this type of online cultural exchange has a promising future, as suggested by the findings of the current study.

Keywords: intercultural competence, English majors, computer-mediated communication, telecollaboration

Procedia PDF Downloads 74
24817 Optimal Configuration for Polarimetric Surface Plasmon Resonance Sensors

Authors: Ibrahim Watad, Ibrahim Abdulhalim

Abstract:

Conventional spectroscopic surface plasmon resonance (SPR) sensors are widely used, both in fundamental research and in environmental monitoring as well as healthcare diagnostics. However, they still lack a low limit of detection (LOD), and there is still room for improvement. Conventional SPR sensors are based on the detection of a dip in the reflectivity spectrum, which is relatively wide. To improve the performance of these sensors, many techniques and methods have been proposed, either to reduce the width of the dip or to increase the sensitivity. In addition, profiting from the sharp jump in the phase spectrum under SPR, several works have suggested extracting the phase of the reflected wave. However, existing phase measurement setups are in general more complicated than conventional setups, require more stability, and are very sensitive to external vibrations and noise. In this study, a simple polarimetric technique for phase extraction under SPR is presented, followed by a theoretical error analysis and an experimental verification. The advantages of the proposed technique over existing techniques are elaborated, together with conclusions regarding the best polarimetric function and its corresponding optimal range of metal layer thicknesses under the conventional Kretschmann-Raether configuration.

Keywords: plasmonics, polarimetry, thin films, optical sensors

Procedia PDF Downloads 404
24816 Effect of Machining Induced Microstructure Changes on the Edge Formability of Titanium Alloys at Room Temperature

Authors: James S. Kwame, E. Yakushina, P. Blackwell

Abstract:

The challenges in forming titanium alloys at room temperature are well researched and are linked both to the limitations imposed by the basic crystal structure and their ability to form texture during plastic deformation. One major issue of concern for the sheet forming of titanium alloys is their high sensitivity to surface inhomogeneity. Various machining processes are utilised in preparing sheet hole edges for edge flanging applications. However, the response of edge forming tendencies of titanium to different edge surface finishes is not well investigated. The hole expansion test is used in this project to elucidate the impact of abrasive water jet (AWJ) and electro-discharge machining (EDM) cutting techniques on the edge formability of CP-Ti (Grade 2) and Ti-3Al-2.5V alloys at room temperature. The results show that the quality of the edge surface finish has a major effect on the edge formability of the materials. The work also found that the variations in the edge forming performance are mainly the result of the influence of machining induced edge surface defects.

Keywords: titanium alloys, hole expansion test, edge formability, non-conventional machining

Procedia PDF Downloads 137
24815 Optimizing Exposure Parameters in Digital Mammography: A Study in Morocco

Authors: Talbi Mohammed, Oustous Aziz, Ben Messaoud Mounir, Sebihi Rajaa, Khalis Mohammed

Abstract:

Background: Breast cancer is the leading cause of death for women around the world. Screening mammography is the reference examination, due to its sensitivity for detecting small lesions and micro-calcifications. Therefore, it is essential to ensure quality mammographic examinations at the most optimal dose, and these conditions depend on the choice of exposure parameters. Clinically, practices must be evaluated in order to determine the most appropriate exposure parameters. Material and Methods: We performed our measurements on a mobile mammography unit (PLANMED Sofie-classic) in Morocco. A solid dosimeter (AGMS Radcal) and an MTM 100 phantom allow the delivered dose and the image quality to be quantified. For image quality assessment, scores are defined by the rate of visible inserts (MTM 100 phantom), obtained and compared for each acquisition. Results: The results show that the parameters of the mammography unit on which we made our measurements can be improved in order to offer a better compromise between image quality and breast dose. The dose can be reduced by 13.27% to 22.16% while preserving comparable image quality.

Keywords: mammography, breast dose, image quality, phantom

Procedia PDF Downloads 172
24814 Analysis of an Alternative Data Base for the Estimation of Solar Radiation

Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag

Abstract:

The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. Sizing thermal or photovoltaic systems requires a solar irradiation database. Brazil still has a limited number of meteorological stations providing solarimetric measurements, which makes reanalysis systems a significant alternative to ground-based data. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2) and includes a four-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km, with 60 vertical levels from the surface up to 0.1 hPa. This work makes a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results indicate whether the reanalysis can serve as an alternative for estimating the solar energy potential of a study region.
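Comparisons between reanalysis and station data of this kind are typically summarized with error statistics such as the root mean square error (RMSE) and the mean bias. A minimal sketch, using illustrative irradiation values rather than numbers from the atlas or from ERA-Interim:

```python
import math

def rmse(obs, mod):
    """Root mean square error between observed and modelled series."""
    return math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, mod)) / len(obs))

def mean_bias(obs, mod):
    """Mean bias (model minus observation); positive = overestimation."""
    return sum(m - o for o, m in zip(obs, mod)) / len(obs)

# Illustrative monthly irradiation values (kWh/m^2/day), not data from this study.
observed    = [5.9, 5.4, 4.8, 3.9, 3.1, 2.7]
era_interim = [5.7, 5.5, 4.6, 4.0, 3.0, 2.9]

print(f"RMSE = {rmse(observed, era_interim):.3f}")
print(f"Bias = {mean_bias(observed, era_interim):.3f}")
```

A small RMSE and near-zero bias would support using the reanalysis where no observed network exists.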

Keywords: energy potential, reanalyses, renewable energy, solar radiation

Procedia PDF Downloads 164
24813 Big Data Analytics and Public Policy: A Study in Rural India

Authors: Vasantha Gouri Prathapagiri

Abstract:

Innovations in the ICT sector facilitate a better quality of life for citizens across the globe. Countries that adopt new ICT techniques, such as big data analytics, find it easier to fulfil the needs of their citizens. Big data is characterised by its volume, variety, and velocity; analytics involves processing it in a cost-effective way in order to draw conclusions for useful application. Big data analytics also draws on machine learning and artificial intelligence, leading to accurate data presentation useful for public policy making. Using data analytics in policy making is thus a proper way to march towards the all-round development of any country: data-driven insights can help a government take important strategic decisions regarding socio-economic development. Developed nations like the UK and the USA are already far ahead on the path of digitisation with the support of big data analytics. India is a huge country currently undergoing massive digitisation through the Digital India Mission, with internet connections per household rising every year. This translates into a massive data set that has the potential to turn the public services delivery system into an effective service mechanism for Indian citizens. In fact, when compared to developed nations, this capacity is underutilised in India, particularly in the administrative system in rural areas. The present paper focuses on the need for adopting big data analytics in Indian rural administration and its contribution towards faster development of the country. The results of the research highlight the need for increasing awareness of big data analytics and its utility, and for serious capacity building, among government personnel working for rural development.
Multiple public policies are framed and implemented for rural development, yet the results are not as effective as they should be. Big data has a major role to play in this context, as it can assist in improving both policy making and implementation aimed at the all-round development of the country.

Keywords: Digital India Mission, public service delivery system, public policy, Indian administration

Procedia PDF Downloads 159
24812 Parametric Optimization of Wire Electric Discharge Machining (WEDM) for Aluminium Metal Matrix Composites

Authors: G. Rajyalakhmi, C. Karthik, Gerson Desouza, Rimmie Duraisamy

Abstract:

In the present work, aluminium metal matrix composites reinforced with SiC/Al2O3 were fabricated using the stir casting technique. The objective is to optimize the process parameters of Wire Electric Discharge Machining (WEDM) of these composites. Pulse ON time, pulse OFF time, wire feed, and sensitivity are considered as input process parameters, with Material Removal Rate (MRR) and Surface Roughness (SR) as responses for optimization of the WEDM process. A Taguchi L18 Orthogonal Array (OA) is used for experimentation, and Grey Relational Analysis (GRA) is coupled with the Taguchi technique for multi-response optimization. ANOVA (Analysis of Variance) is used to find the impact of each process parameter individually. Finally, confirmation experiments were carried out to validate the predicted results.
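The grey relational step of such a Taguchi-GRA procedure can be sketched as follows: each response is normalized (larger-the-better for MRR, smaller-the-better for SR), grey relational coefficients are computed against the ideal sequence, and their average gives a single grade per experimental run. The response values below are illustrative, not the study's measured data:

```python
# Grey relational grade for two responses: MRR (larger-better), SR (smaller-better).
ZETA = 0.5  # distinguishing coefficient, conventionally set to 0.5

def normalize(values, larger_better):
    """Scale a response sequence to [0, 1] with 1 as the ideal value."""
    lo, hi = min(values), max(values)
    if larger_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def grey_coeff(norm):
    """Grey relational coefficients of a normalized sequence vs the ideal (=1)."""
    deltas = [1.0 - x for x in norm]
    dmin, dmax = min(deltas), max(deltas)
    return [(dmin + ZETA * dmax) / (d + ZETA * dmax) for d in deltas]

# Illustrative experimental responses for four runs, not measured values.
mrr = [1.2, 2.5, 1.8, 3.1]   # mm^3/min, larger is better
sr  = [2.8, 3.6, 2.2, 3.9]   # um Ra, smaller is better

coef_mrr = grey_coeff(normalize(mrr, True))
coef_sr  = grey_coeff(normalize(sr, False))
grades = [(a + b) / 2 for a, b in zip(coef_mrr, coef_sr)]
best = grades.index(max(grades)) + 1
print(f"grades = {[round(g, 3) for g in grades]}, best run = #{best}")
```

The run with the highest grade represents the best compromise between the two responses; equal weighting of the responses is an assumption here, as is the per-response deviation range.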

Keywords: parametric optimization, particulate reinforced metal matrix composites, Taguchi-grey relational analysis, WEDM

Procedia PDF Downloads 580
24811 Search for Flavour Changing Neutral Current Couplings of Higgs-up Sector Quarks at Future Circular Collider (FCC-eh)

Authors: I. Turk Cakir, B. Hacisahinoglu, S. Kartal, A. Yilmaz, A. Yilmaz, Z. Uysal, O. Cakir

Abstract:

In the search for new physics beyond the Standard Model, Flavour Changing Neutral Current (FCNC) processes are a good research field in terms of observability at future colliders. Increased Higgs production at higher collider energy and luminosity is essential for the verification or falsification of our knowledge of physics and its predictions, and for the search for new physics. FCC-eh is the prospective electron-proton collider constituent of the Future Circular Collider project; it offers great sensitivity due to its high luminosity and low background interference. In this work, the thq FCNC interaction vertex with off-shell top quark decay at electron-proton colliders is studied. Using the MadGraph5_aMC@NLO multi-purpose event generator, the observability of the tuh and tch couplings is obtained under an equal-coupling scenario. The upper limit on the branching ratio of the tree-level top quark FCNC decay is determined as 0.012% at the FCC-eh with 1 ab⁻¹ of integrated luminosity.

Keywords: FCC, FCNC, Higgs boson, top quark

Procedia PDF Downloads 212
24810 Modeling and Optimal Control of Pneumonia Disease with Cost Effective Strategies

Authors: Getachew Tilahun, Oluwole Makinde, David Malonza

Abstract:

We propose and analyze a non-linear mathematical model for the transmission dynamics of pneumonia in a population of varying size. The deterministic compartmental model is studied using the stability theory of differential equations. The effective reproduction number is obtained, and the local and global asymptotic stability conditions for both the disease-free and endemic equilibria are established. The model exhibits a backward bifurcation, and the sensitivity indices of the basic reproduction number to the key parameters are determined. Using Pontryagin's maximum principle, the optimal control problem is formulated with three control strategies: disease prevention through education, treatment, and screening. The cost-effectiveness analysis of the adopted control strategies revealed that the combination of prevention and treatment is the most cost-effective intervention strategy to combat the pneumonia pandemic. Numerical simulation is performed and pertinent results are displayed graphically.
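As a point of reference for compartmental models of this kind, a forward-Euler simulation of a basic SIR system is sketched below; the authors' pneumonia model has additional compartments and control terms, and the parameter values here are illustrative assumptions only:

```python
# Forward-Euler sketch of a generic SIR compartmental model. Fractions of the
# population move from susceptible (S) to infected (I) to recovered (R);
# beta and gamma are illustrative, not the authors' fitted values.
def simulate(beta=0.4, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=0.1):
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # mass-action incidence
        new_rec = gamma * i * dt      # recovery at constant per-capita rate
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

s, i, r = simulate()
print(f"final: S={s:.3f} I={i:.3f} R={r:.3f}, basic reproduction number R0={0.4/0.1:.1f}")
```

With R0 = beta/gamma > 1 the epidemic takes off and most of the population ends up in the recovered class; sweeping beta or gamma in such a simulation is the numerical analogue of the sensitivity analysis the abstract describes.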

Keywords: cost effectiveness analysis, optimal control, pneumonia dynamics, stability analysis, numerical simulation

Procedia PDF Downloads 327
24809 4G LTE Dynamic Pricing: The Drivers, Benefits, and Challenges

Authors: Ahmed Rashad Harb Riad Ismail

Abstract:

The purpose of this research is to study the potential of dynamic pricing if deployed by mobile operators and to analyse its effects from both the operator and consumer sides. The study concludes by presenting recommended conditions for successful dynamic pricing deployment, factors identifying the types of markets where dynamic pricing can be effective, and a proposal for a dynamic pricing stakeholders' framework. Currently, the mobile telecommunications industry is witnessing a dramatic growth rate in data consumption, fostered mainly by higher-speed data technology such as 4G LTE and by smart device penetration rates. However, operators' revenue from data services lags behind and is decoupled from this data consumption growth. Pricing strategy is a key factor affecting this ecosystem. Since the introduction of 4G LTE technology will multiply the pace of data growth, the revenue and usage gap will grow wider if pricing strategies remain constant, risking the sustainability of the ecosystem. Therefore, this research study focuses on dynamic pricing for 4G LTE data services, researching the drivers, benefits, and challenges of 4G LTE dynamic pricing and the feasibility of its deployment in practice from different perspectives, including those of operators, regulators, consumers, and telecommunications equipment manufacturers.

Keywords: LTE, dynamic pricing, EPC, research

Procedia PDF Downloads 332
24808 Prediction of Wind Speed by Artificial Neural Networks for Energy Application

Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui

Abstract:

In this work, the variation of wind speed with altitude is modelled using artificial neural networks. Measured wind speed and direction, temperature, and humidity at 10 m are used as input data, with the wind speed at 50 m above sea level as the target. Predicted wind speeds are compared with values extrapolated to 50 m above sea level. The results show that prediction by the method of artificial neural networks is very accurate.
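The conventional baseline for such vertical extrapolation is the power law listed in the keywords, v(z) = v_ref · (z / z_ref)^α. A minimal sketch, where the shear exponent α = 1/7 is a common open-terrain assumption rather than a value fitted in this study:

```python
# Power-law vertical extrapolation of wind speed, the baseline against which
# ANN predictions at 50 m can be compared. alpha = 1/7 is a conventional
# open-terrain assumption, not a value from this work.
def power_law(v_ref: float, z_ref: float, z: float, alpha: float = 1 / 7) -> float:
    """Extrapolate wind speed from reference height z_ref to height z."""
    return v_ref * (z / z_ref) ** alpha

v10 = 5.0  # illustrative measured speed at 10 m (m/s)
v50 = power_law(v10, 10.0, 50.0)
print(f"extrapolated speed at 50 m: {v50:.2f} m/s")
```

An ANN trained on the 10 m measurements can capture site- and stability-dependent effects that a fixed shear exponent cannot, which is the motivation for the comparison the abstract reports.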

Keywords: MATLAB, neural network, power law, vertical extrapolation, wind energy, wind speed

Procedia PDF Downloads 692
24807 A Case Study at PT Bank XYZ on The Role of Compensation, Career Development, and Employee Engagement towards Employee Performance

Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari

Abstract:

This study aims to examine, analyze, and explain the impacts of compensation, career development, and employee engagement on employee performance, both partially and simultaneously (a case study at PT Bank XYZ). The research design is quantitative descriptive causal research involving 30 respondents. Data come from primary and secondary sources: primary data were obtained from distributed questionnaires, and secondary data from journals and books. Data analysis used model testing with the SmartPLS 3 application, consisting of outer model and inner model tests. The results showed that compensation, career development, and employee engagement each have a positive partial impact on employee performance, and that together they have a positive and significant simultaneous impact. The independent variable with the greatest impact is employee engagement.

Keywords: compensation, career development, employee engagement, employee performance

Procedia PDF Downloads 152
24806 Knowledge Engineering Based Smart Healthcare Solution

Authors: Rhaed Khiati, Muhammad Hanif

Abstract:

In the past decade, smart healthcare systems have been on an ascendant drift, especially with the evolution of hospitals and their increasing reliance on bioinformatics and software specializing in healthcare. Doctors have become more reliant on technology than ever, something that in the past would have been looked down upon, as technology has become imperative in reducing overall costs and improving the quality of patient care. With patient-doctor interactions becoming more necessary and more complicated than ever, systems must be developed while taking into account costs, patient comfort, and patient data, among other things. In this work, we propose a smart hospital bed that combines the complexity and big data usage of traditional healthcare systems with the comfort of soft beds, while taking concerns such as data confidentiality, security, and the maintenance of SLAs into account. This work potentially provides users, namely patients and doctors, with seamless interaction with their respective nurses, as well as faster access to up-to-date personal data, including prescriptions and the severity of the condition, in contrast to previous research in the area, which lacks consideration of such provisions.

Keywords: big data, smart healthcare, distributed systems, bioinformatics

Procedia PDF Downloads 198
24805 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland

Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi

Abstract:

Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive health care services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications to theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from a case study and the economic properties of the healthcare industry in Finland.

Keywords: ecosystem, business model, personal data, preventive healthcare

Procedia PDF Downloads 249
24804 Surface-Enhanced Raman Detection in Chip-Based Chromatography via a Droplet Interface

Authors: Renata Gerhardt, Detlev Belder

Abstract:

Raman spectroscopy has attracted much attention as a structurally descriptive and label-free detection method. It is particularly suited for chemical analysis given that it is non-destructive and molecules can be identified via the fingerprint region of the spectra. This work investigates possibilities for integrating Raman spectroscopy as a detection method for chip-based chromatography, making use of a droplet interface. A demanding task in lab-on-a-chip applications is the specific and sensitive detection of low-concentration analytes in small volumes. Fluorescence detection is frequently utilized but is restricted to fluorescent molecules and provides no structural information. Another often-applied technique is mass spectrometry, which enables the identification of molecules based on their mass-to-charge ratio; additionally, the obtained fragmentation pattern gives insight into the chemical structure. However, it is only applicable as an end-of-the-line detection because analytes are destroyed during measurement. In contrast to mass spectrometry, Raman spectroscopy can be applied on-chip, and substances can be processed further downstream after detection. A major drawback of Raman spectroscopy is the inherent weakness of the Raman signal, which is due to the small cross-sections associated with the scattering process. Enhancement techniques, such as surface-enhanced Raman spectroscopy (SERS), are employed to overcome the poor sensitivity, even allowing detection at the single-molecule level. In SERS measurements, the Raman signal intensity is improved by several orders of magnitude if the analyte is in close proximity to nanostructured metal surfaces or nanoparticles. The main gain of lab-on-a-chip technology is the building-block-like ability to seamlessly integrate different functionalities, such as synthesis, separation, derivatization, and detection, on a single device.
We intend to utilize this powerful toolbox to realize Raman detection in chip-based chromatography. By interfacing on-chip separations with a droplet generator, the separated analytes are encapsulated into numerous discrete containers. These droplets can then be injected with a silver nanoparticle solution and investigated via Raman spectroscopy. Droplet microfluidics is a sub-discipline of microfluidics which instead of a continuous flow operates with the segmented flow. Segmented flow is created by merging two immiscible phases (usually an aqueous phase and oil) thus forming small discrete volumes of one phase in the carrier phase. The study surveys different chip designs to realize coupling of chip-based chromatography with droplet microfluidics. With regards to maintaining a sufficient flow rate for chromatographic separation and ensuring stable eluent flow over the column different flow rates of eluent and oil phase are tested. Furthermore, the detection of analytes in droplets with surface enhanced Raman spectroscopy is examined. The compartmentalization of separated compounds preserves the analytical resolution since the continuous phase restricts dispersion between the droplets. The droplets are ideal vessels for the insertion of silver colloids thus making use of the surface enhancement effect and improving the sensitivity of the detection. The long-term goal of this work is the first realization of coupling chip based chromatography with droplets microfluidics to employ surface enhanced Raman spectroscopy as means of detection.

Keywords: chip-based separation, chip LC, droplets, Raman spectroscopy, SERS

Procedia PDF Downloads 245