Search results for: data visualization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25408

24988 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia has accepted responsibility by implementing the Malaysian Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the Data User's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that the DPO will encounter and how the Personal Data Protection Department should respond to them. The findings were produced using a qualitative approach based on an examination of the current literature. This research reveals probable obstacles that DPOs will face, and thus there should be a definite, clear guideline in place to aid DPOs in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 78
24987 Design and Fabrication of an Array Microejector Driven by a Shear-Mode Piezoelectric Actuator

Authors: Chiang-Ho Cheng, Hong-Yih Cheng, An-Shik Yang, Tung-Hsun Hsu

Abstract:

This paper reports a novel actuating design that uses the shear deformation of a piezoelectric actuator to deflect a bulge diaphragm for driving an array microdroplet ejector. In essence, we employed a circular-shaped actuator poled in the radial direction, with remnant polarization normal to the actuating electric field, to induce the piezoelectric shear effect. The array microdroplet ejector consists of a shear-type piezoelectric actuator, a vibration plate, two chamber plates, two channel plates and a nozzle plate. The vibration, chamber and nozzle plate components are fabricated using nickel electroforming technology, whereas the channel plate is fabricated by etching of stainless steel. The diaphragm displacement was measured by a laser two-dimensional scanning vibrometer. The ejected droplets of the microejector were also observed via an optical visualization system.

Keywords: actuator, nozzle, microejector, piezoelectric

Procedia PDF Downloads 427
24986 Data Collection Based on the Questionnaire Survey In-Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods identified for data collection are diverse: electronic media, focus group interviews and short-answer questionnaires [1]. The collection of poor-quality data, resulting, for example, from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data, allows conclusions to be drawn that are not supported by the data, or restricts attention to the average effect of the program or policy. There are several solutions to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better "anonymity" in the responses [2]. In this context, we opted to collect good-quality data by conducting a sizeable questionnaire-based survey on hospital emergencies, in order to improve emergency services and alleviate the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 108
24985 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional Neural Network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulation of data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (like smartphones) with feeble wireless connectivity in rural/remote areas. In this paper, I highlight the federated learning approach as a way to mitigate these data privacy and security issues.
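
The core idea is easy to state in code. Below is a minimal sketch of federated averaging (FedAvg), the canonical federated learning algorithm: each client takes gradient steps on its own private data, and only the resulting model weights, never the raw records, are averaged by the server. The logistic-regression model, client data and hyperparameters are illustrative stand-ins, not the paper's setup.

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    """One gradient step of logistic regression on a client's private data."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return w - lr * X.T @ (p - y) / len(y)

rng = np.random.default_rng(0)
# Four clients (e.g., hospitals or phones), each holding its own data
clients = [(rng.normal(size=(50, 5)), rng.integers(0, 2, size=50)) for _ in range(4)]

w_global = np.zeros(5)
for _ in range(20):                                   # communication rounds
    local_ws = [local_step(w_global, X, y) for X, y in clients]
    w_global = np.mean(local_ws, axis=0)              # server averages the weights
print("global model weights:", np.round(w_global, 3))
```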

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 141
24984 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The volume of knowledge in the world, and within the repositories of organizations, is already immense and is constantly increasing over time. To accommodate this growth, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge brings transformational changes that provide tangible benefits to those implementing these practices. Today, various organizations derive knowledge from observations and intuitions, and this information or data is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed to nurture tangible insights that an organization can apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge generates value for an organization, enabling it to make decisive decisions and proceed with the adoption of best practices. Without a strong foundation of knowledge and Big Data, businesses cannot grow and improve within the competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 116
24983 Survey on Data Security Issues Through Cloud Computing Amongst SMEs in Nairobi County, Kenya

Authors: Masese Chuma Benard, Martin Onsiro Ronald

Abstract:

Businesses have been adopting cloud computing more frequently because they wish to take advantage of its benefits. However, employing cloud computing also introduces new security concerns, particularly with regard to data security, potential risks and weaknesses that could be exploited by attackers, and the various tactics and strategies that could be used to lessen these risks. This study examines data security issues in cloud computing among SMEs in Nairobi County, Kenya. The study used a sample size of 48 and a mixed-methods research approach. The findings show that the data owner has no control over the cloud vendor's data management procedures, and there is no way to ensure that data is handled legally; this implies a loss of control over the data stored in the cloud. Data and information stored in the cloud may face a range of availability issues due to internet outages, which can represent a significant risk to data kept in shared clouds. Integrity, availability, and confidentiality concerns are all identified.

Keywords: data security, cloud computing, information, information security, small and medium-sized firms (SMEs)

Procedia PDF Downloads 84
24982 Cloud Design for Storing Large Amount of Data

Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás

Abstract:

The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a solid technological backend for data analysis based on parallel processing and business intelligence. We tested hypervisors, cloud management tools, storage for all the data, and Hadoop for analysis of the unstructured data. High availability, virtual network management, logical separation of projects, and rapid deployment of physical servers into our environment were also required.
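
To make the analysis side concrete, here is a minimal Hadoop Streaming sketch of the kind of parallel job such a cluster enables. The input format (tab-separated "filename TAB tag" records describing the stored pictures), file paths and invocation are assumptions for illustration, not the authors' pipeline.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming job: count picture-metadata tags in parallel.
Assumed invocation:  hadoop jar hadoop-streaming.jar \
    -mapper 'tags.py map' -reducer 'tags.py reduce' \
    -input /data/picture_tags -output /data/tag_counts
Local test:  python tags.py map < input.tsv | sort | python tags.py reduce
"""
import sys
from itertools import groupby

def mapper():
    for line in sys.stdin:                     # one "filename<TAB>tag" per line
        parts = line.rstrip("\n").split("\t")
        if len(parts) == 2:
            print(f"{parts[1]}\t1")            # emit (tag, 1)

def reducer():
    rows = (l.rstrip("\n").split("\t") for l in sys.stdin)
    # Hadoop sorts mapper output by key before the reduce phase
    for tag, group in groupby(rows, key=lambda r: r[0]):
        print(f"{tag}\t{sum(int(count) for _, count in group)}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```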

Keywords: cloud, GlusterFS, Hadoop, Juju, KVM, MAAS, OpenStack, virtualization

Procedia PDF Downloads 353
24981 Environmental and Toxicological Impacts of Glyphosate with Its Formulating Adjuvant

Authors: I. Székács, Á. Fejes, S. Klátyik, E. Takács, D. Patkó, J. Pomóthy, M. Mörtl, R. Horváth, E. Madarász, B. Darvas, A. Székács

Abstract:

Environmental and toxicological characteristics of formulated pesticides may substantially differ from those of their active ingredients or other components alone. This phenomenon is demonstrated in the case of the herbicide active ingredient glyphosate. Due to its extensive application, this active ingredient was found in surface and ground water samples collected in Békés County, Hungary, in the concentration range of 0.54-0.98 ng/ml. The occurrence of glyphosate appeared to be somewhat higher in areas under intensive agriculture, industrial activities and public road services, but the compound was also detected in areas under organic (ecological) farming or natural grasslands, indicating environmental mobility. Increased toxicity of the formulated herbicide product Roundup, compared to that of glyphosate, was observed on the indicator aquatic organism Daphnia magna Straus. Acute LC50 values of Roundup and its formulating adjuvant polyethoxylated tallowamine (POEA) exceeded 20 and 3.1 mg/ml, respectively, while the acute toxicity of glyphosate (as its isopropylamine salt) was found to be substantially lower (LC50 of 690-900 mg/ml), in good agreement with literature data. Cytotoxicity of Roundup, POEA and glyphosate was determined on the neuroectodermal cell line NE-4C, measured both by a cell viability test and by holographic microscopy. Acute toxicity (LC50) of Roundup, POEA and glyphosate on NE-4C cells was found to be 0.013±0.002%, 0.017±0.009% and 6.46±2.25%, respectively (in equivalents of diluted Roundup solution), corresponding to 0.022±0.003 and 53.1±18.5 mg/ml for POEA and glyphosate, respectively, indicating no statistical difference between Roundup and POEA and a 2.5 orders of magnitude difference between these and glyphosate. The same order of cellular toxicity was seen in average cell area under quantitative cell visualization. The results indicate that the toxicity of the formulated herbicide is caused mainly by the formulating agent, although in some parameters toxicological synergy occurs between POEA and glyphosate.

Keywords: glyphosate, polyethoxylated tallowamine, Roundup, combined aquatic and cellular toxicity, synergy

Procedia PDF Downloads 318
24980 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in a covariate, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since such data are autocorrelated. Hence there is a need to develop a method that estimates the missing values of both the response variable and the covariates in spatial data while taking account of the spatial autocorrelation. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlations that exist among the covariates and between the response variable and the covariates. Precise estimation of the missing data points in spatial data will increase the precision of the estimated effects of the independent variables on the response variable in spatial regression analysis.
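
The quantity the proposed model must respect is spatial autocorrelation, conventionally measured by Moran's I. The sketch below computes it with inverse-distance weights; the locations and values are illustrative stand-ins, not the study's data.

```python
import numpy as np

def morans_i(values, coords):
    """Moran's I with inverse-distance spatial weights (self-weights zero)."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    W = np.where(d > 0, 1.0 / d, 0.0)            # w_ij = 1/d_ij, w_ii = 0
    z = values - values.mean()                   # deviations from the mean
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(30, 2))        # 30 aggregate-level units
values = coords[:, 0] + rng.normal(scale=0.5, size=30)  # west-east trend
print("Moran's I:", round(morans_i(values, coords), 3))  # > 0: spatial clustering
```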

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 382
24979 Micro-Oscillator: Passive Production and Manipulation of Microdrops

Authors: Khelfaoui Rachid, Chekifi Tawfiq, Dennai Brahim, Maazouzi A. Hak

Abstract:

Numerical and experimental studies of passive microdrop production are presented. This paper focuses on the modeling of micro-oscillator systems, which are composed of a passive amplifier with no moving parts. The microsystem modeling is based on the geometrical form of the oscillators. An asymmetric micro-oscillator design based on a bistable fluidic amplifier is proposed. The characteristic size of the channels is generally about 35 microns in depth. The numerical results indicate that the production and manipulation of microdrops are possible with a passive device within a typical oscillator chamber of 2.25 mm diameter and 0.20 mm length when the Reynolds number is Re = 490. The novel microdrop method presented in this study provides a simple solution to the problem of microdrop production in microsystems. An experimental phase was also undertaken: the first part was the realisation of a sample oscillator; the second part consisted of the visualization, production and manipulation of microdrops.

Keywords: modelling, miscible, micro drops, production, oscillator sample, capillary

Procedia PDF Downloads 378
24978 Association Rules Mining and Document-Oriented NoSQL in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to recent technologies for manipulating voluminous and unstructured data sets drawn from multiple sources, and NoSQL has emerged to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The algorithm for finding association dependencies lends itself well to MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce together with a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
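
For reference, here is a minimal single-machine sketch of the Apriori frequent-itemset step that the paper distributes with MapReduce over a MongoDB collection; the baskets and support threshold are illustrative, not the paper's benchmark data.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return frequent itemsets (as frozensets) mapped to their support."""
    n = len(transactions)
    k_sets = {frozenset([i]) for t in transactions for i in t}  # 1-itemsets
    frequent = {}
    while k_sets:
        counts = {c: sum(c <= t for t in transactions) for c in k_sets}
        survivors = {c for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update({c: counts[c] / n for c in survivors})
        # Candidate generation: join surviving k-itemsets into (k+1)-itemsets
        k_sets = ({a | b for a, b in combinations(survivors, 2)
                   if len(a | b) == len(next(iter(survivors))) + 1}
                  if survivors else set())
    return frequent

baskets = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter", "bread"}]
print(apriori([frozenset(b) for b in baskets], min_support=0.6))
```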

Keywords: Apriori, association rules mining, big data, data mining, Hadoop, MapReduce, MongoDB, NoSQL

Procedia PDF Downloads 161
24977 Immunization Data Quality in Public Health Facilities in Pastoralist Communities: A Comparative Study with Evidence from Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in the pastoralist communities of Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019 and the endline data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect data. A significant improvement was seen in the accuracy of the pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of the PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at the HP (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HP and < 10% at the HC for PT3. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their respective immunization data on time, which is much better than the baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting the improvement of immunization data quality in pastoralist communities.
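
For readers unfamiliar with the accuracy metric listed in the keywords, the sketch below shows the usual verification-factor computation, which compares doses recounted from facility registers with the doses reported upward. The figures are illustrative, not study data, and the interpretation convention is an assumption based on standard WHO DQS usage.

```python
# Verification factor as used in WHO DQS-style accuracy checks (illustrative).
def verification_factor(recounted: int, reported: int) -> float:
    """Recounted doses from registers as a percentage of reported doses.
    Below 100% suggests over-reporting; above 100%, under-reporting."""
    return 100.0 * recounted / reported

print(verification_factor(recounted=460, reported=500))  # 92.0 -> over-reporting
```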

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 123
24976 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision making with the data they have at their disposal, so the quality of these data is remarkably important. Data Quality (DQ) is currently a relevant issue, and the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts, who ordered them according to their degree of importance using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.

Keywords: critical success factors, data quality, data quality management, Delphi, Q-Sort

Procedia PDF Downloads 217
24975 '3D City Model' through Quantum Geographic Information System: A Case Study of Gujarat International Finance Tec-City, Gujarat, India

Authors: Rahul Jain, Pradhir Parmar, Dhruvesh Patel

Abstract:

Planning and drawing are important aspects of civil engineering. Computer-based urban models are used to test theories about spatial location and the interaction between land uses and related activities. The planner's primary interest is the creation of 3D models of buildings and of the terrain surface, enabling urban morphological mapping, virtual reality, disaster management, fly-through generation, visualization, etc. 3D city models have a variety of applications in urban studies. Gujarat International Finance Tec-City (GIFT) is an ongoing construction site between Ahmedabad and Gandhinagar, Gujarat, India. It will be built on 3,590,000 m² between latitudes 23°9'5''N and 23°10'55''N and longitudes 72°42'2''E and 72°42'16''E. To develop 3D city models of GIFT City, the base map of the city was collected from the GIFT office. A Differential Geographical Positioning System (DGPS) was used to collect Ground Control Points (GCPs) in the field. The GCPs were used for the registration of the base map in QGIS. The registered map was projected onto the WGS 84/UTM zone 43N grid and digitized with the help of various shapefile tools in QGIS. The approximate heights of the buildings to be built were collected from the GIFT office and placed in the attribute table of each layer created using the shapefile tools. Shuttle Radar Topography Mission (SRTM) 1 Arc-Second Global (30 m x 30 m) grid data were used to generate the terrain of GIFT City. A Google satellite map was placed in the background to fix the exact location of GIFT City. Various plugins and tools in QGIS were used to convert the raster layer of the base map of GIFT City into a 3D model. The fly-through tool was used for capturing and viewing the entire area of the city in 3D. This paper discusses all of these techniques and their usefulness in 3D city model creation from the GCPs, base map, SRTM data and QGIS.
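
A minimal sketch of the loading and projection step follows, intended for the QGIS Python console. The file name and the "height" attribute field are assumptions for illustration; EPSG:32643 is the code for the WGS 84/UTM zone 43N grid named in the abstract.

```python
# Run inside the QGIS Python console (file and field names assumed).
from qgis.core import (QgsVectorLayer, QgsProject,
                       QgsCoordinateReferenceSystem)

buildings = QgsVectorLayer("gift_buildings.shp", "buildings", "ogr")
if not buildings.isValid():
    raise RuntimeError("shapefile failed to load")

# Project onto WGS 84 / UTM zone 43N, as used for the GIFT City base map
QgsProject.instance().setCrs(QgsCoordinateReferenceSystem("EPSG:32643"))
QgsProject.instance().addMapLayer(buildings)

# Inspect the per-building height attribute used later for 3D extrusion
for feature in buildings.getFeatures():
    print(feature["height"])
```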

Keywords: 3D model, DGPS, GIFT City, QGIS, SRTM

Procedia PDF Downloads 246
24974 Modernization of the Economic Price Adjustment Software

Authors: Roger L. Goodwin

Abstract:

The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy. Many social programs and government benefits are indexed to the CPIs. In the mid-to-late 1990s, much research went into changes to the CPI by a Congressional Advisory Committee. One thing that can be said from the research is that, aside from the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month are different, nor does it provide automated error checking. The small visual software product provides the additional flexibility and error checking. This paper presents feedback on the project.
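
As context, a minimal sketch of the standard escalation arithmetic such a tool automates is shown below: the contract price is scaled by the ratio of the index at the option month to the index at the base month. The exact contractual form of the clause and the index values here are assumptions for illustration, not actual CPI figures.

```python
def adjusted_price(base_price: float, base_index: float,
                   option_index: float) -> float:
    """Escalate a contract price by the ratio of the price index at the
    option month to the index at the base month (base and option months
    may now differ, the flexibility the modernized tool adds)."""
    if base_index <= 0:
        raise ValueError("base index must be positive")
    return base_price * (option_index / base_index)

print(adjusted_price(base_price=100_000, base_index=251.1, option_index=258.8))
```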

Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures

Procedia PDF Downloads 317
24973 Data Mining in the Medicine Domain Using Decision Trees and Support Vector Machines

Authors: Djamila Benhaddouche, Abdelkader Benyettou

Abstract:

In this paper, we used data mining to extract biomedical knowledge. In general, the complex biomedical data collected in population studies are treated with statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. For that purpose, we used, in a second step, two learning algorithms: Decision Trees and the Support Vector Machine (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
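
A minimal sketch of the two supervised classifiers follows. Since the thyroid dataset is not distributed with the abstract, a public biomedical dataset bundled with scikit-learn stands in, and the hyperparameters are library defaults rather than the authors' settings.

```python
from sklearn.datasets import load_breast_cancer   # stand-in biomedical data
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit and compare the two supervised classifiers used in the study
for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("SVM", SVC(kernel="rbf", gamma="scale"))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(clf.score(X_te, y_te), 3))
```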

Keywords: biomedical data, learning, classifier, algorithms decision tree, knowledge extraction

Procedia PDF Downloads 559
24972 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease

Authors: Usama Ahmed

Abstract:

Data mining is the process of analyzing data in order to extract helpful, predictive information, and it is a field of research that addresses various types of problems. Within data mining, classification is an important technique for categorizing data. Diabetes is a very common disease. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and determines which algorithm works best. The best classification algorithm for the diabetic data is Naïve Bayes, with an accuracy of 76.31% and a model build time of 0.06 seconds.
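
For readers outside the WEKA ecosystem, the winning classifier is easy to reproduce in scikit-learn. The sketch below assumes a Pima-style CSV file ("diabetes.csv" with an "Outcome" label column); the file name and layout are assumptions, not the paper's exact dataset.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Assumed input: Pima-style CSV with numeric features and an "Outcome" label
df = pd.read_csv("diabetes.csv")
X, y = df.drop(columns="Outcome"), df["Outcome"]

nb = GaussianNB()
scores = cross_val_score(nb, X, y, cv=10)   # 10-fold CV, WEKA's default
print("mean accuracy:", round(scores.mean() * 100, 2), "%")
```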

Keywords: data mining, classification, diabetes, WEKA

Procedia PDF Downloads 147
24971 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is totally dependent on technology that uses data as its fuel. The present study covers innovations and developments in data science and gives an idea of how to use the data provided efficiently. This study will help readers understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing, the main principle being to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, data analysis, ethical concerns, understanding of programming languages, the various fields and sources of data, skills, etc. The usage of data science has evolved over the years. In this review article, we cover one part of data science, namely machine learning. Machine learning builds on data science: machines learn from experience, which helps them do their work more efficiently. This article includes a comparative illustration of human understanding versus machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the lives of human beings. Since the advent of data science, we have seen its benefits, how it leads to a better understanding of people, and how it addresses individual needs. It has improved business strategies and the services they provide, forecasting, the ability to attain sustainable development, etc. This study also aims at a better understanding of data science, which will help us create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 82
24970 Recent Advancement in Dendrimer Based Nanotechnology for the Treatment of Brain Tumor

Authors: Nitin Dwivedi, Jigna Shah

Abstract:

Brain tumors are metastatic neoplasms of the central nervous system; in most cases, they are life-threatening, with a low survival rate. Despite enormous efforts in the development of therapeutics and diagnostic tools, the treatment of brain tumors and gliomas remains a considerable challenge in neuro-oncology. The main reason is the presence of physiological barriers, including the blood brain barrier and the blood brain tumor barrier, which lead to insufficient reachability of therapeutic agents at the tumor site and, as a result, inadequate destruction of gliomas. There is thus a real need to empower brain tumor imaging for better characterization and delineation of tumors, visualization of malignant tissue during surgery, and tracking of the response to chemotherapy and radiotherapy. Multifunctional dendrimers of different generations offer improved potential for drug delivery to the site of brain tumors and gliomas. This article therefore emphasizes innovative dendrimer approaches to tumor targeting, tumor imaging and the delivery of therapeutic agents.

Keywords: blood brain barrier, dendrimer, gliomas, nanotechnology

Procedia PDF Downloads 561
24969 Design and Fabrication of Micro-Bubble Oxygenator

Authors: Chiang-Ho Cheng, An-Shik Yang, Hong-Yih Cheng

Abstract:

This paper applies MEMS technology to design and fabricate a micro-bubble generator driven by a piezoelectric actuator. Coupled with a nickel nozzle plate, an annular piezoelectric ceramic was utilized as the primary structure of the generator. In operation, the piezoelectric element deforms transversely under an electric field applied across the thickness of the generator. The surface of the nozzle plate can expand or contract because of the induced radial strain, causing the whole structure to bend and successively transport oxygen micro-bubbles into the blood flow, enhancing the oxygen content of the blood. In the tests, a high-magnification microscope and a high-speed CCD camera were employed to photograph the time evolution of the meniscus shape of the gaseous bubbles dispensed from the micro-bubble generator for flow visualization. This investigation thus explored the bubble formation process, including the influences of inlet gas pressure, driving voltage and resonance frequency on the extent of the formed bubbles.

Keywords: micro-bubble, oxygenator, nozzle, piezoelectric

Procedia PDF Downloads 319
24968 Study on Discontinuity Properties of Phased-Array Ultrasound Transducers Affecting the Sound Pressure Field Pattern

Authors: Tran Trong Thang, Nguyen Phan Kien, Trinh Quang Duc

Abstract:

Phased-array ultrasound transducers are utilized in medical ultrasonography as well as in optical imaging. However, their discontinuity characteristic limits their applications because of the artifacts it introduces into reconstructed images. Given the effect of the ultrasound pressure field pattern on the echo ultrasonic waves, as well as on the optically modulated signal, the side lobes of the focused ultrasound beam induced by the discontinuity of the phased-array ultrasound transducer may be the cause of the artifacts. In this paper, a simple numerical simulation approach was used to investigate the limitations imposed by the discontinuity of the elements in a phased-array ultrasound transducer and their effects on the ultrasound pressure field. By taking into account the change in the ultrasound pressure field pattern as the pitch between elements of the phased-array ultrasound transducer is varied, appropriate parameters for phased-array ultrasound transducer design were asserted quantitatively.
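
A minimal sketch of this kind of simulation follows: each array element is modeled as a point source with a focusing time delay, and the complex pressures are summed over a grid (a Huygens-type approximation). All parameters (frequency, pitch, element count, focal point) are illustrative assumptions, not the paper's values.

```python
import numpy as np

c, f = 1540.0, 5e6                    # sound speed in tissue (m/s), drive frequency (Hz)
k = 2 * np.pi * f / c                 # wavenumber
N, pitch = 32, 300e-6                 # element count and pitch (the varied parameter)
focus_x, focus_z = 0.0, 30e-3         # focal point (m)

x_el = (np.arange(N) - (N - 1) / 2) * pitch          # element x positions
delays = np.hypot(x_el - focus_x, focus_z) / c       # time of flight to focus
delays = delays.max() - delays                       # fire far elements first

# Sum point-source contributions exp(i(kr + w*t_d))/r over an x-z grid
xs = np.linspace(-20e-3, 20e-3, 201)
zs = np.linspace(5e-3, 60e-3, 201)
X, Z = np.meshgrid(xs, zs)
p = np.zeros_like(X, dtype=complex)
for xe, td in zip(x_el, delays):
    r = np.hypot(X - xe, Z)
    p += np.exp(1j * (k * r + 2 * np.pi * f * td)) / r

amp = np.abs(p)                       # inspect main lobe vs side lobes here
iz, ix = np.unravel_index(amp.argmax(), amp.shape)
print(f"peak pressure at x = {xs[ix]*1e3:.1f} mm, z = {zs[iz]*1e3:.1f} mm")
```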

Keywords: phased-array ultrasound transducer, sound pressure pattern, discontinuous sound field, numerical visualization

Procedia PDF Downloads 506
24967 Application of Artificial Neural Network Technique for Diagnosing Asthma

Authors: Azadeh Bashiri

Abstract:

Introduction: Lack of proper diagnosis and inadequate treatment of asthma lead to physical and financial complications. This study aimed to use data mining techniques to create an intelligent neural network system for the diagnosis of asthma. Methods: The study population consists of patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square coefficient was the basis of decision making for ranking the data. The neural network considered is trained using the backpropagation learning technique. Results: According to the analysis performed by means of SPSS to select the top factors, 13 effective factors were selected. The data were mixed in various forms, so different models were built for training the data and testing the networks, and in all of the different modes the network was able to predict all cases correctly (100%). Conclusion: Using data mining methods before designing the system structure, in order to reduce the data dimensions and choose the optimal data, leads to a more accurate system. Therefore, considering data mining approaches is necessary, given the nature of medical data.
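
The pipeline described (chi-square feature ranking followed by a backpropagation-trained network) can be sketched compactly in scikit-learn. The data below are random non-negative stand-ins (chi-square selection requires non-negative features), not the Tehran clinic data, and the network size is an assumption.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Stand-in data: 200 patients, 30 non-negative clinical features
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(200, 30)).astype(float)
y = rng.integers(0, 2, size=200)          # 1 = asthma, 0 = no asthma

# chi-square ranking keeps 13 factors (as in the study), then a
# backpropagation-trained multilayer perceptron classifies
model = make_pipeline(
    SelectKBest(chi2, k=13),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```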

Keywords: asthma, data mining, Artificial Neural Network, intelligent system

Procedia PDF Downloads 273
24966 Dambreak Flood Analysis Using HEC-RAS and GIS Technologies

Authors: Oussama Derdous, Lakhdar Djemili, Hamza Bouchehed

Abstract:

The potential risks associated with dam break flooding are considerable and can result in major damage, including loss of life and property destruction. Algeria has experienced such flood disasters in the past: the failure of the Fergoug dam in 1881 cost 200 lives, and many houses and bridges were destroyed by the flooding. Recently, the Algerian government has obligated dam owners to develop detailed dam break Emergency Action Plans for its 64 major dams. The research presented here was conducted within this framework, with the Zardezas dam, located in the city of Skikda in the northeast of Algeria, as the case study. The HEC-RAS model was used for the hydrodynamic routing of the dam break flood wave. In addition, a Geographic Information System (GIS) was used to create inundation maps and produce a visualization of the flood propagation in the Saf-Saf River. The simulation results demonstrate the significance of Zardezas dam break flooding and constitute a real tool for developing emergency response plans and assisting territorial communities in land use planning.

Keywords: dam break, HEC-RAS, GIS, inundation maps, Emergency Action Plan

Procedia PDF Downloads 395
24965 Design of an Electric Vehicle Model with a Dynamo Drive Setup Using Model-Based Development (MBD) (EV Using MBD)

Authors: Gondu Vykunta Rao, Madhuri Bayya, Aruna Bharathi M., Paramesw Chidamparam, B. Murali

Abstract:

The increasing software content of today's electric vehicles is drawing attention to a vast range of distinctive capabilities, from low emissions to high efficiency, whereas chemical batteries have major shortcomings, such as limited cycle life, power density, and cost. For understanding and visualization, companies are turning toward the virtual vehicle, testing their designs in software, an approach known as simulation in the loop (SIL). In this project, in addition to the electric vehicle (EV) technology, we add a dynamo to the vehicle for regenerative braking. Traditionally, the dynamo principle is used for bicycle lighting. Here, using the same mechanism, we both run the vehicle and charge it, progressing from system-level simulation to model in the loop (MIL) and then to hardware in the loop (HIL) using model-based development.
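
To illustrate the kind of relation an MIL/SIL plant model encodes before moving to HIL, here is a minimal sketch of the energy a dynamo-style generator could recover during one braking event. All parameters (mass, speeds, efficiencies) are illustrative assumptions, not the project's values.

```python
# Back-of-the-envelope regenerative braking model (assumed parameters).
m = 1200.0            # vehicle mass, kg
v0, v1 = 20.0, 5.0    # speed before/after braking, m/s
eta_gen = 0.6         # assumed dynamo/generator conversion efficiency
regen_fraction = 0.5  # share of braking effort handled by the generator

kinetic_drop = 0.5 * m * (v0**2 - v1**2)          # J dissipated in braking
recovered = eta_gen * regen_fraction * kinetic_drop
print(f"energy returned to the battery: {recovered / 3600:.1f} Wh")
```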

Keywords: electric vehicle, simulation in the loop (SIL), model in loop (MIL), hardware in loop (HIL), dynamos, model-based development (MBD), permanent magnet synchronous motor (PMSM), current control (CC), field-oriented control (FOC), regenerative braking

Procedia PDF Downloads 122
24964 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With the growth of Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective negates the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy and human social relations, and the furtherance of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 195
24963 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. A dataset of 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula (Gaussian) were used to generate the synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models using the PyCaret package. For the CTGAN data, the AdaBoost Classifier (ADA) was found to be the ML model with the best fit, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy against the real data. ROC-AUC analysis was performed for all ten classes of the target variable (grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, while the LR model with the Gaussian data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
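
The synthetic-data step can be sketched with the open-source ctgan package. The column names and values below are hypothetical stand-ins for the 140-row course dataset, not the study's actual features or grades.

```python
import pandas as pd
from ctgan import CTGAN   # pip install ctgan

# Hypothetical stand-in for the 140-row course dataset (column names assumed)
real = pd.DataFrame({
    "hw_avg": [78, 85, 62, 91, 70] * 28,
    "exam1":  [66, 80, 55, 88, 72] * 28,
    "grade":  ["B", "A", "C", "A", "B-"] * 28,
})

# Fit the tabular GAN on the small real dataset, then sample a larger
# synthetic set to train downstream classifiers on
model = CTGAN(epochs=300)
model.fit(real, discrete_columns=["grade"])
synthetic = model.sample(1000)
print(synthetic.head())
```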

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, Gaussian Copula CTGAN

Procedia PDF Downloads 44
24962 Political Communication in Twitter Interactions between Government, News Media and Citizens in Mexico

Authors: Jorge Cortés, Alejandra Martínez, Carlos Pérez, Anaid Simón

Abstract:

The presence of government, news media, and the general citizenry on social media makes it possible to consider interactions between them as a form of political communication (i.e., the public exchange of contradictory discourses about politics). Twitter's asymmetrical following model (users can follow, mention or reply to other users who do not follow them) could foster alternative democratic practices and have an impact on Mexican political culture, which has been marked by a lack of direct communication channels between these actors. The aim of this research is to assess Twitter's role in political communication practices through the analysis of the interaction dynamics between government, news media, and citizens, by extracting and visualizing data from Twitter's API to observe general behavior patterns. The hypothesis is that, although Twitter's features enable direct and horizontal interactions between actors, users repeat traditional dynamics of interaction without taking full advantage of the possibilities of this medium. An interdisciplinary team covering Communication Strategies, Information Design, and Interaction Systems analysed the activity on Twitter generated by the controversy over the presence of Uber in Mexico City, an issue of public interest involving aspects such as public opinion, economic interests and a legal dimension. This research includes techniques from social network analysis (SNA), a methodological approach focused on understanding the relationships between actors through the visual representation and measurement of network characteristics. The analysis of the Uber event comprised data extraction, data categorization, corpus construction, corpus visualization and analysis. In the retrieval stage, TAGS, a Google Sheets template, was used to extract tweets that included the hashtags #UberSeQueda and #UberSeVa, posts containing the string Uber, and tweets directed to @uber_mx. Using scripts written in Python, the data were filtered, discarding tweets with no interaction (replies, retweets or mentions) and locations outside of Mexico. Considerations regarding bots and the omission of anecdotal posts were also taken into account. The utility of graphs for observing interactions of political communication in general was confirmed by the analysis of visualizations generated with programs such as Gephi and NodeXL. However, some aspects require improvement to obtain visual representations that are more useful for this type of research. For example, link crossings complicate following the direction of an interaction, forcing users to manipulate the graph to see it clearly. It was concluded that some practices prevalent in political communication in Mexico are replicated on Twitter. Media actors tend to group together instead of interacting with others. The political system tends to tweet as an advertising strategy rather than to generate dialogue. However, some actors were identified as bridges establishing communication between the three spheres, generating a more democratic exercise and taking advantage of Twitter's possibilities. Although interactions on Twitter could become an alternative form of political communication, this potential depends on the intentions of the participants and the extent to which they aim for collaborative and direct communication. Further research is needed to gain a deeper understanding of the political behavior of Twitter users and the possibilities of SNA for its analysis.
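
A minimal sketch of the Python filtering step follows: drop tweets with no interaction, keep accounts located in Mexico, and emit an edge list for Gephi. The CSV column names are assumptions about a typical TAGS export, not the authors' exact schema.

```python
import pandas as pd

tweets = pd.read_csv("tags_export.csv")   # archive exported from TAGS (assumed)

# Keep only tweets that interact: retweets, replies or mentions
has_interaction = (
    tweets["text"].str.startswith("RT @", na=False)
    | tweets["in_reply_to_screen_name"].notna()
    | tweets["text"].str.contains(r"@\w+", regex=True, na=False)
)
# Keep only accounts whose profile location points to Mexico
in_mexico = tweets["user_location"].str.contains(
    "mexico|méxico|cdmx", case=False, na=False)

filtered = tweets[has_interaction & in_mexico]

# Edge list for Gephi/NodeXL: author -> first account mentioned in the tweet
edges = filtered.assign(target=filtered["text"].str.extract(r"@(\w+)")[0])
edges = edges.dropna(subset=["target"])[["from_user", "target"]]
edges.to_csv("uber_edges.csv", index=False)
```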

Keywords: interaction, political communication, social network analysis, Twitter

Procedia PDF Downloads 221
24961 Measuring Text-Based Semantics Relatedness Using WordNet

Authors: Madiha Khan, Sidrah Ramzan, Seemab Khan, Shahzad Hassan, Kamran Saeed

Abstract:

Measuring semantic similarity between texts means calculating their semantic relatedness using various techniques. Our web application (Measuring Relatedness of Concepts, MRC) allows the user to input two text corpora and obtain a semantic similarity percentage between them using WordNet. Our application goes through five stages to compute semantic relatedness: Preprocessing (extracting keywords from the content), Feature Extraction (classifying words into parts of speech), Synonym Extraction (retrieving synonyms for each keyword), Similarity Measurement (measuring similarity using the keywords and synonyms) and Visualization (graphical representation of the similarity measure). The user can thus measure similarity on the basis of features as well. The end result is a percentage score together with the word(s) that form the basis of the similarity between the two texts, obtained with different tools on the same platform. In future work, we envisage a web-as-a-live-corpus application that provides a simpler, more user-friendly tool to compare documents and extract useful information.
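
The synonym-extraction and similarity-measurement stages map directly onto NLTK's WordNet interface. The sketch below uses path-based similarity; the choice of measure and the aggregation over synset pairs are illustrative assumptions, not necessarily the application's exact method.

```python
from nltk.corpus import wordnet as wn   # requires nltk.download("wordnet")

def word_similarity(w1: str, w2: str) -> float:
    """Best path-based similarity over all synset pairs of the two words."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def synonyms(word: str) -> set:
    """Synonym extraction: all lemma names across the word's synsets."""
    return {lemma for s in wn.synsets(word) for lemma in s.lemma_names()}

print(word_similarity("car", "automobile"))   # 1.0: they share a synset
print(word_similarity("car", "banana"))       # much lower
print(sorted(synonyms("car"))[:5])
```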

Keywords: Graphviz representation, semantic relatedness, similarity measurement, WordNet similarity

Procedia PDF Downloads 237
24960 Data Access, AI Intensity, and Scale Advantages

Authors: Chuping Lo

Abstract:

This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries, which inherently have greater data resources, tend to have higher incomes than smaller countries, and the former may consequently be more hesitant than the latter to liberalize cross-border data flows, in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in their production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.

Keywords: digital intensity, digital divide, international trade, economies of scale

Procedia PDF Downloads 68
24959 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in the image for data embedding after encryption. In this paper, we propose a novel method that reserves a bounded space in the image before encryption using a traditional reversible data hiding (RDH) algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency tenfold compared to the other processes discussed.
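
The embedding primitive underlying this pipeline is least-significant-bit (LSB) substitution, sketched below on an 8-bit grayscale image. Plain LSB overwriting destroys the original bits, which is exactly why the proposed method reserves room before encryption; this sketch only illustrates the bit-level mechanics, not the full RDH scheme.

```python
import numpy as np

def embed(pixels: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide a byte string in the LSBs of the (reserved) region of an image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = pixels.ravel().copy()
    assert bits.size <= flat.size, "reserved region too small for payload"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(pixels.shape)

def extract(pixels: np.ndarray, n_bytes: int) -> bytes:
    """Read the payload back out of the LSBs."""
    bits = pixels.ravel()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
stego = embed(img, b"secret")
assert extract(stego, 6) == b"secret"
print("payload recovered without error")
```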

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 412