Search results for: data protection

25467 Association of Social Data as a Tool to Support Government Decision Making

Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias

Abstract:

Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which attributes are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in northern Brazil, with a territory of 277,000 km2 comprising 139 counties) were evaluated. This work aims to detect factors that are determinant for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database and thus enabling better monitoring and updating of policies for this purpose.
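
A minimal sketch of how association rules could be mined from one-hot-encoded municipal social indicators, in the spirit of the approach described above. The column names, thresholds, and toy data are hypothetical; the study used real Brazilian government databases.

```python
# Association-rule mining over municipal social indicators (illustrative only).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# One-hot encoded indicators per county (toy data, hypothetical column names).
counties = pd.DataFrame({
    "low_school_attendance": [1, 1, 0, 1, 0, 1],
    "low_family_income":     [1, 1, 0, 1, 1, 1],
    "rural_area":            [1, 0, 0, 1, 0, 1],
    "child_labor_reported":  [1, 1, 0, 1, 0, 1],
}).astype(bool)

# Frequent itemsets, then rules whose consequents can be inspected for child labor.
itemsets = apriori(counties, min_support=0.3, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```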

Keywords: social data, government decision making, association of social data, data mining

Procedia PDF Downloads 365
25466 Outlier Detection in Stock Market Data Using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. In this work, the methods described in prior work are used to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximal Overlap Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
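
A minimal sketch of the Tukey (interquartile-range fence) rule for flagging outliers in a closing-price series and replacing them by interpolation. The MODWT step of the paper is not reproduced here, the fence multiplier 1.5 is the conventional default, and the price series is toy data.

```python
# Tukey-fence outlier detection and simple imputation for a price series.
import pandas as pd

def tukey_outliers(prices: pd.Series, k: float = 1.5) -> pd.Series:
    """Return a boolean mask of values outside the Tukey fences."""
    q1, q3 = prices.quantile(0.25), prices.quantile(0.75)
    iqr = q3 - q1
    return (prices < q1 - k * iqr) | (prices > q3 + k * iqr)

closes = pd.Series([1.02, 1.05, 1.04, 9.80, 1.06, 1.07, 0.10, 1.08])  # toy data
mask = tukey_outliers(closes)
imputed = closes.mask(mask).interpolate(limit_direction="both")  # replace outliers
print(imputed)
```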

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 77
25465 The Concept of Anchor Hazard Potential Map

Authors: Sao-Jeng Chao, Chia-Yun Wei, Si-Han Lai, Cheng-Yu Huang, Yu-Han Teng

Abstract:

In Taiwan, the landforms are dominated by mountains and hills, and many sections of the National Highway cannot avoid problems such as slope excavation or slope filling. In order to increase the safety of the slopes, various slope protection methods are used to stabilize them, the soil anchor technique being the most common. This study is inspired by the soil liquefaction potential map. The concept of a potential map is widely used: typhoons, earth-rock flows, tsunamis, flooded areas, and the recently discussed soil liquefaction all have associated safety potential concepts. This paper brings the concept of safety potential to anchored slopes. Because soil anchor inspection yields only point-wise information, this study extends the concept from points to a surface, using the Quantum GIS (QGIS) program to present the slope damage area and depicting the slope appearance and soil anchor points on the slope as-built drawing. Soil anchor scores are obtained from anchor inspection data, and low, medium, and high potential areas are delineated by interpolation. Thus, the areas where an anchored slope may be hazardous are identified and relevant maintenance can be provided, allowing maintenance units to make preventive judgments and deal with the anchored slope as soon as possible.
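
A minimal sketch of turning point-wise anchor inspection scores into a continuous potential surface by interpolation and binning it into low/medium/high classes. The coordinates, scores, and class breaks are hypothetical; the study performed this step in QGIS against the as-built slope drawings.

```python
# Interpolate point-based anchor scores to a surface and classify the result.
import numpy as np
from scipy.interpolate import griddata

# (x, y) positions of inspected anchors on the slope face and their scores.
pts = np.array([[0, 0], [10, 0], [0, 8], [10, 8], [5, 4]], dtype=float)
scores = np.array([0.2, 0.7, 0.4, 0.9, 0.6])

# Regular grid covering the slope face.
gx, gy = np.mgrid[0:10:50j, 0:8:40j]
surface = griddata(pts, scores, (gx, gy), method="linear")

# Classify into three potential classes (0 = low, 1 = medium, 2 = high).
classes = np.digitize(surface, bins=[0.33, 0.66])
valid = ~np.isnan(surface)
print(np.unique(classes[valid], return_counts=True))
```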

Keywords: anchor, slope, potential map, lift-off test, existing load

Procedia PDF Downloads 134
25464 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage

Authors: P. Jayashree, S. Rajkumar

Abstract:

With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios than lossless methods, many applications require data accuracy and fine details to be preserved. A variety of data compression algorithms exists in the literature for different forms of data such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is presented as an enhancement over the irrational number storage coding technique to address the storage issues of increasing data volumes as a cost-effective solution, which also offers a degree of data security as a secondary outcome. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.

Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding

Procedia PDF Downloads 285
25463 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, we obtain stronger privacy for our data; the results also show that the fog storage gateway has several advantages over traditional cloud storage, namely reduced latency/delay and lower bandwidth and energy consumption, so fog storage will help to lessen excessive cost. The paper dwells mainly on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. The paper also presents the major system components and their framework specifications. Lastly, the overall research system architecture is shown, together with its structure and interrelationships.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 91
25462 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data

Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif

Abstract:

Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is compared here with field data. A wide range of field data was used, consisting of both live-bed and clear-water scour. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method. The best-performing scour equation was analyzed using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large amount of laboratory and field data.
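
A minimal sketch of the two comparison statistics mentioned above: the discrepancy ratio (computed over observed scour depth) and the RMSE of the predictions. The depths below are illustrative values, not the paper's field data.

```python
# Discrepancy ratio and RMSE for a set of pier-scour predictions.
import numpy as np

observed = np.array([1.8, 2.4, 3.1, 2.0, 2.7])   # measured scour depths (m)
computed = np.array([2.0, 2.2, 3.5, 1.9, 3.0])   # depths from a scour equation (m)

discrepancy_ratio = computed / observed           # ideal value is 1.0
rmse = np.sqrt(np.mean((computed - observed) ** 2))

print("mean discrepancy ratio:", round(discrepancy_ratio.mean(), 3))
print("RMSE (m):", round(rmse, 3))
```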

Keywords: field data, local scour, scour equation, wide piers

Procedia PDF Downloads 399
25461 Influence of Water Reservoir Parameters on the Climate and Coastal Areas

Authors: Lia Matchavariani

Abstract:

Water reservoir construction on rivers flowing into the sea complicates coast protection: the seashore starts to degrade, causing coastal erosion and disaster against the backdrop of current climate change. The instruments of a water reservoir's impact on the climate and coastal areas are its contact surface with the atmosphere and the area irrigated with its water or humidified by infiltrated waters. The Black Sea coastline is characterized by the highest ecological vulnerability. The type and intensity of a water reservoir's impact are determined by its morphometry, type of regulation, level regime, and the geomorphological and geological characteristics of the adjoining area. Studies showed that the impact of a water reservoir on the climate and on its comfort parameters is positive if it is located in a zone of insufficient humidity and, vice versa, negative if the water reservoir is located in a zone with abundant humidity. There are many natural and anthropogenic factors determining the peculiarities of a water reservoir's impact on the climate, which can be assessed with maximum accuracy by the so-called "long series" method, which operates on meteorological elements (temperature, wind, precipitation, etc.) using long series formed from stationary observation data. This is a time series consisting of two periods of statistically sufficient duration: the first covers the observations made before the formation of the water reservoir, and the second covers the observations made during its operation. If no such data are available, or the series is statistically short, an "analog" method is used. The analog water reservoir is selected based on the similarity of environmental conditions: it must be located within the zone of the designed water reservoir, under similar environmental conditions, and with a sufficient number of observations made in its coastal zone.

Keywords: coast-constituent sediment, eustasy, meteorological parameters, seashore degradation, water reservoirs impact

Procedia PDF Downloads 40
25460 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol

Authors: Inkyu Kim, SangMan Moon

Abstract:

The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks higher-data-rate COTS data link systems for UAVs. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight is bounded by the performance limitations of the data link protocol.
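
A minimal sketch of a theoretical maximum throughput (TMT) estimate for a single 802.11b sender, following the usual DCF timing budget. The timing constants are the commonly quoted long-preamble DSSS values and the average backoff is approximated as CWmin/2 slots; treat these as assumptions rather than the authors' exact model.

```python
# Approximate 802.11b saturation throughput for one station, no collisions.
SLOT, SIFS = 20e-6, 10e-6
DIFS = SIFS + 2 * SLOT                 # 50 us
PLCP = 192e-6                          # long preamble + PLCP header at 1 Mbps
CWMIN = 31
BACKOFF = (CWMIN / 2) * SLOT           # mean contention-window backoff
DATA_RATE, BASIC_RATE = 11e6, 1e6
MAC_OVERHEAD, ACK_SIZE = 28, 14        # bytes (MAC header + FCS, ACK frame)

def tmt(msdu_bytes: int) -> float:
    """Throughput (bit/s) of one data/ACK exchange cycle."""
    t_data = PLCP + 8 * (MAC_OVERHEAD + msdu_bytes) / DATA_RATE
    t_ack = PLCP + 8 * ACK_SIZE / BASIC_RATE
    cycle = DIFS + BACKOFF + t_data + SIFS + t_ack
    return 8 * msdu_bytes / cycle

print(f"TMT for 1500-byte frames: {tmt(1500) / 1e6:.2f} Mbit/s")  # roughly 6 Mbit/s
```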

Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application

Procedia PDF Downloads 386
25459 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns which are used to classify them into groups. The result of the analysis is a pattern which can be used for the identification of a data set without the need to obtain the input data used for the creation of this pattern. Important requirements in this process are careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used for traceability of an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
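
A minimal sketch of supervised classification of animals into populations from SNP genotypes, analogous to the workflow above. The genotype matrix and labels are synthetic (0/1/2 allele counts with random class assignments), so the accuracy is not meaningful; the study used real Pinzgau cattle data.

```python
# Supervised classification of origin from SNP genotypes (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_animals, n_snps = 120, 200
X = rng.integers(0, 3, size=(n_animals, n_snps))   # SNP genotypes (0/1/2)
y = rng.integers(0, 2, size=n_animals)             # population of origin (2 classes)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)          # careful validation of the model
print("cross-validated accuracy:", round(scores.mean(), 3))
```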

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 541
25458 Router 1X3 - RTL Design and Verification

Authors: Nidhi Gopal

Abstract:

Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device, its top-level architecture, and how the various sub-modules of the router, i.e., the register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.
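
A minimal sketch of a software reference model for the FIFO sub-module, of the kind often used to check RTL simulation results during verification. The depth and data width are hypothetical; the actual router FIFO parameters are not given in the abstract.

```python
# Behavioural reference model of a synchronous FIFO with full/empty flags.
from collections import deque

class FifoModel:
    def __init__(self, depth: int = 16):
        self.depth = depth
        self.mem = deque()

    @property
    def full(self) -> bool:
        return len(self.mem) == self.depth

    @property
    def empty(self) -> bool:
        return len(self.mem) == 0

    def write(self, data: int) -> bool:
        if self.full:
            return False          # write ignored when full
        self.mem.append(data)
        return True

    def read(self):
        return None if self.empty else self.mem.popleft()

fifo = FifoModel(depth=4)
for pkt in (0xA1, 0xB2, 0xC3):
    fifo.write(pkt)
print([fifo.read() for _ in range(3)])   # data comes out in write order
```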

Keywords: data packets, networking, router, routing

Procedia PDF Downloads 798
25457 ‘The Guilt Complex’: Assessing the Guilt of Youth Returning From Terrorist Groups in the Narratives of Justice: A Presentation on the Methodological Opportunities and Concerns in Operational Research

Authors: Arpita Mitra

Abstract:

The research explores the concept of 'guilt' as understood in relation to children and young individuals who are exiting terrorist groups and returning to civilian life ('young returnees'). The study explores young returnees' guilt in its psychological, legal, and sociological manifestations and how it contributes to experiences of reintegration and justice administration. Streamlining it further, the research question on assessing guilt engages with young adults – between 18 and 30 years – who were part of a terrorist organization during their formative years and have returned to civilian life. Overall, the findings of the research are intended to contribute first-hand operational research to the criminological literature as well as to transitional justice mechanisms with regard to narratives on truth, justice, reparations, and institutional reform/guarantees of non-recurrence. This paper focuses on one aspect of this research, namely the added value of conducting operational research and the methodological challenges encountered during this process with regard to informed consent, data protection, mental health, and security considerations for the respondents and the researcher.

Keywords: terrorism, reintegration, young returnees, criminology

Procedia PDF Downloads 55
25456 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research works consider noise as any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data that form part of the main web page are of interest to a user, and not all noise data are actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile but also a decrease in the loss of useful information, hence improving the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 259
25455 Comparison of Clinical Profiles of Patients Seen in a Women and Children Protection Unit in a Local Government Hospital in Makati, Philippines Before and During the COVID-19 Pandemic Between January 2018 to February 2020 and March 2020 to December 2021

Authors: Margaret Denise P. Del Rosario, Geraldine Alcantara

Abstract:

Background: The declaration of the COVID-19 pandemic has impacted hospital visits of child abuse cases, with fewer consults but more severe injuries. Objective: The study aims to identify the clinical profiles of patients seen in the Women and Children Protection Unit of the local government hospital of Makati before and during the pandemic. Design: A cross-sectional analytic study design through a review of records that underwent quantitative analysis. Results: 264 cases pre-pandemic and 208 cases during the pandemic were reviewed. The most reported cases were neglect, comprising 47% of the pre-pandemic cases and 68% of cases during the pandemic, with supervisory neglect most commonly reported. An equal distribution between males and females was seen among victims and alleged perpetrators. The age group of both victims and alleged perpetrators during the pandemic was significantly younger compared to the pre-pandemic period. Children belonging to larger family groups were commonly encountered, with most of them being the eldest among their siblings. Alleged perpetrators were mostly secondary graduates for both time periods. A significantly larger share of cases during the pandemic occurred at home. More patients required hospitalization during the pandemic period, 37% compared to 23% of admissions prior to the pandemic. Furthermore, there was a three-fold increase in injuries sustained during the pandemic that required intensive care. Conclusion: The study reflects increased severity of injuries related to abuse during the pandemic compared to pre-pandemic times. A significant increase in injuries requiring intensive care was also seen despite fewer reported cases.

Keywords: child abuse, COVID-19, violence against children, WCPU, neglect

Procedia PDF Downloads 47
25454 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study

Authors: Zeba Mahmood

Abstract:

Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way business is conducted today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. The data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketing and customer relationship departments of firms use data mining techniques to make relevant decisions. This paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues in the execution of these techniques are also discussed and critically analyzed.

Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining

Procedia PDF Downloads 525
25453 Tourist’s Perception and Identification of Landscape Elements of Traditional Village

Authors: Mengxin Feng, Feng Xu, Zhiyong Lai

Abstract:

As a typical representative of the countryside, traditional Chinese villages are rich in cultural landscape resources and historical information, but they are still in continuous decline. The problems of people's weak protection awareness and low cultural recognition are still serious, and the protection of cultural heritage is imminent. At the same time, with the rapid development of rural tourism, its cultural value has been explored and paid attention to again. From the perspective of tourists, this study aimed to explore people's perception and identity of cultural landscape resources under the current cultural tourism development background. We selected eleven typical landscape elements of Lingshui Village, a traditional village in Beijing, as research objects and conducted a questionnaire survey with two scales of perception and identity to explore the characteristics of people's perception and identification of landscape elements. We found that there was a strong positive correlation between the perception and identity of each element and that geographical location influenced visitors' overall perception. The perception dimensions scored the highest in location, and the lowest in history and culture, and the identity dimensions scored the highest in meaning and lowest in emotion. We analyzed the impact of visitors' backgrounds on people's perception and identity characteristics and found that age and education were two important factors. The elderly had a higher degree of perceived identity, as the familiarity effect increased their attention. Highly educated tourists had more stringent criteria for perception and identification. The above findings suggest strategies for conserving and optimizing landscape elements in the traditional village to improve the acceptance and recognition of cultural information in traditional villages, which will inject new vitality into the development of traditional villages.

Keywords: traditional village, tourist perception, landscape elements, perception and identity

Procedia PDF Downloads 132
25452 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large data sets. Over time, data mining results become stale and obsolete, and incremental processing is a promising approach to refreshing them: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value-pair-level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grained computation states. To assess the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
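
A toy sketch of key-value-pair-level incremental processing in the spirit of i2MapReduce: instead of recomputing a word count from scratch, only the deltas contributed by changed records are re-applied to the preserved state. This is an illustration of the idea, not the i2MapReduce API.

```python
# Incremental refresh of a word-count result when one record changes.
from collections import Counter

def map_count(records):
    out = Counter()
    for rec in records:
        out.update(rec.split())
    return out

state = map_count(["big data mining", "map reduce mining"])   # preserved result

# One record changes: subtract its old contribution, add the new one.
old_rec, new_rec = "map reduce mining", "map reduce iterative mining"
state.subtract(map_count([old_rec]))
state.update(map_count([new_rec]))

print(dict(state))   # counts refreshed without touching unchanged records
```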

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 342
25451 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques

Authors: John Onyima, Ikechukwu Ezepue

Abstract:

Virtualization and cloud computing are among the fast-growing computing innovations of recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of the organization's infrastructure, improvement of efficient resource utilization, and cost reduction. However, this technology brings new security threats and challenges regarding safety, reliability, and data confidentiality. Evidently, no single security technique can guarantee security or protection against malicious attacks on a cloud computing network; hence, an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using the Snort algorithm. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
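
A minimal sketch of combining a density-based anomaly detector with simple signature matching over connection records. LocalOutlierFactor is used here only as a stand-in for the LDFGB algorithm, and the string patterns are toy signatures, not Snort rules; traffic features and payloads are invented for illustration.

```python
# Hybrid alerting: density-based anomaly flag OR signature match raises an alert.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

# Toy traffic features: [bytes_sent, connection_duration_s]
traffic = np.array([[500, 1.2], [520, 1.0], [480, 1.1], [510, 1.3], [90000, 30.0]])
payloads = ["GET /index", "GET /home", "GET /login", "GET /about",
            "GET /../../etc/passwd"]

anomaly_flags = LocalOutlierFactor(n_neighbors=3).fit_predict(traffic) == -1
signatures = ("../", "cmd.exe", "/etc/passwd")
signature_flags = [any(sig in p for sig in signatures) for p in payloads]

for i, (a, s) in enumerate(zip(anomaly_flags, signature_flags)):
    if a or s:
        print(f"connection {i}: anomaly={a}, signature={s} -> raise alert")
```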

Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection

Procedia PDF Downloads 294
25450 Modelling Insider Attacks in Public Cloud

Authors: Roman Kulikov, Svetlana Kolesnikova

Abstract:

Over the last decade, cloud computing technologies have rapidly become ubiquitous. Each year more organizations, corporations, internet services, and social networks trust their business-sensitive information to the public cloud. Data storage in the public cloud is protected by security mechanisms such as firewalls, cryptographic algorithms, backups, etc. In this way, however, only outsider attacks can be prevented, whereas virtualization tools can be easily compromised by an insider. The protection of the public cloud's critical elements from an internal intruder remains extremely challenging. A hypervisor, also called a virtual machine manager, is a program that allows multiple operating systems (OS) to share a single hardware processor in cloud computing. One of the hypervisor's functions is to enforce access control policies; furthermore, it prevents guest OSs from disrupting each other and from accessing each other's memory or disk space. The hypervisor is one of the most critical and vulnerable elements in the cloud computing infrastructure; nevertheless, it has been poorly protected from being compromised by an insider. By exploiting certain vulnerabilities, privilege escalation can be easily achieved in insider attacks on the hypervisor. In this way, an internal intruder who has compromised one process is able to gain control of the entire virtual machine. The consequences of insider attacks in the public cloud may therefore be more catastrophic and significant for virtual tools and sensitive data than those of outsider attacks. So far, almost no preventive security countermeasures have been developed, and little attention has been paid to developing models to assist risk mitigation strategies. In this paper, a formal model of insider attacks on the hypervisor is designed. Our analysis identifies critical hypervisor vulnerabilities that can be easily compromised by an internal intruder; consequently, possible conditions for successful attack implementation are uncovered. Hence, the development of preventive security countermeasures can be improved on the basis of the proposed model.

Keywords: insider attack, public cloud, cloud computing, hypervisor

Procedia PDF Downloads 357
25449 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)

Authors: Azimollah Aleshzadeh, Enver Vural Yavuz

Abstract:

The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), for Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models, while the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences that were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUC values for the success rates are 0.7055, 0.7221, and 0.7368, while the prediction rates are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively. Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the proportion of construction and verification landslide incidences in the high and very high landslide susceptibility classes of each map was determined. The results showed that the EWM, EBF, and ICM models produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and planning purposes for environmental protection.
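
A minimal sketch of the success/prediction-rate style validation reported above: a model's susceptibility index is scored against landslide presence/absence cells with a ROC curve and its AUC. The index values and labels below are illustrative only, loosely correlated so the AUC is non-trivial.

```python
# ROC-AUC validation of a susceptibility index against landslide cells.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=200)                 # 1 = landslide cell
# Hypothetical susceptibility index from a bivariate statistical model.
index = labels * 0.4 + rng.random(200) * 0.6

print("AUC:", round(roc_auc_score(labels, index), 4))
```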

Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping

Procedia PDF Downloads 128
25448 Exploitation behind the Development of Home Batik Industry in Lawean, Solo, Central Java

Authors: Mukhammad Fatkhullah, Ayla Karina Budita, Cut Rizka Al Usrah, Kanita Khoirun Nisa, Muhammad Alhada Fuadilah Habib, Siti Muslihatul Mukaromah

Abstract:

The batik industry has become one of the leading industries in the economy of Indonesia. Since the recognition of batik by UNESCO as part of the cultural wealth and national identity of Indonesia, batik production keeps increasing as a result of increasing demand for batik, whether domestic or from abroad. One of the rapidly developing batik industries in Indonesia is the batik industry in Lawean Village, Solo, Central Java, Indonesia. This batik industry generally uses a putting-out system, in which batik workers work in their own houses. With this system in place, employers do not have to prepare an Environmental Impact Analysis (EIA), social security for workers, overtime payment, working space, or working equipment. The implementation of the putting-out system causes many problems, from environmental pollution to the loss of workers' social rights and even the exploitation of workers by batik entrepreneurs. The data used to describe this reality are primary data from qualitative research with in-depth interviews as the data collection technique. Informants were determined purposively. The theory used for data interpretation is the phenomenology of Alfred Schutz. Both qualitative methods and phenomenology are used in this study to describe the exploitation of batik workers under the putting-out system in the home batik industry in Lawean. The research results show that workers in the batik industry sector in Lawean are exploited through the implementation of the putting-out system. The workers are strictly employed by the entrepreneurs, so their job can no longer be called a 'part-time' job. In terms of labor and time, the workers often work more than 12 hours per day and often work overtime without receiving any overtime payment. In terms of work safety, the workers often have contact with the chemical substances contained in batik-making materials without using any protection, such as work clothes, which is worsened by the lack of standards or procedures for work that could prevent physical damage such as burned and peeling skin. Moreover, exposure to and contamination by chemical materials make the workers and their families vulnerable to various diseases. Meanwhile, batik entrepreneurs do not provide any social security (including health cost aid). Besides that, the researchers found that the home batik industry is not environmentally friendly and even damages the ecosystem, because industrial waste is disposed of without an EIA.

Keywords: exploitation, home batik industry, occupational health and safety, putting-out system

Procedia PDF Downloads 309
25447 Laboratory Simulation of Subway Dynamic Stray Current Interference with Cathodically Protected Structures

Authors: Mohammad Derakhshani, Saeed Reza Allahkaram, Michael Isakani-Zakaria, Masoud Samadian, Hojat Sharifi Rasaey

Abstract:

Dynamic stray currents tend to change their magnitude and polarity with time at their source, which creates anodic and cathodic spots on a nearby interfered structure. To date, one of the biggest known sources of dynamic stray current is DC traction systems. Laboratory simulation is a suitable method to apply theoretical principles in order to identify the effective parameters in corrosion influenced by dynamic stray current. Simulation techniques can be utilized to test various mitigation methods at small scale in order to select the most efficient method for field applications. In this research, a laboratory simulation of the potential fluctuations caused by dynamic stray current on a cathodically protected structure was carried out. A lab model capable of generating DC static and dynamic stray currents and simulating their effects on cathodically protected samples was developed, based on the stray-current-induced (contact-less) polarization technique. Stray current pick-up and discharge spots on an influenced structure were simulated by inducing fluctuations in the sample's stationary potential. Two mitigation methods for dynamic stray current interference on buried structures, namely the application of sacrificial anodes as preferred discharge points for the stray current and potential-controlled cathodic protection, were investigated. Results showed that the application of sacrificial anodes can be effective in reducing interference only at the discharge spot, whereas cathodic protection through potential control is more suitable for mitigating dynamic stray current effects.

Keywords: simulation, dynamic stray current, fluctuating potentials, sacrificial anode

Procedia PDF Downloads 293
25446 Ontology for Cross-Site-Scripting (XSS) Attack in Cybersecurity

Authors: Jean Rosemond Dora, Karol Nemoga

Abstract:

In this work, we tackle a problem that frequently occurs in the cybersecurity field: the exploitation of websites by XSS attacks, which are nowadays considered complicated attacks. These types of attacks aim to execute malicious scripts in the client's web browser by including code in a legitimate web page. A serious matter arises when a website accepts the "user-input" option: attackers can exploit the web application (if vulnerable) and then steal sensitive data (session cookies, passwords, credit cards, etc.) from the server and/or from the client. However, the difficulty of exploitation varies from website to website. Our focus is on the usage of ontology in cybersecurity against XSS attacks, on the importance of the ontology, and on its core meaning for cybersecurity. We explain how a vulnerable website can be exploited and how different JavaScript payloads can be used to detect vulnerabilities. We also enumerate some tools to use for an efficient analysis. We present detailed reasoning on what can be done to improve the security of a website in order to resist attacks, and we provide supporting examples. Then, we apply an ontology model against XSS attacks to strengthen the protection of a web application. However, we note that the existence of an ontology does not improve security by itself; it has to be properly used, and a maximum number of security layers should be taken into account.

Keywords: cybersecurity, web application vulnerabilities, cyber threats, ontology model

Procedia PDF Downloads 164
25445 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
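
A minimal sketch of the divide-and-conquer idea for a simple parametric setting: the data are split into subsets, an estimate (and its variance) is obtained on each subset, and the subset estimates are combined with inverse-variance weights. A sample mean stands in here for the frailty-model MLE; the data are simulated, not the heart-failure dataset.

```python
# Divide-and-conquer estimation with inverse-variance weighted combination.
import numpy as np

rng = np.random.default_rng(42)
data = rng.exponential(scale=2.0, size=100_000)   # e.g. gap times between events

subsets = np.array_split(data, 20)
estimates = np.array([s.mean() for s in subsets])
variances = np.array([s.var(ddof=1) / len(s) for s in subsets])

weights = (1.0 / variances) / np.sum(1.0 / variances)
combined = np.sum(weights * estimates)

print("combined estimate:", round(combined, 4))
print("full-data estimate:", round(data.mean(), 4))   # should be very close
```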

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 158
25444 Simulation Study of Multiple-Thick Gas Electron Multiplier-Based Microdosimeters for Fast Neutron Measurements

Authors: Amir Moslehi, Gholamreza Raisali

Abstract:

Microdosimetric detectors based on multiple-thick gas electron multiplier (multiple-THGEM) configurations are being used in various fields of radiation protection and dosimetry. In the present work, microdosimetric response of these detectors to fast neutrons has been investigated by Monte Carlo method. Three similar microdosimeters made of A-150 and rexolite as the wall materials are designed; the first based on single-THGEM, the second based on double-THGEM and the third is based on triple-THGEM. Sensitive volume of the three microdosimeters is a right cylinder of 5 mm height and diameter which is filled with the propane-based tissue-equivalent (TE) gas. The TE gas with 0.11 atm pressure at the room temperature simulates 1 µm of tissue. Lineal energy distributions for several neutron energies from 10 keV to 14 MeV including 241Am-Be neutrons are calculated by the Geant4 simulation toolkit. Also, mean quality factor and dose-equivalent value for any neutron energy has been determined by these distributions. Obtained data derived from the three microdosimeters are in agreement. Therefore, we conclude that the multiple-THGEM structures present similar microdosimetric responses to fast neutrons.

Keywords: fast neutrons, geant4, multiple-thick gas electron multiplier, microdosimeter

Procedia PDF Downloads 344
25443 Web Map Service for Fragmentary Rockfall Inventory

Authors: M. Amparo Nunez-Andres, Nieves Lantada

Abstract:

One of the most harmful geological risks is rockfalls. They cause both economic losses, with damage to buildings and infrastructure, and personal ones. Therefore, in order to estimate the risk to the exposed elements, it is necessary to know the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several inventories of rockfalls were carried out along the northeast of the Spanish peninsula and the island of Mallorca. These inventories contain general information about the events and, importantly, detailed information about fragmentation. Specifically, the IBSD (In situ Block Size Distribution) is obtained by photogrammetry from drone or TLS (Terrestrial Laser Scanner) surveys, and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and following interoperability standards is necessary. Throughout the process, open-source software has been used: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information. We used the INSPIRE data specifications for natural risks, adding specific and detailed data about the fragmentation distribution. The next step was to develop a WMS with GeoServer; a previous phase was the creation of several views in PostGIS to show the information at different visualization scales and with different degrees of detail. In the first view, the sites are identified with a point, and basic information about the rockfall event is provided. At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, the source of the event and the fragments are represented by symbols, and queries at this level offer greater detail about the movement. Eventually, the third level shows all elements (deposit, source, and blocks) in their real size, where possible, and in their real location. The last task was the publication of all the information on a web mapping site (www.rockdb.upc.edu), with data classified by levels using JavaScript libraries such as OpenLayers.
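
A minimal sketch of querying a WMS endpoint such as the one described above with a standard OGC GetMap request. The service path, layer name, and bounding box are hypothetical assumptions; the real values would come from the service's GetCapabilities document.

```python
# Fetch a rendered map image from a WMS endpoint via an OGC GetMap request.
import requests

WMS_URL = "https://rockdb.upc.edu/geoserver/wms"   # assumed endpoint path

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "rockfalls:events",        # hypothetical layer name
    "crs": "EPSG:4326",
    "bbox": "41.3,1.8,41.5,2.1",         # lat/lon order for WMS 1.3.0 + EPSG:4326
    "width": 800,
    "height": 600,
    "format": "image/png",
}

resp = requests.get(WMS_URL, params=params, timeout=30)
if resp.ok and resp.headers.get("Content-Type", "").startswith("image"):
    with open("rockfall_events.png", "wb") as f:
        f.write(resp.content)
```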

Keywords: geological risk, web mapping, WMS, rockfalls

Procedia PDF Downloads 157
25442 Regulation Aspects for a Radioisotope Production Installation in Brazil

Authors: Rian O. Miranda, Lidia V. de Sa, Julio C. Suita

Abstract:

The Brazilian Nuclear Energy Commission (CNEN) is the main manufacturer of radiopharmaceuticals in Brazil. The Nuclear Engineering Institute (IEN), located in Rio de Janeiro, is one of its main centers of research and production, serving public and private hospitals in the state. This radiopharmaceutical production is used in diagnostic and therapy procedures and allows one and a half million nuclear medicine procedures annually. Despite this, the country is not self-sufficient in meeting national demand, creating the need for importation and consequent dependence on other countries. Moreover, the IEN facilities were designed in the 1960s, and today their structure is inadequate with respect to the good manufacturing practices established by the sanitary regulator (ANVISA) and to radiological protection requirements, leading to the need for a new project. In order to adapt and increase production in the country, a new plant will be built and integrated with the existing facilities, with a new 30 MeV cyclotron that is currently in the detailed design stage. Thus, it is proposed to survey the current CNEN and ANVISA standards for radiopharmaceutical production facilities, as well as the radiological protection analysis of each area of the plant, following the good manufacturing practice recommendations adopted nationally and the licensing requirements for radioactive facilities. In this way, the main requirements for proper operation, equipment location, building materials, area classification, and the maintenance program have been established. The access controls, interlocks, segregation zones, and pass-through boxes integrated into the project were also analyzed. As a result, IEN will in the future have the flexibility to produce all the radioisotopes necessary for nuclear medicine applications more efficiently by simultaneously bombarding two targets, allowing the simultaneous production of two different radioisotopes, minimizing radiation exposure, and saving operating costs.

Keywords: cyclotron, legislation, norms, production, radiopharmaceuticals

Procedia PDF Downloads 131
25441 Hyperspectral Mapping Methods for Differentiating Mangrove Species along Karachi Coast

Authors: Sher Muhammad, Mirza Muhammad Waqar

Abstract:

It is necessary to monitor and identify mangrove types and their spatial extent near coastal areas because mangroves play an important role in the coastal ecosystem and environmental protection. This research aims at identifying and mapping mangrove types along the Karachi coast, ranging from 24.79 to 24.85 degrees in latitude and 66.91 to 66.97 degrees in longitude, using hyperspectral remote sensing data and techniques. An image acquired in February 2012 by the Hyperion sensor has been used for this research. Image preprocessing includes geometric and radiometric correction, followed by Minimum Noise Fraction (MNF) and Pixel Purity Index (PPI). The output of MNF and PPI has been analyzed by visualizing it in n dimensions for end-member extraction. Well-distributed clusters on the n-dimensional scatter plot have been selected with the region of interest (ROI) tool as end members. These end members have been used as input for the classification techniques applied to identify and map mangrove species, including Spectral Angle Mapper (SAM), Spectral Feature Fitting (SFF), and Spectral Information Divergence (SID). Only two types of mangroves, namely Avicennia marina (white mangroves) and Avicennia germinans (black mangroves), have been observed throughout the study area.
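
A minimal sketch of the Spectral Angle Mapper (SAM) rule: each pixel spectrum is assigned to the end member with the smallest spectral angle. The spectra below are random stand-ins for Hyperion pixel and end-member spectra, and the band count is only indicative.

```python
# Spectral Angle Mapper classification of one pixel against two end members.
import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))   # angle in radians

rng = np.random.default_rng(7)
n_bands = 198                                    # roughly the usable Hyperion bands
endmembers = {"Avicennia marina": rng.random(n_bands),
              "Avicennia germinans": rng.random(n_bands)}
pixel = rng.random(n_bands)

angles = {name: spectral_angle(pixel, ref) for name, ref in endmembers.items()}
print("assigned class:", min(angles, key=angles.get), angles)
```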

Keywords: mangrove, hyperspectral, hyperion, SAM, SFF, SID

Procedia PDF Downloads 359
25440 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of the professional competencies and data science capabilities that influence BD adoption in chemical industries, to help them move towards intelligent manufacturing quickly and reliably. The article uses a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 77
25439 Factors Affecting M-Government Deployment and Adoption

Authors: Saif Obaid Alkaabi, Nabil Ayad

Abstract:

Governments constantly seek to offer faster, more secure, efficient, and effective services for their citizens. Recent changes and developments in communication services and technologies, mainly due to the Internet, have led to immense improvements in the way governments of advanced countries carry out their internal operations. Therefore, advances in e-government services have been broadly adopted and used in various developed countries, as well as being adapted to developing countries. The implementation of these advances depends on the utilization of the most innovative data techniques, mainly in web-dependent applications, to enhance the main functions of governments. These functions, in turn, have spread to mobile and wireless techniques, generating a new, more advanced direction called m-government. This paper discusses a selection of available m-government applications and several business modules and frameworks in various fields. Practically, the m-government models, techniques, and methods have become the improved version of e-government. M-government offers the potential for applications which will work better, providing citizens with services that utilize mobile communication and data models incorporating several government entities. Developing countries can benefit greatly from this innovation because a large percentage of their population is young and can adapt to new technology, and because mobile computing devices are more affordable. The use of mobile transaction models encourages effective participation through the use of mobile portals by businesses, various organizations, and individual citizens. Although the application of m-government has great potential, it does have major limitations. These include the implementation of wireless networks and related communications, the encouragement of mobile diffusion, the administration of complicated tasks concerning the protection of security (including the ability to offer privacy of information), and the management of the legal issues concerning mobile applications and the utilization of services.

Keywords: e-government, m-government, system dependability, system security, trust

Procedia PDF Downloads 376
25438 Evaluation and Proposal for Improvement of the Flow Measurement Equipment in the Bellavista Drinking Water System of the City of Azogues

Authors: David Quevedo, Diana Coronel

Abstract:

The present article carries out an evaluation of the drinking water system in the Bellavista sector of the city of Azogues, with the purpose of determining the appropriate equipment to record the actual consumption flows of the inhabitants of said sector. Taking into account that the study area is located in a rural and economically disadvantaged zone, there is an urgent need to establish a control system for drinking water consumption in order to conserve and manage the vital resource in the best possible way, considering that the water source supplying this sector is approximately 9 km away. The research began with the collection of cartographic, demographic, and statistical data for the sector, determining the coverage area, population projection, and a provision that guarantees the supply of drinking water to meet the water needs of the sector's inhabitants. Through hydraulic modeling with the United States Environmental Protection Agency's EPANET 2.0 software for modeling drinking water distribution systems, theoretical hydraulic data were obtained and used to design and justify the most suitable measuring equipment for the Bellavista drinking water system. Taking into account a minimum service life of the drinking water system of 30 years, future flow rates were calculated for the design of the macro-measuring device. After analyzing the network, it was evident that the Bellavista sector has an average consumption of 102.87 liters per person per day, but considering that Ecuadorian regulations recommend a provision of 180 liters per person per day for the geographical conditions of the sector, this value was used for the analysis. With all the collected and calculated information, the conclusion was reached that the Bellavista drinking water system needs a 125 mm electromagnetic macro-measuring device for the first three quinquenniums of its service life and a 150 mm diameter device for the following three quinquenniums. Having equipment that provides real and reliable data will allow control of the water consumption of the sector's population, measured through micro-measuring devices installed at the entrance of each household, whose readings should match those of the macro-measuring device placed after the storage tank outlet, in order to control losses that may occur due to leaks in the drinking water system or illegal connections.
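
A minimal sketch of the sizing arithmetic behind the meter selection: project the population, apply the 180 L/person/day provision cited above, and derive average and peak flows for different horizons. The current population, growth rate, and peak coefficients are assumptions for illustration, not the values adopted in the study.

```python
# Design-flow estimation for macro-meter sizing (illustrative parameters).
POPULATION_NOW = 2500          # hypothetical current population of Bellavista
GROWTH_RATE = 0.015            # assumed annual geometric growth rate
DOTATION = 180                 # L/person/day (Ecuadorian recommendation cited above)
K1, K2 = 1.3, 2.0              # assumed max-day and max-hour coefficients

def design_flows(years: int):
    """Population and average/peak flows (L/s) at a given planning horizon."""
    pop = POPULATION_NOW * (1 + GROWTH_RATE) ** years
    q_avg = pop * DOTATION / 86400          # average daily demand in L/s
    q_max_day = K1 * q_avg
    q_max_hour = K2 * q_max_day
    return pop, q_avg, q_max_day, q_max_hour

for horizon in (15, 30):                    # first vs. last quinquenniums
    pop, qa, qd, qh = design_flows(horizon)
    print(f"{horizon} yr: pop={pop:.0f}, Qavg={qa:.2f} L/s, "
          f"Qmax-day={qd:.2f} L/s, Qmax-hour={qh:.2f} L/s")
```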

Keywords: macrometer, hydraulics, endowment, water

Procedia PDF Downloads 69