Search results for: data center
24922 Proposal to Increase the Efficiency, Reliability and Safety of the Centre of Data Collection Management and Their Evaluation Using Cluster Solutions
Authors: Martin Juhas, Bohuslava Juhasova, Igor Halenar, Andrej Elias
Abstract:
This article deals with the possibility of increasing the efficiency, reliability and safety of the system for teledosimetric data collection management and evaluation, as part of the complex study for the activity “Research of data collection, their measurement and evaluation with mobile and autonomous units” within the project “Research of monitoring and evaluation of non-standard conditions in the area of nuclear power plants”. Possible weaknesses in the existing system are identified, and a study of available cluster solutions, with the possibility of deploying them in the analysed system, is presented.
Keywords: teledosimetric data, efficiency, reliability, safety, cluster solution
Procedia PDF Downloads 515
24921 Chemical Risk Posed by Hospital Liquid Effluents Example CHU Beni Messous Algiers
Authors: Laref Nabil
Abstract:
Ecology is at the center of many debates and international regulations and has therefore become a necessity and a privileged policy axis in many countries. Given the rise of environmental problems, the hospital, as a public health actor, must lead by example in hygiene and in the prevention of risks to people and their environment. It therefore seemed interesting to prepare a poster on hospital liquid effluents in order to present not only the regulatory aspects but also their degree of pollution and their management in health institutions. Materials and methods: Samples were taken at several inspection points and analysed at the Reghaia wastewater treatment plant (STEP), Algiers. Discussion and/or findings: Overall, the analysis results for water from the central inspection point show that the concentrations of the various physico-chemical parameters greatly exceed the standards. Although the hypothesis of assimilating hospital liquid effluents to domestic wastewater is confirmed, the liquid effluents from the University Hospital of Beni Messous that are discharged into the natural environment still represent an ecotoxicological risk.
Keywords: health, hospital, liquid effluents, water
Procedia PDF Downloads 448
24920 Efficient Storage in Cloud Computing by Using Index Replica
Authors: Bharat Singh Deora, Sushma Satpute
Abstract:
Cloud computing is based on resource sharing, and storage, like other resources, can be shared. Collective storage resources from different locations can be used while a central index table of storage details is maintained. Combining storage from different places forms a suitable data store that is operated from one location and is very economical. Proper storage of data should improve data reliability, availability and bandwidth utilization, and contents can be moved from one storage location to another as needed.
Keywords: cloud computing, cloud storage, IaaS, PaaS, SaaS
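The mechanism described here is a central index table that records where each stored object lives, with a replica of that index kept for reliability. Below is a minimal, hypothetical sketch of such an index; the class, field names and replica handling are illustrative assumptions, not the authors' implementation:

```python
import hashlib
import copy

class StorageIndex:
    """Central index table mapping object IDs to their storage locations."""

    def __init__(self):
        self.index = {}    # object_id -> {"location": str, "size": int, "checksum": str}
        self.replica = {}  # secondary copy of the index kept in sync

    def register(self, object_id, location, data):
        entry = {
            "location": location,
            "size": len(data),
            "checksum": hashlib.sha256(data).hexdigest(),
        }
        self.index[object_id] = entry
        self.replica[object_id] = copy.deepcopy(entry)   # keep the index replica consistent

    def move(self, object_id, new_location):
        """Update the record when an object's contents move to another store."""
        self.index[object_id]["location"] = new_location
        self.replica[object_id]["location"] = new_location

    def lookup(self, object_id):
        # Fall back to the replica if the primary index entry is unavailable.
        return self.index.get(object_id) or self.replica.get(object_id)

# Usage: register an object stored in region A, then move it to region B.
idx = StorageIndex()
idx.register("report-2024", "region-a/bucket-1", b"quarterly figures")
idx.move("report-2024", "region-b/bucket-7")
print(idx.lookup("report-2024")["location"])   # region-b/bucket-7
```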
Procedia PDF Downloads 340
24919 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning
Authors: T. Bryan, V. Kepuska, I. Kostnaic
Abstract:
A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” that are learned from the audio data itself. The basis vectors are shown to provide higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using “envelope samples” of windowed segments of the audio data. The envelope samples are extracted by identifying audio data segments that are locally coherent with the Gabor or gammatone seed atoms, found by matching pursuit, and are formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms, for speech signals as well as for early American music recordings. The basis vectors are shown to have higher denoising capability for data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors; this display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the basis vectors with the highest denoising capability.
Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit
Procedia PDF Downloads 253
24918 Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud
Authors: Maha Shamseddine, Amjad Nusayr, Wassim Itani
Abstract:
In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied to user data when it is stored and processed in Cloud data centers. The proposed architecture consists of establishing user-configurable ‘sticky’ policies on the Graphical User Interface (GUI) data-bound components during the application development phase to specify the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control over the degree and scope of privacy enforcement, including the type of execution containers used to process the data in the Cloud. This not only enhances the privacy awareness of the developed Cloud services but also yields major performance and energy-efficiency gains, because the privacy mechanisms are applied only to sensitive data units rather than to all user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.
Keywords: privacy enforcement, platform-as-a-service privacy awareness, cloud computing privacy
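As a purely illustrative sketch (not the authors' Azure implementation), a sticky policy that travels with a GUI data-bound component and selects the enforcement applied to its contents might look like this; the class, field and container names are assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class PrivacyClass(Enum):
    """Illustrative privacy classification levels for data-bound GUI components."""
    PUBLIC = 0          # no enforcement needed
    CONFIDENTIAL = 1    # encrypt at rest
    SENSITIVE = 2       # encrypt at rest and process only in an isolated container

@dataclass
class StickyPolicy:
    component_id: str          # the GUI component the policy travels with
    privacy_class: PrivacyClass
    execution_container: str   # e.g. "shared" or "isolated-vm" (assumed names)

    def enforce(self, value: bytes) -> bytes:
        # Apply enforcement only when the classification requires it,
        # so public content incurs no cryptographic overhead.
        if self.privacy_class is PrivacyClass.PUBLIC:
            return value
        return encrypt_for_container(value, self.execution_container)

def encrypt_for_container(value: bytes, container: str) -> bytes:
    # Placeholder standing in for the platform's actual encryption service.
    return b"enc(" + value + b")@" + container.encode()

# A form field carrying patient data gets a SENSITIVE sticky policy at design time.
policy = StickyPolicy("patient_name_field", PrivacyClass.SENSITIVE, "isolated-vm")
print(policy.enforce(b"Jane Doe"))
```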
Procedia PDF Downloads 227
24917 Compass Bar: A Visualization Technique for Out-of-View-Objects in Head-Mounted Displays
Authors: Alessandro Evangelista, Vito M. Manghisi, Michele Gattullo, Enricoandrea Laviola
Abstract:
In this work, we propose a custom visualization technique for out-of-view objects in Virtual and Augmented Reality applications using Head-Mounted Displays. In the last two decades, Augmented Reality (AR) and Virtual Reality (VR) technologies have experienced remarkable growth in applications for navigation, interaction, and collaboration in different types of environments, real or virtual. Both environments can be very complex, as they can include many virtual objects located in different places. Given the natural limitation of the human Field of View (about 210° horizontal and 150° vertical), humans cannot perceive objects outside this angular range. Moreover, despite recent technological advances in AR and VR Head-Mounted Displays (HMDs), these devices still suffer from a limited Field of View, especially Optical See-Through displays, which greatly amplifies the challenge of visualizing out-of-view objects. This problem is not negligible when the user needs to be aware of the number and position of out-of-view objects in the environment, for instance during a maintenance operation on a construction site where virtual objects serve to improve awareness of dangers. Providing such information can enhance comprehension of the scene, enable fast navigation and focused search, and improve users' safety. In our research, we investigated how to represent out-of-view objects in HMD User Interfaces (UI). Inspired by commercial video games such as Call of Duty: Modern Warfare, we designed a customized Compass Bar. By exploiting the Unity 3D graphics engine, we implemented our custom solution so that it can be used both in AR and VR environments. The Compass Bar consists of a graduated bar (in degrees) at the top center of the UI. The values of the bar range from -180 (far left) to +180 (far right), with zero placed directly in front of the user. Two vertical lines on the bar show the amplitude of the user's field of view. Every virtual object within the scene is represented on the compass bar as a color-coded proxy icon (a circular ring with a colored dot at its center). To provide the user with information about distance, we implemented a specific algorithm that increases the size of the inner dot as the user approaches the virtual object (i.e., when the user reaches the object, the dot fills the ring). This visualization technique for out-of-view objects has some advantages. It allows users to be quickly aware of the number and position of the virtual objects in the environment: if the compass bar displays a proxy icon at about +90, users immediately know that the virtual object is to their right, and so on. Furthermore, by having qualitative information about distance, users can optimize their speed, thus gaining effectiveness in their work. Given the small size and position of the Compass Bar, our solution also helps lessen the occlusion problem, increasing user acceptance and engagement. As soon as lockdown measures allow, we will carry out user tests comparing this solution with other state-of-the-art techniques such as 3D Radar, SidebARs and EyeSee360.
Keywords: augmented reality, situation awareness, virtual reality, visualization design
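The authors implement the technique in Unity; the sketch below only illustrates the underlying math in Python, under our own assumptions about ranges and sizes: mapping an object's bearing relative to the user's view direction onto the -180…+180 bar, and growing the proxy icon's inner dot as the object gets closer:

```python
import math

def compass_position(user_pos, user_heading_deg, obj_pos):
    """Signed bearing (degrees) of an object relative to the user's view direction.

    Result is in [-180, +180]: negative = object to the user's left,
    positive = to the right, 0 = straight ahead.
    """
    dx = obj_pos[0] - user_pos[0]
    dz = obj_pos[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dz))                       # world-space bearing
    return (bearing - user_heading_deg + 180.0) % 360.0 - 180.0      # relative to the view

def dot_radius(distance, max_distance=20.0, ring_radius=10.0, min_radius=1.0):
    """Grow the proxy icon's inner dot as the user approaches the object."""
    closeness = max(0.0, 1.0 - distance / max_distance)   # 0 far away, 1 at the object
    return min_radius + closeness * (ring_radius - min_radius)

# An object 5 m ahead and slightly to the right of a user facing heading 0°.
angle = compass_position(user_pos=(0.0, 0.0), user_heading_deg=0.0, obj_pos=(2.0, 5.0))
print(round(angle, 1), round(dot_radius(math.hypot(2.0, 5.0)), 1))   # ~21.8°, dot size
```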
Procedia PDF Downloads 127
24916 Psychological Well-Being Among the Freed Kamhalari Girls in Dang
Authors: Jug Maya Chaudhary
Abstract:
The principal objective of this paper has been to assess the level of psychological well-being (PWB) of freed Kamhalari girls sheltered in a governmental rehabilitation center in the Dang district. All the girls (N=100) were selected for a quantitative study, including 15 cases of in-depth interviews for a qualitative study, in 2013. The results suggest that the level of psychological well-being of the freed Kamhalaris is not high; rather it is moderate, with a small incidence of lower levels of psychological well-being. Regarding the qualitative study, a total of six themes were identified: physical pain and fatigue then and now, the lasting experience of anxiety, unfair treatment, low self-esteem, depressed mood, and frustration due to current circumstances and confusion. These themes reflect the unrelenting, intrusive nature of the painful experiences of those affected. This research provides empathic insight into their past experience, adds to the body of research on the psychological well-being of freed Kamhalari girls, and may generate ideas for intervention research.
Keywords: Kamhalari, experiences, Tharu, psychological well-being
Procedia PDF Downloads 56
24915 Estimating Tree Height and Forest Classification from Multi Temporal Risat-1 HH and HV Polarized Satellite Aperture Radar Interferometric Phase Data
Authors: Saurav Kumar Suman, P. Karthigayani
Abstract:
In this paper, tree height is estimated and forest types are classified from multi-temporal RISAT-1 Horizontal-Horizontal (HH) and Horizontal-Vertical (HV) polarized Synthetic Aperture Radar (SAR) data. The novelty of the proposed project is the combined use of the backscattering coefficients (sigma naught) and the coherence, through the Water Cloud Model (WCM). The approach uses three main steps: (a) extraction of the forest parameter data from the Product.xml, BAND-META and Grid-xxx.txt files that come with the HH and HV polarized data from ISRO (Indian Space Research Organisation); these files contain the parameters required for height estimation; (b) calculation of the vegetation and ground backscattering, the coherence and other forest parameters; and (c) classification of forest types using the ENVI 5.0 tool and ROI (Region of Interest) calculation.
Keywords: RISAT-1, classification, forest, SAR data
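For reference, a minimal sketch of the Water Cloud Model in its standard Attema–Ulaby form and a simple numerical inversion for canopy height; the coefficients A and B, the incidence angle and the ground backscatter value are illustrative assumptions, not values from the paper:

```python
import math

def wcm_backscatter(height_m, sigma0_ground, A=0.15, B=0.12, theta_deg=30.0):
    """Forward Water Cloud Model: total backscatter (linear units) as a function of a
    vegetation descriptor, here taken as canopy height."""
    cos_t = math.cos(math.radians(theta_deg))
    gamma2 = math.exp(-2.0 * B * height_m / cos_t)        # two-way canopy attenuation
    sigma_veg = A * height_m * cos_t * (1.0 - gamma2)     # volume (vegetation) contribution
    return sigma_veg + gamma2 * sigma0_ground             # attenuated ground contribution

def invert_height(sigma0_obs, sigma0_ground, lo=0.0, hi=40.0, tol=1e-3):
    """Invert the forward model for height by bisection; the observed backscatter is
    assumed to lie on the increasing (vegetation-dominated) branch of the model."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if wcm_backscatter(mid, sigma0_ground) < sigma0_obs:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Recover the height used to simulate an observation.
simulated = wcm_backscatter(18.0, sigma0_ground=0.05)
print(round(invert_height(simulated, sigma0_ground=0.05), 2))   # ~18.0 m
```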
Procedia PDF Downloads 407
24914 Presenting a Model for Predicting the State of Being Accident-Prone of Passages According to Neural Network and Spatial Data Analysis
Authors: Hamd Rezaeifar, Hamid Reza Sahriari
Abstract:
Accidents are considered one of the challenges of modern life. Since the number of victims and the volume of internal transportation in Iran are increasing day by day, studying the effective factors of accidents and identifying suitable models and parameters for this issue are absolutely essential. The main purpose of this research has been to study the factors and spatial data affecting accidents in Mashhad during 2007-2008. In a first step, the existing accident databases were completed by overlaying spatial layers, relating them to the accident locations, adding accident landmarks, and adding special fields on the presence or absence of phenomena that influence accidents. In the next step, by means of data mining tools and neural network analysis, the relationships between these data were evaluated and a logical model was designed for predicting accident-prone spots with minimum error. The model of this article gives very accurate predictions for low-accident spots, yet it shows larger errors in accident-prone regions due to a lack of primary data.
Keywords: accident, data mining, neural network, GIS
Procedia PDF Downloads 47
24913 Methodology of the Turkey’s National Geographic Information System Integration Project
Authors: Buse A. Ataç, Doğan K. Cenan, Arda Çetinkaya, Naz D. Şahin, Köksal Sanlı, Zeynep Koç, Akın Kısa
Abstract:
With their spatial data reliability, interpretation and querying capabilities, Geographical Information Systems make significant contributions to scientists, planners and practitioners. Geographic information systems have received great attention in today's digital world, are growing rapidly, and are being used ever more efficiently. Access to and use of current and accurate geographical data, which are the most important components of a Geographical Information System, has become a necessity rather than a need for sustainable and economic development. This project aims to enable the sharing of data collected by public institutions and organizations on a web-based platform. Within the scope of the project, the INSPIRE (Infrastructure for Spatial Information in the European Community) data specifications are taken as a road map. In this context, Turkey's National Geographic Information System (TUCBS) Integration Project supports sharing spatial data among 61 pilot public institutions in compliance with defined national standards. In this paper, prepared by the project team members of the TUCBS Integration Project, the technical process is explained with a detailed methodology. The main technical processes of the project consist of Geographic Data Analysis, Geographic Data Harmonization (Standardization), Web Service Creation (WMS, WFS) and Metadata Creation and Publication. The integration process carried out so that the data produced by the 61 institutions can be shared through the National Geographic Data Portal (GEOPORTAL) is conveyed with a detailed methodology.
Keywords: data specification, geoportal, GIS, INSPIRE, Turkish National Geographic Information System, TUCBS, Turkey's national geographic information system
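To make the Web Service Creation step concrete, here is a hedged sketch of how a client could query a harmonized WMS service using standard OGC requests; the endpoint URL, layer name and bounding box are hypothetical placeholders, not actual TUCBS services:

```python
import requests

# Hypothetical endpoint standing in for an institution's harmonized web service.
WMS_URL = "https://example-institution.gov.tr/geoserver/wms"

def get_capabilities(url):
    """Ask a WMS endpoint to describe the layers it publishes (OGC GetCapabilities)."""
    params = {"service": "WMS", "version": "1.3.0", "request": "GetCapabilities"}
    return requests.get(url, params=params, timeout=30).text

def get_map(url, layer, bbox, width=800, height=600, crs="EPSG:4326"):
    """Fetch a rendered map image for one harmonized layer (OGC GetMap)."""
    params = {
        "service": "WMS", "version": "1.3.0", "request": "GetMap",
        "layers": layer, "styles": "",
        "crs": crs, "bbox": ",".join(str(v) for v in bbox),   # lat/lon order for WMS 1.3.0 + EPSG:4326
        "width": width, "height": height, "format": "image/png",
    }
    return requests.get(url, params=params, timeout=30).content

# Example: request a PNG of a (hypothetical) cadastre layer over an Ankara-area bounding box.
# png_bytes = get_map(WMS_URL, "tucbs:cadastre", bbox=(39.7, 32.5, 40.1, 33.1))
```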
Procedia PDF Downloads 144
24912 Secure Content Centric Network
Authors: Syed Umair Aziz, Muhammad Faheem, Sameer Hussain, Faraz Idris
Abstract:
A content centric network is a network based on the mechanism of sending and receiving data according to interest, with data requests directed to the specified node that has cached the data. In this network, security is bound to the content rather than to the host, making it host-independent and secure. Security is applied by taking the content's MAC (message authentication code) and encrypting it with the public key of the receiver. On the receiving end, the message is first verified; after verification, the message is saved and decrypted using the receiver's private key.
Keywords: content centric network, client-server, host security threats, message authentication code, named data network, network caching, peer-to-peer
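A minimal sketch of the described binding of security to content: compute the content's MAC, seal it with the receiver's RSA public key, and let the receiver recover and verify it with its private key. The MAC key handling (shown here as a pre-shared key) and all parameter choices are our assumptions, not details given in the abstract:

```python
from cryptography.hazmat.primitives import hashes, hmac
from cryptography.hazmat.primitives.asymmetric import rsa, padding

def make_tag(content: bytes, mac_key: bytes) -> bytes:
    """Compute the content's MAC (HMAC-SHA256 here, as one concrete choice)."""
    h = hmac.HMAC(mac_key, hashes.SHA256())
    h.update(content)
    return h.finalize()

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

# Receiver's key pair; the sender only needs the public half.
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_pub = receiver_key.public_key()

# --- Sender side: bind security to the content, not the host ---
mac_key = b"shared-mac-key-0123456789abcdef!"   # assumed pre-shared; not covered by the abstract
content = b"cached data object"
sealed_tag = receiver_pub.encrypt(make_tag(content, mac_key), oaep)

# --- Receiver side: recover the MAC with the private key and verify the content ---
recovered_tag = receiver_key.decrypt(sealed_tag, oaep)
verifier = hmac.HMAC(mac_key, hashes.SHA256())
verifier.update(content)
verifier.verify(recovered_tag)          # raises InvalidSignature if the content was tampered with
print("content verified")
```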
Procedia PDF Downloads 644
24911 A One Dimensional Cdᴵᴵ Coordination Polymer: Synthesis, Structure and Properties
Authors: Z. Derikvand, M. Dusek, V. Eigner
Abstract:
A one-dimensional coordination polymer of Cdᴵᴵ based on pyrazine (pz) and 3-nitrophthalic acid (3-nphaH₂), namely poly[[diaqua bis(3-nitro-2-carboxylato-1-carboxylic acid)(µ₂-pyrazine) cadmium(II)] dihydrate], {[Cd(3-nphaH)₂(pz)(H₂O)₂]·2H₂O}ₙ, was prepared and characterized. The asymmetric unit consists of one Cdᴵᴵ center, two (3-nphaH)⁻ anions, two halves of two crystallographically distinct pz ligands, two coordinated and two uncoordinated water molecules. The Cdᴵᴵ cation is surrounded by four oxygen atoms from two (3-nphaH)⁻ anions and two water molecules, as well as two nitrogen atoms from two pz ligands, in a distorted octahedral geometry. A complicated hydrogen-bonding network, accompanied by N–O···π and C–O···π stacking interactions, leads to the formation of a 3D supramolecular network. Commonly, this kind of C–O···π and N–O···π interaction is detected between the electron-rich CO/NO groups of the (3-nphaH)⁻ ligand and the electron-deficient π-system of pyrazine.
Keywords: supramolecular chemistry, Cd coordination polymer, crystal structure, 3-nitrophthalic acid
Procedia PDF Downloads 402
24910 Fuel Inventory/ Depletion Analysis for a Thorium-Uranium Dioxide (Th-U) O2 Pin Cell Benchmark Using Monte Carlo and Deterministic Codes with New Version VIII.0 of the Evaluated Nuclear Data File (ENDF/B) Nuclear Data Library
Authors: Jamal Al-Zain, O. El Hajjaji, T. El Bardouni
Abstract:
A (Th-U)O₂ fuel pin benchmark made up of 25 w/o U and 75 w/o Th was used in order to analyze the depletion and inventory of the fuel for a pressurized water reactor pin-cell model. The new version VIII.0 of the ENDF/B nuclear data library was used to create a data set in ACE format at various temperatures, and the data were processed with the MAKXSF6.2 and NJOY2016 programs in order to conduct this study and analyze the cross-section data. The infinite multiplication factor, the concentrations and activities of the main fission products, the actinide radionuclides accumulated in the pin cell, and the total radioactivity were estimated and compared using the Monte Carlo N-Particle 6 (MCNP6.2) and DRAGON5 codes. Additionally, the burn-up (BU) dependent behavior of the Pressurized Water Reactor (PWR) thorium pin cell was validated and compared with the reference data obtained using the Massachusetts Institute of Technology (MIT-MOCUP), Idaho National Engineering and Environmental Laboratory (INEEL-MOCUP), and CASMO-4 codes. The results of this study indicate that all of the codes examined are in good agreement.
Keywords: PWR thorium pin cell, ENDF/B-VIII.0, MAKXSF6.2, NJOY2016, MCNP6.2, DRAGON5, fuel burn-up
Procedia PDF Downloads 103
24909 Natural Language News Generation from Big Data
Authors: Bastian Haarmann, Likas Sikorski
Abstract:
In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The fully automatically generated stories closely resemble the style in which a human writer would draw up a news story. Topics may include soccer games, stock exchange market reports, weather forecasts and many more. The generation of the texts follows human language production, and each generated text is unique. Ready-to-publish stories written by a computer application can help humans quickly grasp the outcomes of big data analyses, save time-consuming pre-formulation work for journalists, and cater to rather small audiences by offering stories that would otherwise not exist.
Keywords: big data, natural language generation, publishing, robotic journalism
Procedia PDF Downloads 431
24908 Performance Evaluation of the Classic seq2seq Model versus a Proposed Semi-supervised Long Short-Term Memory Autoencoder for Time Series Data Forecasting
Authors: Aswathi Thrivikraman, S. Advaith
Abstract:
The study is aimed at designing encoders for deciphering intricacies in time series data by redescribing the dynamics operating on a lower-dimensional manifold. A semi-supervised LSTM autoencoder is devised and investigated to see whether the latent representation of the time series data can better forecast the data. End-to-end training of the LSTM autoencoder, together with another LSTM network connected to the latent space, forces the hidden states of the encoder to represent the most meaningful latent variables relevant for forecasting. Furthermore, the study compares the predictions with those of a traditional seq2seq model.
Keywords: LSTM, autoencoder, forecasting, seq2seq model
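A hedged PyTorch sketch of this kind of architecture, under our own assumptions about layer sizes and how the forecasting head attaches to the latent space (the abstract does not give these details): an encoder LSTM compresses a window into a latent vector that both a reconstruction decoder and a forecasting LSTM consume, and the two losses are trained jointly:

```python
import torch
import torch.nn as nn

class LSTMAutoencoderForecaster(nn.Module):
    """Encoder compresses a window into a latent vector; a decoder reconstructs the
    window while a second LSTM head forecasts future steps from the same latent state,
    so the latent variables are forced to be useful for forecasting."""

    def __init__(self, n_features=1, latent_dim=16, horizon=5):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, latent_dim, batch_first=True)
        self.decoder = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        self.recon_head = nn.Linear(latent_dim, n_features)
        self.forecaster = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        self.forecast_head = nn.Linear(latent_dim, n_features)

    def forward(self, x):                      # x: (batch, window, n_features)
        _, (h, _) = self.encoder(x)            # h: (1, batch, latent_dim)
        z = h[-1]                              # latent representation of the window
        rep = z.unsqueeze(1).repeat(1, x.size(1), 1)
        recon = self.recon_head(self.decoder(rep)[0])
        fut = z.unsqueeze(1).repeat(1, self.horizon, 1)
        forecast = self.forecast_head(self.forecaster(fut)[0])
        return recon, forecast

# Joint (semi-supervised) loss: reconstruction of the window + forecast error on the future.
model = LSTMAutoencoderForecaster()
x = torch.randn(8, 30, 1)                      # 8 windows of 30 steps
y_future = torch.randn(8, 5, 1)
recon, forecast = model(x)
loss = nn.functional.mse_loss(recon, x) + nn.functional.mse_loss(forecast, y_future)
loss.backward()
```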
Procedia PDF Downloads 156
24907 The Analysis of Emergency Shutdown Valves Torque Data in Terms of Its Use as a Health Indicator for System Prognostics
Authors: Ewa M. Laskowska, Jorn Vatn
Abstract:
Industry 4.0 focuses on the digital optimization of industrial processes. The idea is to use extracted data to build a decision support model enabling real-time decision making. In terms of predictive maintenance, the desired decision support tool would be a model enabling prognostics of the system's health based on the current condition of the equipment considered. Within the area of system prognostics and health management, a commonly used health indicator is the Remaining Useful Lifetime (RUL) of a system. Because the RUL is a random variable, it has to be estimated based on available health indicators. Health indicators can be of different types and come from different sources: process variables, equipment performance variables, data related to the number of experienced failures, etc. The aim of this study is the analysis of performance variables of emergency shutdown valves (ESV) used in the oil and gas industry. An ESV is inspected periodically, and at each inspection the torque and the time of valve operation are registered. The data will be analyzed by means of machine learning or statistical analysis. The purpose is to investigate whether the available data could be used as a health indicator for prognostic purposes. The second objective is to examine the most efficient way to incorporate the data into a predictive model: whether the data can be applied in the form of explanatory variables in a Markov process, or whether other stochastic processes would be more convenient for building an RUL model based on the information coming from the registered data.
Keywords: emergency shutdown valves, health indicator, prognostics, remaining useful lifetime, RUL
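As one simple illustration of turning the registered torque data into an RUL estimate (a deliberately naive sketch; the paper itself considers Markov and other stochastic processes), a linear degradation trend can be fitted to the inspection torques and extrapolated to an assumed failure threshold:

```python
import numpy as np

def estimate_rul(inspection_times, torques, failure_threshold):
    """Fit a linear degradation trend to valve operating torque recorded at periodic
    inspections and extrapolate the time at which it crosses a failure threshold.
    The linear trend and the threshold value are illustrative assumptions."""
    slope, intercept = np.polyfit(inspection_times, torques, deg=1)
    if slope <= 0:
        return np.inf                      # no degradation trend observed yet
    t_failure = (failure_threshold - intercept) / slope
    return max(0.0, t_failure - inspection_times[-1])

# Torque (Nm) registered at yearly ESV inspections, drifting upward with wear (made-up values).
times = np.array([0, 1, 2, 3, 4], dtype=float)          # years
torques = np.array([210, 214, 219, 225, 232], dtype=float)
print(round(estimate_rul(times, torques, failure_threshold=260.0), 1), "years")   # ~5.3
```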
Procedia PDF Downloads 91
24906 Assessment of the Possible Effects of Biological Control Agents of Lantana camara and Chromolaena odorata in Davao City, Mindanao, Philippines
Authors: Cristine P. Canlas, Crislene Mae L. Gever, Patricia Bea R. Rosialda, Ma. Nina Regina M. Quibod, Perry Archival C. Buenavente, Normandy M. Barbecho, Cynthia Adeline A. Layusa, Michael Day
Abstract:
Invasive plants have an impact on global biodiversity and ecosystem function, and their management is a complex and formidable task. Two of these invasive plant species, Lantana camara and Chromolaena odorata, are found in the Philippines. Lantana camara has the ability to suppress the growth of and outcompete neighboring plants. Chromolaena odorata causes serious agricultural and economic damage and creates fire hazards during the dry season. In addition, both species have been reported to poison livestock. One of the known global management strategies to control invasive plants is the introduction of biological control agents. These natural enemies reduce the population density and impacts of the invasive plants, helping restore the natural balance in invaded areas. Through secondary data sources, interviews, and field validation (e.g., microhabitat searches, sweep netting, opportunistic sampling, photo-documentation), we investigated whether the biocontrol agents previously released by the Philippine Coconut Authority (PCA) at their Davao Research Center to control these invasive plants are still present and affecting their respective host weeds. We confirm the presence of the biocontrol agent of L. camara, Uroplata girardi, which was introduced in 1985, and of Cecidochares connexa, a biocontrol agent of C. odorata released in 2003. Four other biocontrol agents were found to affect L. camara. Signs of damage (e.g., stem galls in C. odorata and leaf mines in L. camara) indicate that these biocontrol agents have successfully established outside their release site in Davao. Further investigating the extent of the spread of these biocontrol agents in the Philippines and their damage to the two weeds will contribute to the management of invasive plant species in the country.
Keywords: invasive alien species, biological control agent, entomology, worst weeds
Procedia PDF Downloads 374
24905 Block Mining: Block Chain Enabled Process Mining Database
Authors: James Newman
Abstract:
Process mining is an emerging technology that looks to serialize enterprise data as time series data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to perform process mining from standard relational databases. This paper is a first pass at outlining a database custom-built for a minimum viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network, and demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.
Keywords: blockchain, process mining, memory optimization, protocol
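The core idea, event-log records kept in hash-chained blocks so that later tampering is detectable, can be sketched in a few lines; this toy chain is our own illustration, not the Block Miner protocol itself:

```python
import hashlib, json, time

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 over the block contents (key order fixed via sort_keys)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(events, prev_hash):
    """Package a batch of process-mining events (case id, activity, timestamp)
    into a block chained to its predecessor by hash."""
    return {"timestamp": time.time(), "events": events, "prev_hash": prev_hash}

# Genesis block followed by two blocks of event-log records.
chain = [new_block([], prev_hash="0" * 64)]
chain.append(new_block(
    [{"case": "order-17", "activity": "create invoice", "ts": "2023-05-01T09:00"},
     {"case": "order-17", "activity": "approve invoice", "ts": "2023-05-01T10:30"}],
    prev_hash=block_hash(chain[-1])))
chain.append(new_block(
    [{"case": "order-18", "activity": "create invoice", "ts": "2023-05-02T08:15"}],
    prev_hash=block_hash(chain[-1])))

def verify(chain):
    """Each block's recorded prev_hash must match the recomputed hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

print(verify(chain))                                     # True
chain[1]["events"][0]["activity"] = "delete invoice"     # tamper with a stored event
print(verify(chain))                                     # False: the hash linkage is broken
```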
Procedia PDF Downloads 103
24904 Application of Hydrological Engineering Centre – River Analysis System (HEC-RAS) to Estuarine Hydraulics
Authors: Julia Zimmerman, Gaurav Savant
Abstract:
This study aims to evaluate the efficacy of applying the U.S. Army Corps of Engineers' River Analysis System (HEC-RAS) to modeling the hydraulics of estuaries. HEC-RAS has been broadly used for a variety of riverine applications; however, it has not been widely applied to the study of circulation in estuaries. This report details the development and validation of combined 1D/2D unsteady flow hydraulic models in HEC-RAS for estuaries and their associated tidally influenced rivers. Two estuaries, Galveston Bay and Delaware Bay, were used as case studies. Galveston Bay, a bar-built, vertically mixed estuary, was modeled for the 2005 calendar year. Delaware Bay, a drowned river valley estuary, was modeled from October 22, 2019, to November 5, 2019. Water surface elevation was used to validate both models by comparing simulation results to NOAA's Center for Operational Oceanographic Products and Services (CO-OPS) gauge data. Simulations were run using the Diffusion Wave Equations (DW), the Shallow Water Equations, Eulerian-Lagrangian Method (SWE-ELM), and the Shallow Water Equations, Eulerian Method (SWE-EM), and compared for both accuracy and computational resources required. In general, the Diffusion Wave Equations results were comparable to those of the two Shallow Water Equations sets while requiring less computational power. The 1D/2D combined approach was valid for study areas within the 2D flow area, with the 1D flow serving mainly as an inflow boundary condition. Within the Delaware Bay estuary, the HEC-RAS DW model ran in 22 minutes and had an average R² value of 0.94 within the 2-D mesh. The Galveston Bay HEC-RAS DW model ran in 6 hours and 47 minutes and had an average R² value of 0.83 within the 2-D mesh. The longer run time and lower R² for Galveston Bay can be attributed to the longer time frame modeled and the greater complexity of the estuarine system. The models did not accurately capture tidal effects within the 1D flow area.
Keywords: Delaware bay, estuarine hydraulics, Galveston bay, HEC-RAS, one-dimensional modeling, two-dimensional modeling
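The validation metric used above is straightforward to reproduce; a small sketch of computing R² between gauge observations and simulated water surface elevations (the numbers below are made up, not CO-OPS data):

```python
import numpy as np

def r_squared(observed, simulated):
    """Coefficient of determination between gauge observations and model output."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    ss_res = np.sum((observed - simulated) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hourly water surface elevations (m): a tide gauge vs. a model run (illustrative values).
gauge = [0.42, 0.55, 0.71, 0.88, 0.97, 0.91, 0.74, 0.52]
model = [0.40, 0.58, 0.69, 0.85, 1.00, 0.93, 0.70, 0.55]
print(round(r_squared(gauge, model), 3))
```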
Procedia PDF Downloads 199
24903 Drones, Rebels and Bombs: Explaining the Role of Private Security and Expertise in a Post-piratical Indian Ocean
Authors: Jessica Kate Simonds
Abstract:
The last successful hijacking perpetrated by Somali pirates in 2012 represented a critical turning point for the identity and brand of Indian Ocean (IO) insecurity, coined in this paper as the era of the post-piratical. This paper explores the broadening of the PMSC business model to account for, and contribute to, the design of a new IO security environment that prioritises foreign and insurgency drone activity and Houthi rebel operations as the main threats to merchant shipping in the post-2012 era. The study is situated within a longer history of analysing maritime insecurity and contributes a bespoke conceptual framework that understands the sea as a space produced and reproduced relative to existing and emerging threats to merchant shipping, based on bespoke models of information sharing and intelligence acquisition. The paper also makes a prominent empirical contribution by drawing on a post-positivist methodology, with data from original semi-structured interviews with senior maritime insurers and active merchant seafarers triangulated with industry-produced guidance such as the BMP series as primary data sources. Each data set is analysed through qualitative discourse and content analysis and supported by the quantitative data sets provided by the IMB Piracy Reporting Centre and intelligence networks. This analysis reveals that mechanisms such as the IGP&I Maritime Security Committee and the intelligence divisions of PMSCs have driven the exchange of knowledge between land and sea, and thus the reproduction of the maritime security environment, through new regulations and guidance that account for drones, rebels and bombs as the key challenges in the IO beyond piracy. A contribution of this paper is the argument that experts who may not be in the highest-profile jobs are the architects of maritime insecurity, based on their detailed knowledge and connections to vessels in transit. The paper shares the original insights of those who have served in critical decision-making spaces to demonstrate that the development and refinement of industry-produced deterrence guidance credited with the mitigation of piracy has shaped new editions such as BMP 5, which now serve to frame a security environment that prioritises the mitigation of risks from drones and water-borne improvised explosive devices from both state and insurgency risk groups. By highlighting the experiences and perspectives of key players both on land and at sea, the key finding of this paper is that as pirates experienced a financial boom by profiteering from their bespoke business model during the peak of successful hijackings, the private security market encountered a similar level of financial success and a guaranteed risk environment in which to prospect business. Thus, the reproduction of the Indian Ocean as a maritime security environment reflects a new-found purpose for PMSCs as part of the broader conglomerate of maritime insurers, regulators, shipowners and managers who continue to redirect the security consciousness and IO brand of insecurity.
Keywords: maritime security, private security, risk intelligence, political geography, international relations, political economy, maritime law, security studies
Procedia PDF Downloads 185
24902 Vulnerability of Groundwater to Pollution in Akwa Ibom State, Southern Nigeria, using the DRASTIC Model and Geographic Information System (GIS)
Authors: Aniedi A. Udo, Magnus U. Igboekwe, Rasaaq Bello, Francis D. Eyenaka, Michael C. Ohakwere-Eze
Abstract:
Groundwater vulnerability to pollution was assessed in Akwa Ibom State, Southern Nigeria, with the aim of locating areas with high potential for resource contamination, especially due to anthropogenic influence. The electrical resistivity method was utilized in the collection of the initial field data. Additional input data, which included depth to static water level, drilled well log data, aquifer recharge data, percentage slope, and soil information, were sourced from secondary sources. The initial field data were interpreted both manually and with computer modeling to provide information on the geoelectric properties of the subsurface. The interpreted results, together with the secondary data, were used to develop the DRASTIC thematic maps. A vulnerability assessment was performed using the DRASTIC model in a GIS environment, and areas with high vulnerability needing immediate attention were clearly mapped out and presented using an aquifer vulnerability map. The model was subjected to validation, and the rate of validity was 73% within the area of study.
Keywords: groundwater, vulnerability, DRASTIC model, pollution
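For context, the DRASTIC index combines seven rated parameters with fixed weights; the sketch below uses the standard weights from the original DRASTIC methodology, while the per-cell ratings are purely illustrative and not taken from this study:

```python
# Standard DRASTIC weights (Aller et al., 1987); ratings (1-10) per cell are illustrative.
WEIGHTS = {
    "D": 5,  # Depth to water
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography (slope)
    "I": 5,  # Impact of the vadose zone
    "C": 3,  # hydraulic Conductivity
}

def drastic_index(ratings: dict) -> int:
    """Vulnerability index = sum of (parameter rating x parameter weight)."""
    return sum(WEIGHTS[p] * ratings[p] for p in WEIGHTS)

# One raster cell with a shallow water table and permeable vadose zone (assumed ratings).
cell = {"D": 9, "R": 6, "A": 8, "S": 6, "T": 10, "I": 8, "C": 6}
print(drastic_index(cell))   # 173; higher values indicate higher vulnerability to pollution
```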
Procedia PDF Downloads 207
24901 A Review Paper on Data Security in Precision Agriculture Using Internet of Things
Authors: Tonderai Muchenje, Xolani Mkhwanazi
Abstract:
Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important for building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used on the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.
Keywords: precision agriculture, security, IoT, EIDE
Procedia PDF Downloads 90
24900 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model
Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou
Abstract:
The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data and ferocious competition requiring more accurate pricing. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to apply pricing to several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the largest split of the data to determine the model parameters, and the remaining part of the data was used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model; for long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we discuss the regulations in place in the insurance industry. Finally, we discuss the maintenance of the model and the fact that new data do not arrive constantly and that some metrics can take a long time to become meaningful.
Keywords: insurance, data science, modeling, monitoring, regulation, processes
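The evaluation metric mentioned above can be illustrated with a short sketch; this is one common way a (normalized) Gini coefficient is computed for pricing models, with made-up claim amounts, and is not taken from the paper:

```python
import numpy as np

def gini(actual, predicted):
    """Area-based Gini: how well the predictions rank the actual claim amounts."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    order = np.argsort(-predicted)                            # riskiest policies first
    cum_losses = np.append(0.0, np.cumsum(actual[order])) / actual.sum()
    cum_policies = np.arange(len(actual) + 1) / len(actual)
    return 2.0 * np.trapz(cum_losses, cum_policies) - 1.0

# Made-up claim amounts for six policies and a model's predicted risk premiums.
claims = [0, 500, 0, 2500, 100, 0]
pred = [50, 400, 30, 2000, 450, 20]

# Normalized Gini: the model's Gini relative to the 'perfect' model that sorts by actual claims.
print(round(gini(claims, pred) / gini(claims, claims), 3))    # ~0.94
```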
Procedia PDF Downloads 76
24899 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)
Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean
Abstract:
The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used as an indicator to determine the evaporation of lakes and reservoirs around the world because of the ease of interpreting its data. In this research, intelligent models were investigated for estimating pan evaporation on a daily basis. Shahroud and Mayamey, two cities located in Semnan province, Iran, were considered as the study areas. Both cities have dry weather conditions with high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models used in this study are the Artificial Neural Network (ANN), the Least Squares Support Vector Machine (LSSVM), and M5 tree models. The meteorological parameters minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA) and relative humidity (RH) were selected as input data, and pan evaporation (EP) was considered as the output. 70% of the data was used for training and 30% for testing. The models were evaluated using the coefficient of determination (R²), the Root Mean Square Error (RMSE) and the Mean Absolute Error (MAE). The results for the Shahroud and Mayamey stations showed that all three models perform reasonably well.
Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey
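A hedged sketch of the ANN part of such a workflow, using synthetic stand-in data rather than the actual station records and an assumed network size, but showing the same 70/30 split and the three reported metrics:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Synthetic stand-in for daily station records: [Tmin, Tmax, WS, SH, PA, RH] -> pan evaporation.
rng = np.random.default_rng(0)
X = rng.uniform([5, 20, 0, 6, 880, 10], [25, 45, 12, 14, 920, 60], size=(1000, 6))
y = (0.3 * X[:, 1] - 0.2 * X[:, 0] + 0.5 * X[:, 2] + 0.4 * X[:, 3] - 0.05 * X[:, 5]
     + rng.normal(0, 0.8, 1000))                       # made-up relationship plus noise

# 70/30 split, mirroring the training/testing proportions described in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0))
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)

print("R2  ", round(r2_score(y_te, pred), 3))
print("RMSE", round(float(np.sqrt(mean_squared_error(y_te, pred))), 3))
print("MAE ", round(mean_absolute_error(y_te, pred), 3))
```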
Procedia PDF Downloads 74
24898 Supply Chain Competitiveness with the Perspective of Service Performance Between Supply Chain Actors and Functions: A Theoretical Model
Authors: Umer Mukhtar
Abstract:
Supply chain competitiveness is the capability of a supply chain to deliver value to the customer for the sake of competitive advantage. Service performance and quality intervene significantly between supply chain actors, including functions inside the firm, in enabling the supply chain to achieve a competitive position in the market and gain competitive advantage. Supply chain competitiveness is a current issue of interest because supply chains, rather than individual firms, now compete for competitive advantage. A theoretical model is developed by extracting and integrating different theories to pursue further inquiry based on case studies and survey design. It is also intended to develop a scale of service performance for the functions of the focal firm, which is the revolving center for the whole supply chain.
Keywords: supply chain competitiveness, service performance in supply chain, service quality in supply chain, competitive advantage by supply chain, networks and supply chain, customer value, value supply chain, value chain
Procedia PDF Downloads 611
24897 Climate Impact-Minimizing Road Infrastructure Layout for Growing Cities
Authors: Stanislovas Buteliauskas, Aušrius Juozapavičius
Abstract:
City road transport contributes significantly to climate change, and ongoing world urbanization is only increasing the problem. The paper describes a city planning concept that minimizes the number of vehicles on the roads while increasing overall mobility. This becomes possible by utilizing a recently invented two-level road junction with the unique property of serving both as an intersection for uninterrupted traffic and as an easily accessible transport hub capable of accumulating private vehicles, thereby becoming an especially effective park-and-ride solution and a logistics or business center. Optimized layouts of city road infrastructure, living and work areas, and major roads are presented. The layouts are suitable both for the development of new cities and for the expansion of existing ones. The costs of the infrastructure and its positive impact on climate are evaluated in comparison to current city growth patterns.
Keywords: congestion, city infrastructure, park-and-ride, road junctions
Procedia PDF Downloads 305
24896 Optimal Site Selection for Temporary Housing regarding Disaster Management Case Study: Tehran Municipality (No.6)
Authors: Ghazaleh Monazami Tehrani, Zhamak Monazami Tehrani, Raziyeh Hadavand
Abstract:
Optimal site selection for temporary housing is one of the most important issues in crisis management. In this research, District 6 of Tehran, which has a high frequency and wide geographical distribution of earthquakes, was selected as a case study for positioning temporary housing after a probable earthquake. To achieve this goal, the study identifies and evaluates the distribution of candidate locations according to standards such as compatible and incompatible urban land uses, using GIS and AHP. The results show that the most susceptible parts of this region are in the center. According to the maps, the north-eastern part of Kordestan and the Shaheed Gomnam intersection possess the highest pixel values in terms of areal extent; therefore, these places are recommended as optimum site locations for the construction of an emergency evacuation base.
Keywords: optimal site selection, temporary housing, crisis management, AHP, GIS
Procedia PDF Downloads 257
24895 Vibration Frequencies Analysis of Nanoporous Graphene Membrane
Authors: Haw-Long Lee, Win-Jin Chang, Yu-Ching Yang
Abstract:
In this study, we use the atomic-scale finite element method to investigate the vibrational behavior of armchair- and zigzag-structured nanoporous graphene layers of different sizes under SFSF and CFFF boundary conditions. The fundamental frequencies computed for the graphene layers without a pore are compared with the results of previous studies, and we observe very good correspondence between our results and those of the other studies in all the considered cases. For the armchair- and zigzag-structured nanoporous graphene layers under SFSF and CFFF boundary conditions, the frequencies decrease as the size of the nanopore increases. When the position of the pore is symmetric with respect to the center of the graphene, the frequency of the zigzag-pore graphene is higher than that of the armchair one.
Keywords: atomic-scale finite element method, graphene, nanoporous, natural frequency
Procedia PDF Downloads 361
24894 Renovation of Industrial Zones in Ho Chi Minh City: An Approach from Changing Function of Processing to Urban Warehousing
Authors: Thu Le Thi Bao
Abstract:
Industrial parks play an active role in promoting economic development, but they are also a source of boarding houses and slums in adjacent areas that lack infrastructure, causing many social problems. The context of the recent pandemic and of climate change on a global scale poses issues that need to be resolved for sustainable development. Ho Chi Minh City aims to develop housing for migrant workers to stabilize human resources and, at the same time, to solve the social problems caused by poor living conditions. The paper focuses on renovating existing industrial parks and worker accommodation in Ho Chi Minh City in order to propose appropriate models, contributing to the goal of urban embellishment and to solutions that allow industrial parks to adapt to abnormal conditions such as pandemics, climate change and crises.
Keywords: industrial park, social housing, accommodation, distribution center
Procedia PDF Downloads 113
24893 Generating Insights from Data Using a Hybrid Approach
Authors: Allmin Susaiyah, Aki Härmä, Milan Petković
Abstract:
Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalising to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNLs into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.
Keywords: data mining, insight mining, natural language generation, pre-trained language models
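To make the rule-based first stage concrete, here is a toy sketch (our own construction, not the paper's component) of mining a statistically notable change in a metric and emitting it in a CNL-like template that a downstream neural model could then verbalize:

```python
import numpy as np

def mine_insight(metric_name, history, recent, z_threshold=2.0):
    """Rule-based data-to-CNL step (an illustrative sketch): flag a statistically
    notable shift in a metric and express it in a controlled natural language (CNL)
    template that a neural CNL2NL model could later rephrase as fluent text."""
    history, recent = np.asarray(history, float), np.asarray(recent, float)
    z = (recent.mean() - history.mean()) / (history.std(ddof=1) / np.sqrt(len(recent)))
    if abs(z) < z_threshold:
        return None                                   # nothing significant to report
    direction = "higher" if z > 0 else "lower"
    return (f"INSIGHT(metric={metric_name}, period=recent, "
            f"comparison=historical_mean, direction={direction}, zscore={z:.1f})")

# Daily step counts: the recent week is clearly above the historical baseline (made-up data).
baseline = [6100, 5900, 6300, 6000, 6200, 5800, 6100, 6050, 5950, 6150]
last_week = [7400, 7600, 7300, 7800, 7500, 7700, 7450]
print(mine_insight("steps", baseline, last_week))
```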
Procedia PDF Downloads 120