Search results for: cloud operation system
Parameters of Main Stage of Discharge between Artificial Charged Aerosol Cloud and Ground in Presence of Model Hydrometeor Arrays
Authors: D. S. Zhuravkova, A. G. Temnikov, O. S. Belova, L. L. Chernensky, T. K. Gerastenok, I. Y. Kalugina, N. Y. Lysov, A.V. Orlov
Abstract:
Investigation of discharges from artificial charged water aerosol clouds in the presence of arrays of model hydrometeors can provide new data on the peculiarities of return stroke formation between a thundercloud and the ground when large volumes of hail particles participate in the initiation and propagation of the lightning discharge. Artificial charged water aerosol clouds of negative or positive polarity, with potentials of up to one million volts, were used. Hail was simulated by groups of conductive model hydrometeors of different forms. The parameters of the impulse current of the main stage of the discharge between the artificial positively or negatively charged water aerosol cloud and the ground in the presence of a model hydrometeor array, and of the corresponding electromagnetic radiation, were determined. It was established that the parameters of the model hydrometeor array influence the parameters of the main stage of the discharge between the artificial thundercloud cell and the ground. The maximal values of the main stage current impulse parameters and of the electromagnetic radiation registered by the plate antennas were found for the array of model hydrometeors of cylinder-of-revolution form for the negatively charged aerosol cloud, and for the array of hydrometeors of plate-rhombus form for the positively charged aerosol cloud. It was also found that the parameters of the main stage of the discharge in the presence of a model hydrometeor array of the considered forms depend on the polarity of the artificial charged aerosol cloud. On average, for all forms of the investigated model hydrometeor arrays, the amplitude and rise rate of the main stage impulse current and the amplitude of the corresponding electromagnetic radiation were 1.1-1.9 times higher for the positively charged aerosol cloud than for the negatively charged one. These results suggest that large volumes of big hail arrays in a thundercloud may play a more important role in the return stroke parameters of positive lightning.
Keywords: main stage of discharge, hydrometeor form, lightning parameters, negative and positive artificial charged aerosol cloud
3D Design of Orthotic Braces and Casts in Medical Applications Using Microsoft Kinect Sensor
Authors: Sanjana S. Mallya, Roshan Arvind Sivakumar
Abstract:
Orthotics is the branch of medicine that deals with the provision and use of artificial casts or braces to alter the biomechanical structure of a limb and provide support for it. Custom-made orthoses provide more comfort and correct issues better than those available over the counter. However, they are expensive and require intricate modelling of the limb. Traditional methods of modelling involve creating a plaster-of-Paris mould of the limb. Lately, CAD/CAM and 3D printing processes have improved accuracy and reduced production time. Ordinarily, digital cameras are used to capture the features of the limb from different views to create a 3D model. We propose a system that models the limb using the Microsoft Kinect v2 sensor. The Kinect can capture RGB and depth frames simultaneously at up to 30 fps with sufficient accuracy. The region of interest is captured from three views, each shifted by 90 degrees. The RGB and depth data are fused into a single RGB-D frame. The resolution of the RGB frame is 1920 x 1080 px, while the resolution of the depth frame is 512 x 424 px. As the resolutions of the frames are not equal, RGB pixels are mapped onto the depth pixels so that data is not lost even though the depth resolution is lower. The resulting RGB-D frames are collected, and from the depth coordinates a three-dimensional point cloud is generated for each view of the Kinect sensor. A common reference system was developed to merge the individual point clouds from the Kinect sensors. The reference system consisted of eight coloured cubes connected by rods to form a skeleton cube with the coloured cubes at the corners. For each Kinect, the region of interest is the square formed by the centres of the four cubes facing the Kinect. The point clouds are merged by considering one of the cubes as the origin of a reference system; depending on the relative distance from each cube, the three-dimensional coordinate points from each point cloud are aligned to the reference frame to give a complete point cloud. The RGB data is used to correct any errors in the depth data of the point cloud. A triangular mesh is generated from the point cloud by applying Delaunay triangulation, which produces a rough approximation of the limb's surface. The mesh is then smoothed to obtain a smooth outer layer and an accurate model of the limb. This model is used as the base for designing the custom orthotic brace or cast: it is transferred to a CAD/CAM design file, and the brace is designed above the surface of the limb. The proposed system would be more cost-effective than current systems that use MRI or CT scans for generating 3D models, would be quicker than traditional plaster-of-Paris cast modelling, and has a low overall setup time. Preliminary results indicate that the accuracy of the Kinect v2 is satisfactory for modelling.
Keywords: 3D scanning, mesh generation, Microsoft Kinect, orthotics, registration
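A minimal Python sketch of the pipeline the abstract describes (depth back-projection, merging views into a common frame, Delaunay meshing), assuming typical rather than calibrated Kinect v2 intrinsics; the alignment transform below is an identity stand-in for the cube-derived transforms:

```python
import numpy as np
from scipy.spatial import Delaunay

def depth_to_points(depth, fx=365.5, fy=365.5, cx=256.0, cy=212.0):
    """Back-project a 512 x 424 Kinect v2 depth frame (mm) into 3D points.
    Intrinsics are typical Kinect v2 values, not calibrated ones."""
    v, u = np.indices(depth.shape)
    z = depth / 1000.0                       # mm -> m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                # drop invalid zero-depth pixels

def merge_views(point_clouds, transforms):
    """Align each view into the common reference frame via a 4x4 transform
    (in the system these come from the coloured-cube skeleton)."""
    merged = []
    for pts, T in zip(point_clouds, transforms):
        homo = np.c_[pts, np.ones(len(pts))]
        merged.append((homo @ T.T)[:, :3])
    return np.vstack(merged)

depth = np.random.randint(500, 1500, (424, 512)).astype(float)  # fake frame
cloud = merge_views([depth_to_points(depth)], [np.eye(4)])[::64]
mesh = Delaunay(cloud[:, :2])               # rough surface, smoothed afterwards
print(mesh.simplices.shape)                 # (n_triangles, 3) vertex indices
```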
A Temporal QoS Ontology for ERTMS/ETCS
Authors: Marc Sango, Olimpia Hoinaru, Christophe Gransart, Laurence Duchien
Abstract:
Ontologies offer a means for representing and sharing information in many domains, particularly complex ones. For example, they can be used to represent and share the information of the System Requirement Specification (SRS) of a complex system, such as the SRS of ERTMS/ETCS, which is written in natural language. Since this is a real-time, critical system, generic ontology languages such as OWL and generic ERTMS ontologies provide minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one challenge is to enable the representation of dynamic features evolving in time within a generic ontology with minimal redesign of it. Separating temporal information from other information can help to predict system runtime operation and to design and implement it properly. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. Indeed, a user operation, such as adding a new constraint to existing planning constraints, can cause temporal inconsistencies, which can lead to system failures. To address this challenge, we propose a lightweight three-layer temporal Quality of Service (QoS) ontology for representing, reasoning over, and querying temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between non-QoS and QoS entities in the ontology. The upper generic layer of the proposed ontology provides an intuitive knowledge of domain components, especially ERTMS/ETCS components. The separation of the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation. To evaluate our approach, an example of the proposed domain ontology for the handover operation, as well as a reasoning rule over temporal relations in this domain-specific ontology, are given.
Keywords: system requirement specification, ERTMS/ETCS, temporal ontologies, domain ontologies
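A minimal sketch of the kind of temporal-inconsistency check the abstract motivates, using rdflib with a hypothetical namespace; simple interval overlap stands in for the paper's reasoning rules over temporal relations:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/ertms-qos#")   # hypothetical namespace
g = Graph()

# Two planning constraints on the same handover operation, each an interval.
for name, start, end in [("c1", 10, 20), ("c2", 15, 25)]:
    c = EX[name]
    g.add((c, RDF.type, EX.TemporalConstraint))
    g.add((c, EX.hasStart, Literal(start, datatype=XSD.integer)))
    g.add((c, EX.hasEnd, Literal(end, datatype=XSD.integer)))

# Flag pairs of constraints whose intervals overlap: a candidate temporal
# inconsistency to catch before it propagates into a system failure.
q = """
PREFIX ex: <http://example.org/ertms-qos#>
SELECT ?a ?b WHERE {
  ?a ex:hasStart ?as ; ex:hasEnd ?ae .
  ?b ex:hasStart ?bs ; ex:hasEnd ?be .
  FILTER (?a != ?b && ?as < ?be && ?bs < ?ae && STR(?a) < STR(?b))
}
"""
for row in g.query(q):
    print("overlapping constraints:", row.a, row.b)
```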
Multimodal Biometric Cryptography Based Authentication in Cloud Environment to Enhance Information Security
Authors: D. Pugazhenthi, B. Sree Vidya
Abstract:
Cloud computing is one of the emerging technologies that enables end users to consume cloud services on a 'pay per usage' basis. The technology is growing fast, and so are its security threats. Among the various services provided by the cloud is storage, where security is vital both for authenticating legitimate users and for protecting information. This paper presents efficient ways of authenticating users and securing information on the cloud. The first phase proposed in this paper deals with an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. User-behaviour-based biometrics offer unique identification and are only slowly intrusive, providing greater reliability than conventional password authentication. With biometric systems, accounts are accessed only by a legitimate user and not by an impostor. The biometric templates employed here include not a single trait but multiple ones, namely iris and fingerprints. The coordinating stage of the authentication system is an ensemble Support Vector Machine (SVM), optimized by assembling the weights of the base SVMs after each individual SVM of the ensemble has been trained by the Artificial Fish Swarm Algorithm (AFSA). This helps generate a user-specific secure cryptographic key from the multimodal biometric template via a fusion process. The data security problem is averted, and an enhanced security architecture is proposed, using an encryption and decryption system with double-key cryptography based on a Fuzzy Neural Network (FNN) for data storage and retrieval in cloud computing. The proposed scheme aims to protect records from hackers by preventing the cipher text from being broken back into the original text. The proposed double cryptographic key scheme thus provides better user authentication and better security, distinguishing genuine from fake users. There are three main modules in this work: 1) feature extraction, 2) multimodal biometric template generation, and 3) cryptographic key generation. The feature and texture properties are first extracted from the respective fingerprint and iris images. Finally, with the help of the fuzzy neural network and a symmetric cryptography algorithm, the double-key encryption technique is developed. As the proposed approach is based on neural networks, it has the advantage that the data cannot be decrypted by a hacker even if it has already been intercepted. The results show that the authentication process is optimal and the stored information is secured.
Keywords: artificial fish swarm algorithm (AFSA), biometric authentication, decryption, encryption, fingerprint, fusion, fuzzy neural network (FNN), iris, multi-modal, support vector machine classification
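A sketch of the weighted SVM-ensemble fusion stage in Python with scikit-learn; synthetic vectors stand in for the fused iris and fingerprint features, and a simple random search stands in for the AFSA weight optimization:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Stand-ins for fused iris + fingerprint feature vectors.
X, y = make_classification(n_samples=600, n_features=40, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

base = [SVC(kernel=k, probability=True).fit(Xtr, ytr)
        for k in ("linear", "rbf", "poly")]

def ensemble_acc(weights):
    """Accuracy of the probability-weighted vote over the base SVMs."""
    proba = sum(w * m.predict_proba(Xte) for w, m in zip(weights, base))
    return float(np.mean(proba.argmax(axis=1) == yte))

# Random search stands in for the Artificial Fish Swarm Algorithm that
# the paper uses to assemble the base-SVM weights.
rng = np.random.default_rng(0)
best_w, best_acc = None, -1.0
for _ in range(200):
    w = rng.random(3)
    w /= w.sum()
    acc = ensemble_acc(w)
    if acc > best_acc:
        best_w, best_acc = w, acc
print(best_w, best_acc)
```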
Information Requirements for Vessel Traffic Service Operations
Authors: Fan Li, Chun-Hsien Chen, Li Pheng Khoo
Abstract:
Operators of a vessel traffic service (VTS) center provide three types of services to vessels: information service, navigational assistance, and traffic organization. To provide these services, operators monitor vessel traffic through a computer interface and provide navigational advice based on information integrated from multiple sources, including the automatic identification system (AIS), the radar system, and the closed-circuit television (CCTV) system. This information is therefore crucial to VTS operation. However, what information the VTS operator actually needs to offer services efficiently and properly is unclear. The aim of this study is to investigate the information requirements for VTS operation; to achieve this aim, field observation was carried out to elicit them. The study revealed that the most frequent and important tasks were handling arrival vessel reports, potential conflict control, and abeam vessel reports. Current location and vessel name were used in all tasks. Hazardous cargo information was particularly required when operators handled arrival vessel reports. The speed, course, and distance of two or more vessels were used only in potential conflict control. The information requirements identified in this study can be utilized in designing a human-computer interface that takes into consideration what information should be displayed and when, and might further be used to build the foundation of a decision support system for VTS.
Keywords: vessel traffic service, information requirements, hierarchy task analysis, field observation
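The speed, course, and distance information used in potential conflict control is commonly operationalized as a closest-point-of-approach (CPA) computation; a minimal Python sketch, with illustrative alerting thresholds not taken from the study:

```python
import math

def cpa(p1, v1, p2, v2):
    """Closest point of approach between two vessels.
    p: (x, y) position in nm; v: (vx, vy) velocity in kn.
    Returns (distance at CPA in nm, time to CPA in hours)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dv2 = dvx**2 + dvy**2
    t = 0.0 if dv2 == 0 else max(0.0, -(dx*dvx + dy*dvy) / dv2)
    d = math.hypot(dx + dvx*t, dy + dvy*t)
    return d, t

d, t = cpa((0, 0), (10, 0), (5, -3), (0, 8))
print(f"CPA {d:.2f} nm in {t*60:.0f} min")
if d < 1.0 and t < 0.5:            # illustrative alerting thresholds
    print("potential conflict: alert the operator")
```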
Modeling and Power Control of DFIG Used in Wind Energy System
Authors: Nadia Ben Si Ali, Nadia Benalia, Nora Zerzouri
Abstract:
Wind energy generation has attracted great interest in recent years. Doubly fed induction generators (DFIGs) are widely deployed in wind turbines because variable-speed turbines have many advantages over fixed-speed generation, such as increased energy capture, operation at the maximum power point, improved efficiency, and better power quality. This paper presents the operation and vector control of a DFIG system in which the stator is connected directly to a stiff grid and the rotor is connected to the grid through a bidirectional back-to-back AC-DC-AC converter. The basic operational characteristics, the mathematical model of the aerodynamic system, and the vector control technique used to obtain decoupled control of the powers are investigated using Matlab/Simulink.
Keywords: wind turbine, doubly fed induction generator, wind speed controller, power system stability
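For reference, the decoupling that the vector control exploits can be written in its textbook form under stator-flux orientation; a hedged sketch, since sign conventions and scaling factors vary between sources:

```latex
% Decoupled stator powers of a DFIG under stator-flux orientation
% (\psi_{qs} = 0, stator resistance neglected, so v_s \approx \omega_s \psi_s):
P_s \approx -\frac{3}{2}\,\frac{L_m}{L_s}\, v_s\, i_{qr}
\qquad
Q_s \approx \frac{3}{2}\,\frac{v_s}{L_s}\left(\frac{v_s}{\omega_s} - L_m\, i_{dr}\right)
% Active power is steered by the q-axis rotor current and reactive power
% by the d-axis rotor current, which is what permits decoupled P/Q control.
```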
Reactive Power Control with Plug-In Electric Vehicles
Authors: Mostafa Dastori, Sirus Mohammadi
Abstract:
While plug-in electric vehicles (PEVs) potentially have the capability to fulfill the energy storage needs of the electric grid, the battery degradation this operation causes makes it less attractive to auto manufacturers and consumers. On the other hand, the on-board chargers can also supply energy-storage-system applications such as reactive power compensation, voltage regulation, and power factor correction without engaging the battery with the grid, thereby preserving its lifetime. This paper presents the design motives of single-phase on-board chargers in detail and classifies the chargers based on their future vehicle-to-grid usage. The pros and cons of each ac-dc topology are discussed to shed light on their suitability for reactive power support. The paper also presents and analyzes the differences between charging-only operation and capacitive reactive power operation, which results in increased demand on the dc-link capacitor (more charge/discharge cycles and increased second-harmonic ripple current). Moreover, the battery state of charge is spared from losses during reactive power operation, but the converter output power must be limited below its rated power to keep the same stress on the dc-link capacitor.
Keywords: energy storage system, battery unit, cost, optimal sizing, plug-in electric vehicles (PEVs), smart grid
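The dc-link stress during reactive operation follows from the standard single-phase power decomposition; a worked sketch (the ripple expression is a common approximation, not a result from the paper):

```latex
% Instantaneous power at the single-phase ac port,
% with v(t) = \hat{V}\cos(\omega t) and i(t) = \hat{I}\cos(\omega t - \varphi):
p(t) = \underbrace{\tfrac{1}{2}\hat{V}\hat{I}\cos\varphi}_{\text{average power}}
     + \underbrace{\tfrac{1}{2}\hat{V}\hat{I}\cos(2\omega t - \varphi)}_{\text{second-harmonic ripple}}
% In pure capacitive operation (\varphi = -90^\circ) the average term vanishes,
% but the 2\omega ripple of amplitude S = \tfrac{1}{2}\hat{V}\hat{I} remains and
% must be buffered by the dc-link capacitor, giving a voltage swing of roughly
\Delta V_{dc} \approx \frac{S}{\omega\, C_{dc}\, V_{dc}}
```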
A Knowledge-As-A-Service Support Framework for Ambient Learning in Kenya
Authors: Lucy W. Mburu, Richard Karanja, Simon N. Mwendia
Abstract:
Over recent years, learners have experienced a constant need to access knowledge on demand, fully in line with the paradigm of cloud computing. Motivated by the global sustainable development goal of ensuring inclusive and equitable learning opportunities, this research has developed a framework hinged on the knowledge-as-a-service architecture that utilizes knowledge from ambient learning systems. Through statistical analysis and decision tree modeling, the study discovers the variables that influence ambient learning among university students. The main aim is to generate a platform for disseminating and exploiting the available knowledge to aid the learning process and thus improve educational support in the ambient learning system. The research further explores how collaborative effort can be used to form a knowledge network that allows access to heterogeneous sources of knowledge, benefiting knowledge consumers such as the developers of ambient learning systems.
Keywords: actionable knowledge, ambient learning, cloud computing, decision trees, knowledge as a service
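A minimal sketch of the decision-tree step in Python with scikit-learn; the feature names and synthetic data are hypothetical stand-ins for the study's survey variables:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
names = ["device_access", "connectivity", "course_engagement"]  # hypothetical
X = rng.random((200, 3))
y = (0.5*X[:, 0] + 0.3*X[:, 1] + 0.2*X[:, 2]
     + 0.1*rng.standard_normal(200) > 0.55).astype(int)  # toy uptake label

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=names))  # splits reveal influential variables
```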
Prediction of Positive Cloud-to-Ground Lightning Striking Zones for Charged Thundercloud Based on Line Charge Model
Authors: Surajit Das Barman, Rakibuzzaman Shah, Apurv Kumar
Abstract:
Bushfire is known as one of the main factors creating pyrocumulus thunderclouds, which cause the ignition of new fires through pyrocumulonimbus (pyroCb) lightning strikes and create major losses of lives and property worldwide. A conceptual, model-based risk planning approach would be beneficial for predicting the lightning striking zones on the surface of the earth underneath a pyroCb thundercloud. A pyroCb thundercloud can generate both positive cloud-to-ground (+CG) and negative cloud-to-ground (-CG) lightning, of which +CG tends to ignite more bushfires and cause massive damage to nature and infrastructure. In this paper, a simple line-charge-structured thundercloud model is constructed in 2-D coordinates using the method of image charges to predict the probable +CG lightning striking zones on the earth's surface for two conceptual thundercloud charge configurations: a tilted dipole and a conventional tripole structure with an excessive lower positive charge region, both of which lead to +CG lightning. The electric potential and surface charge density along the earth's surface are investigated for both structures by continuously adjusting the position and charge density of their charge regions. Simulation results for the tilted dipole structure confirm the down-shear extension of the upper positive charge region in the direction of the cloud's forward flank by 4 to 8 km, resulting in negative surface charge density; +CG lightning would be expected to strike within 7.8 km to 20 km along the earth's surface in the direction of the cloud's forward flank. On the other hand, the conceptual tripole charge structure with an enhanced lower positive charge region develops negative surface charge density on the earth's surface in the range |x| < 6.5 km beneath the thundercloud and highly favors +CG lightning strikes.
Keywords: pyrocumulonimbus, cloud-to-ground lightning, charge structure, surface charge density, forward flank
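A numerical sketch of the image-charge calculation in Python; point charges approximate the paper's line-charge regions, and the magnitudes and heights below are placeholders, not the paper's parameters:

```python
import numpy as np

def surface_charge_density(x, charges):
    """Induced charge density on the grounded plane z = 0 below point
    charges. Classic image-charge result for each charge (q, x0, h):
    sigma = -q*h / (2*pi*((x - x0)^2 + h^2)^(3/2)), summed over charges."""
    sigma = np.zeros_like(x, dtype=float)
    for q, x0, h in charges:
        sigma += -q * h / (2*np.pi*((x - x0)**2 + h**2)**1.5)
    return sigma

# Illustrative tripole: upper positive at 10 km, main negative at 6 km,
# enhanced lower positive at 2 km (charges in C, placeholder magnitudes).
tripole = [(+40.0, 0.0, 10e3), (-40.0, 0.0, 6e3), (+15.0, 0.0, 2e3)]
x = np.linspace(-20e3, 20e3, 401)
sigma = surface_charge_density(x, tripole)
print("sigma at x = 0: %.2e C/m^2 (negative favours +CG)" % sigma[200])
```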
Fire Safety Engineering of Wood Dust Layer or Cloud
Authors: Marzena Półka, Bożena Kukfisz
Abstract:
This paper presents an analysis of dust explosion hazards in the process industries. It covers selected test methods for dust explosibility and presents two of them, following the experimental standards used by the Department of Combustion and Fire Theory at The Main School of Fire Service in Warsaw. The article gives values of the maximum acceptable surface temperature (MAST) of machines operating in the presence of a dust cloud and of dust layers with thicknesses of 5 and 12.5 mm. The comparative analysis leads to the conclusion that the minimum ignition temperature of the layer (MITL) and the minimum ignition temperature of the dust cloud (MTCD) depend on the granularity of the substance. Increasing the thickness of the dust layer reduces its minimum ignition temperature; at the same time, it extends flameless combustion and delays ignition.
Keywords: fire safety engineering, industrial hazards, minimum ignition temperature, wood dust
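MAST values in such analyses are commonly derived from the cloud and layer ignition temperatures by a rule of the kind sketched below (an EN 50281 style criterion); this is an assumption about the applicable rule, so the relevant standard should be checked before use:

```python
def mast(mit_cloud_c, mit_layer_5mm_c):
    """Maximum acceptable surface temperature of equipment exposed to a
    dust cloud and a 5 mm dust layer: the lower of two-thirds of the
    cloud MIT and the 5 mm layer MIT minus 75 K (commonly cited rule)."""
    return min(2 / 3 * mit_cloud_c, mit_layer_5mm_c - 75)

# Illustrative wood-dust values, not measurements from the paper:
print(mast(mit_cloud_c=420, mit_layer_5mm_c=310), "degrees C")  # -> 235
```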
Effective Nutrition Label Use on Smartphones
Authors: Vladimir Kulyukin, Tanwir Zaman, Sarat Kiran Andhavarapu
Abstract:
Research on nutrition label use identifies four factors that impede comprehension and retention of nutrition information by consumers: the label's location on the package, the presentation of information within the label, the label's surface size, and surrounding visual clutter. In this paper, a system is presented that makes nutrition label use more effective for nutrition information comprehension and retention. The system's front end is a smartphone application; its back end is a four-node Linux cluster for image recognition and data storage. Image frames captured on the smartphone are sent to the back end for recognition of skewed or aligned barcodes. When barcodes are recognized, the corresponding nutrition labels are retrieved from a cloud database and presented to the user on the smartphone's touchscreen. Each displayed nutrition label is positioned centrally on the touchscreen with no surrounding visual clutter. Wikipedia links to important nutrition terms are embedded to improve comprehension and retention of nutrition information. Standard touch gestures (e.g., zoom in/out) available on mainstream smartphones are used to manipulate the label's surface size. The nutrition label database currently includes 200,000 nutrition labels compiled from public web sites by a custom crawler. Stress-test experiments with the node cluster are presented, and implications for proactive nutrition management and food policy are discussed.
Keywords: mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning
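A minimal sketch of the barcode-to-label lookup the back end performs, written as a Flask endpoint with a hypothetical route and record format; the paper's actual back end is the four-node cluster, not Flask:

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the cloud database of 200,000 crawled nutrition labels.
LABELS = {"012345678905": {"name": "Oat cereal", "calories": 120,
                           "sodium_mg": 140}}

@app.route("/label/<upc>")
def get_label(upc):
    """Return the nutrition label for a barcode recognized on the phone."""
    label = LABELS.get(upc)
    if label is None:
        abort(404)
    return jsonify(label)

if __name__ == "__main__":
    app.run()
```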
Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses
Abstract:
This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data. It thus meets the aspiration for a computational encryption model that can enhance the security of big data with respect to the privacy, confidentiality, and availability of users' data. The cryptographic model applied for computing over the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of the high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, together with the detailed theoretic mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of a cryptographic security algorithm for big data analytics.
Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme
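To make the idea of computing on ciphertexts concrete, here is a toy Paillier demo in Python; note that Paillier is only additively homomorphic, whereas the paper targets fully homomorphic schemes with bootstrapping, so this illustrates the principle rather than the scheme itself:

```python
import math
import random

def keygen(p=293, q=433):
    """Toy Paillier keypair with tiny primes: illustration only."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def enc(pk, m):
    n, g = pk
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def dec(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

pk, sk = keygen()
c1, c2 = enc(pk, 20), enc(pk, 22)
# Multiplying ciphertexts adds the plaintexts: the sum is computed encrypted.
print(dec(pk, sk, (c1 * c2) % (pk[0] ** 2)))   # -> 42
```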
Management Software for the Elaboration of an Electronic File in the Pharmaceutical Industry Following Mexican Regulations
Authors: M. Peña Aguilar Juan, Ríos Hernández Ezequiel, R. Valencia Luis
Abstract:
Certification of certain goods of public interest, such as medicines and food, requires the preparation and delivery of a dossier. Its elaboration demands legal and administrative knowledge, organization of the documents of the process, and an ordering that allows verification of the file. Therefore, a virtual platform was developed to support the process of managing and elaborating the dossier, providing accessibility to the information and interfaces that let the user know the status of projects. Developing the dossier system in the cloud allows the inclusion of the technical requirements for software management, including validation and manufacturing in the field industry. The platform guides and facilitates the elaboration of the dossier (report, file, or history) in accordance with Mexican legislation and regulations, and it also offers auxiliary tools for its management. This technological alternative supports document organization and provides access to the information required for the successful development of a dossier. The platform is divided into the following modules: system control, catalog, dossier, and enterprise management. The modules are designed according to the structure required in a dossier in those areas; however, the structure allows for flexibility, as the goal is a tool that facilitates rather than obstructs processes. The architecture and development of the software allow flexibility for future expansion to other fields, which would imply feeding the system with new regulations.
Keywords: electronic dossier, cloud management software, pharmaceutical industry, sanitary registration
IT-Based Global Healthcare Delivery System: An Alternative Global Healthcare Delivery System
Authors: Arvind Aggarwal
Abstract:
We have developed a comprehensive, information-technology-based global healthcare delivery system. It has a medical consultation component, in which a virtual consultant can give medical consultations to patients and doctors at a digital medical centre after reviewing the patient's EMR file, consisting of the patient's history and investigations in voice, image, and data formats. The system also has a surgical component, in which a remote robotic consultant can conduct surgery at a robotic surgical centre. Instant speech and text translation is incorporated in the software, so that the patient's speech and text (language) can be translated into the consultant's language and vice versa. A consultant of any specialty (surgeon or physician) based in any country can provide instant healthcare consultation to any patient in any country without loss of time, and robotic surgeons based in a tertiary care hospital in any country can perform remote robotic surgery through patient-friendly telemedicine and tele-surgical centres. The patient EMRs, financial data, and the data of all consultants and robotic surgeons are stored in the cloud. It is a complete, comprehensive business model with a healthcare medical and surgical delivery system; the whole system is self-financing and can be implemented in any country. The entire system uses paperless, filmless techniques, which eliminates the use of all consumables and thereby substantially reduces the cost they incur. Consultants receive virtual patients in the form of EMRs, saving the time and expense of travelling to the hospital to see patients. The consultant gets an electronic file ready for reporting and diagnosis; the time otherwise spent on the physical examination of the patient is saved, so the consultant can spend quality time studying the EMR/virtual patient and give instant advice. The time consumed per patient is reduced, so the consultant can see more patients and the cost of consultation per patient falls. The additional productivity of the consultants can be channelled to serve rural patients devoid of doctors.
Keywords: e-health, telemedicine, telecare, IT-based healthcare
A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment
Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay
Abstract:
Machine learning (ML) and deep learning (DL) are used predominantly in image/video processing, natural language processing (NLP), and audio and speech recognition, but much less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects failure by evaluating the performance metrics of an IoT service deployment in a constrained infrastructure environment. It has been tested on a manually annotated data set containing different system metrics, such as the number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments: edge (an Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, as an error in the system degrades the service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on the data set of nearly 4,000 samples captured within the organization.
Keywords: machine learning, system performance, performance metrics, IoT, edge
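A minimal sketch of a metric-based failure classifier in Python with scikit-learn; the data here is synthetic and a random forest stands in for the paper's ML/DL classifiers:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Columns mirror the paper's metrics: threads, throughput, avg response
# time, CPU %, memory %, network in, network out. Synthetic stand-in data.
rng = np.random.default_rng(0)
X = rng.random((4000, 7))
y = ((X[:, 3] > 0.9) | (X[:, 2] > 0.95)).astype(int)   # toy failure rule

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```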
Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications
Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha
Abstract:
Workflow concepts are essential for the development of remote sensing applications: they can help users manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in cloud-like environments. The approach strives to support scientists during the modeling, deployment, and monitoring of their workflows. This work takes advantage of Petri nets, and more specifically the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists in specifying their workflows, allowing both modeling and enactment of image processing workflows from the remote sensing domain. Our case study concerns the implementation of vegetation indices; we have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the possibility of integrating cloud technology as a supplementary layer for deploying the current implementation; for this purpose, we discuss migration patterns of data and applications and propose an architecture.
Keywords: cloud computing, scientific workflows, Petri nets, RENEWGRASS
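The NDVI computation at the core of the implemented workflow is a standard band ratio; a minimal numpy sketch with toy reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)

# Toy 2x2 bands; the workflow's real inputs are satellite rasters.
nir = np.array([[0.6, 0.5], [0.2, 0.1]])
red = np.array([[0.1, 0.1], [0.2, 0.1]])
print(ndvi(nir, red))   # dense vegetation gives values near +0.7
```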
Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, incorporated into analysis and design for a given prospect evaluation, is a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science that applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming) to data in order to learn and improve from them automatically, without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema for geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new risk assessment approach, using a three-dimensional risk matrix supported by the level of knowledge (LoK), is proposed in this model, and the stages of the model's workflow methodology are described. For training the data and deploying the LoK models, an ML platform has been implemented: IBM Watson Studio, a leading data science tool and data-driven cloud-integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
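A minimal sketch of a three-dimensional risk rating escalated by the level of knowledge; the scales and banding are illustrative assumptions, since the paper defines its own matrix:

```python
def risk_level(likelihood, consequence, lok):
    """Likelihood x consequence (each on a 1-5 scale), escalated when the
    level of knowledge (LoK) about the ground conditions is low."""
    score = likelihood * consequence
    if lok == "low":                 # poor data: treat the hazard as riskier
        score = min(25, round(score * 1.5))
    if score >= 15:
        return "high"
    return "medium" if score >= 8 else "low"

print(risk_level(likelihood=3, consequence=4, lok="low"))    # high
print(risk_level(likelihood=3, consequence=4, lok="high"))   # medium
```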
Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing
Authors: Tolulope Aremu
Abstract:
This paper applies deep learning to optimizing production yield by tuning a few key process parameters in a manufacturing environment. The study examines how to maximize production yield and minimize operational costs using advanced neural network models, specifically Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs), implemented with the Python-based frameworks TensorFlow and Keras. The target of the research is precision molding processes, in which temperature ranges between 150°C and 220°C, pressure between 5 and 15 bar, and material flow rate between 10 and 50 kg/h, all critical parameters with a strong effect on yield. A dataset of one million production cycles over five continuous years was considered, with detailed logs of the exact parameter settings and yield output. The LSTM model captures time-dependent trends in the production data, while the CNN analyzes spatial correlations between parameters. The models are trained in a supervised manner with an MSE loss, optimized through the Adam optimizer. After 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Compared with the traditional RSM and DOE methods, production yield increased by 12%; the error margin was also reduced by 8%, giving consistently high product quality from the deep learning models. The annual monetary value was around $2.5 million, saved from material waste, energy consumption, and equipment wear through the optimized process parameters. The system was deployed in an industrial production environment on a hybrid cloud: Microsoft Azure for data storage, with model training and deployment on Google Cloud AI. Real-time process monitoring and automatic parameter tuning depend on this cloud infrastructure. In summary, deep learning models, especially LSTMs and CNNs, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to further enhancing system autonomy and scalability across manufacturing sectors.
Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving
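A minimal Keras sketch of the LSTM branch, matching the stack the paper names (TensorFlow/Keras, MSE loss, Adam optimizer); the sequence shapes and random data are illustrative, not the one-million-cycle dataset:

```python
import numpy as np
import tensorflow as tf

# Sequences of [temperature, pressure, flow_rate] readings per cycle,
# regressing the resulting yield fraction.
X = np.random.rand(1000, 20, 3).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20, 3)),
    tf.keras.layers.LSTM(64),              # captures time-dependent trends
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),              # predicted yield
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```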
New Requirements of the Fifth Dimension of War: Planning of Cyber Operation Capabilities
Authors: Mehmet Kargaci
Abstract:
The transformation of technology and strategy has been the main driver of the evolution of war. In addition to the land, maritime, air, and space domains, cyberspace has become the fifth domain with the emergence of the internet. The current security environment has become more complex and uncertain than ever before, and warfare has evolved from conventional to irregular, asymmetric, and hybrid war. Weak actors, such as terrorist organizations and other non-state actors, have increasingly conducted cyber-attacks against strong adversaries, while states have developed cyber capabilities to defend critical infrastructure against cyber threats. Cyber warfare will be key in the future security environment. Although what to do has been set down in operational plans, how to do it has been lacking or ignored with respect to cyber defense and attack. The purpose of this article is to put forward a model for how to employ cyber capabilities in a conventional war. First, cyber operation capabilities are discussed; second, the necessities of the cyberspace environment are set out and a model for planning an operation using cyber operation capabilities is developed; finally, an assessment of the applicability of cyber operation capabilities and recommendations are presented.
Keywords: cyber war, cyber threats, cyber operation capabilities, operation planning
Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data
Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira
Abstract:
Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ, and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellite and unmanned aerial vehicles, UAVs) and in situ data from field surveys. It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. The service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, combining data from Copernicus satellites and drones/unmanned aerial vehicles, validated by existing online in situ data. Since WORSICA operates on the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. The private sector will also be able to use the service, though some usage costs may apply, depending on the type of computational resources needed by each application/user. Although the service has three main sub-services, i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks, the present study shows an application of the service to the Óbidos lagoon in Portugal, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service implements several distinct methodologies based on the computation of water indices (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with tidal data obtained from the FES model, the system can estimate a coastline with the corresponding water level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas: i) emergency, by providing fast access to inundated areas to support rescue operations; ii) management decisions on hydraulic infrastructure operation, to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access irrigation networks, promoting their fast repair.
Keywords: remote sensing, coastline detection, water detection, satellite data, Sentinel, Copernicus, EOSC
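The water indices WORSICA computes are standard band ratios; a minimal numpy sketch of one of them (MNDWI) with toy reflectances, before the thresholding step:

```python
import numpy as np

def mndwi(green, swir):
    """Modified NDWI: (G - SWIR) / (G + SWIR); > 0 usually flags open water."""
    green, swir = green.astype(float), swir.astype(float)
    return (green - swir) / np.maximum(green + swir, 1e-9)

# Toy Sentinel-2-like reflectances; WORSICA combines several indices
# (NDWI, MNDWI, AWEI, AWEIsh) before deriving the water mask.
green = np.array([[0.08, 0.30], [0.09, 0.28]])
swir = np.array([[0.02, 0.35], [0.03, 0.33]])
print(mndwi(green, swir) > 0)   # boolean water mask
```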
Robust Image Design Based Steganographic System
Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi
Abstract:
This paper presents a steganographic system that hides the transmitted information without arousing suspicion and illustrates how the level of secrecy can be increased by using cryptographic techniques. The proposed system first encrypts the image file with a one-time pad key and then encrypts the message to be hidden, so that encryption is followed by image embedding. A new image file is created from the original image using a four-triangles operation, and the new image is processed by one of two image processing techniques: thresholding or differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used one time only. The encrypted text is hidden in the places not used for image processing and key generation, and the system achieves a high embedding rate (0.1875 characters/pixel) for true-colour images (24-bit depth).
Keywords: encryption, thresholding, differential predictive coding, four triangles operation
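A generic encrypt-then-embed sketch in Python (one-time-pad XOR followed by least-significant-bit embedding); the paper's system instead embeds in the regions left unused by its four-triangles/DPC processing, so the carrier logic below is a simplification:

```python
import os
import numpy as np

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a single-use key as long as the message."""
    return bytes(m ^ k for m, k in zip(message, key))

def lsb_embed(image: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide payload bits in the least significant bit of pixel bytes."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = image.reshape(-1).copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(image.shape)

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # toy cover
key = os.urandom(16)                                          # used once only
stego = lsb_embed(img, otp_encrypt(b"secret meeting 9", key))
print(int(np.sum(stego != img)), "pixel bytes changed")
```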
Overcoming the Problems Affecting Drip Irrigation System through the Design of an Efficient Filtration and Flushing System
Authors: Stephen A. Akinlabi, Esther T. Akinlabi
Abstract:
The drip irrigation system is one of the important areas that directly affect the livelihood of farmers. Drip irrigation has been the most efficient irrigation system compared to the other types, because it helps to save water and increase crop productivity. But like any other system, it becomes inefficient when the filters and emitters get clogged during operation: when the emitters are clogged and blocked, the efficiency of the entire system is reduced. This in turn affects farm operations, which may result in scarcity of farm products and increased demand. This design work focuses on how to overcome some of the challenges affecting the drip irrigation system through the design of an efficient filtration and flushing system.
Keywords: drip irrigation system, filters, soil texture, mechanical engineering design, analysis
Internet of Things Based Patient Health Monitoring System
Authors: G. Yoga Sairam Teja, K. Harsha Vardhan, A. Vinay Kumar, K. Nithish Kumar, Ch. Shanthi Priyag
Abstract:
The emergence of the Internet of Things (IoT) has facilitated better device control and monitoring in the modern world. The constant monitoring of a patient can be drastically improved by the use of IoT in healthcare. As seen during the COVID-19 pandemic, it is important to avoid physical contact while continuously checking a patient's heart rate and temperature. Additionally, patients with paralysis should be closely watched, especially if they are elderly and in need of special care. Our IoT-based patient health monitoring system uses IoT to track patient health conditions in an effort to address these issues. In this project, the main board is an 8051 microcontroller connecting a number of sensors, including a heart rate sensor, a temperature sensor (LM-35), and a saline water measuring circuit. These sensors are connected via an ESP8266 Wi-Fi module, which sends the recorded data directly to the cloud so that the patient's health status can be monitored regularly. An LCD is used to view the data in offline mode, and a buzzer sounds if any deviation from the normal readings occurs. The data in the cloud may be viewed as a graph, making it simple for a user to spot any unusual conditions.
Keywords: IoT, ESP8266, 8051 microcontrollers, sensors
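A cloud-side sketch in Python of the threshold-and-alert logic described above; the endpoint, field names, and thresholds are hypothetical, and on the device itself this logic runs on the 8051/ESP8266 rather than in Python:

```python
import json
import urllib.request

THRESHOLDS = {"heart_rate": (60, 100), "temperature_c": (36.0, 38.0)}

def check_and_build_upload(reading, url="http://example.org/api/vitals"):
    """Flag out-of-range vitals (the device also sounds its buzzer locally)
    and prepare the sample for the cloud dashboard graph."""
    alerts = [k for k, (lo, hi) in THRESHOLDS.items()
              if not lo <= reading[k] <= hi]
    body = json.dumps({**reading, "alerts": alerts}).encode()
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    return alerts, req     # urllib.request.urlopen(req) would send it

print(check_and_build_upload({"heart_rate": 118, "temperature_c": 37.1})[0])
```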
A Design of Elliptic Curve Cryptography Processor Based on SM2 over GF(p)
Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang
Abstract:
Data encryption is the foundation of today's communication, and improving the speed of data encryption and decryption is a problem scholars continually work on. In this paper, we propose an elliptic curve cryptography processor architecture based on the SM2 prime field. In the hardware implementation, we optimized the algorithms at the different stages of the architecture. For finite-field modular multiplication, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipelined structure. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to convert between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on the elliptic curve, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-computation to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms; each 256-bit scalar multiplication operation takes 0.275 ms, 32 times the scalar-multiplication performance of a CPU (dual-core ARM Cortex-A9).
Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication
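For orientation, the control flow of scalar multiplication is the classic double-and-add loop; a toy affine Python sketch over a tiny prime field, whereas the processor itself works in Jacobian projective coordinates on the 256-bit SM2 field with the pipelined multiplier described above:

```python
def point_add(P, Q, a, p):
    """Affine addition/doubling on y^2 = x^3 + a*x + b (mod p)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                  # point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P, a, p):
    """Right-to-left double-and-add: the loop the hardware parallelizes."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P, a, p)
        P = point_add(P, P, a, p)
        k >>= 1
    return R

# Toy curve y^2 = x^3 + 2x + 3 over GF(97) with point (3, 6) on it.
print(scalar_mult(20, (3, 6), a=2, p=97))
```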
Experimental Assessment of a Grid-Forming Inverter in Microgrid Islanding Operation Mode
Authors: Dalia Salem, Detlef Schulz
Abstract:
As Germany pursues its ambitious plan towards a power system based on renewable energy sources, the necessity of establishing steady, robust microgrids becomes more evident. Inside the microgrid, at least one grid-forming inverter is responsible for generating the coupling voltage and stabilizing the system frequency within the standardized limits when the microgrid is forced to operate as a stand-alone power system. Grid-forming control of distributed inverters is required to enable steady control of a low-inertia power system. In this paper, a designed droop control technique is tested in the controller of an inverter that forms part of a hardware test bed, in order to understand the microgrid's behavior in two modes of operation: i) grid-connected and ii) islanded. The droop technique includes inner current and voltage control loops, where the Q-V and P-f droops provide the required terminal output voltage and frequency. The technique is first tested in a simulation model of the inverter in MATLAB/Simulink, and the results are compared with those of the hardware laboratory test. The results of this experiment illuminate the pivotal role of the grid-forming inverter in facilitating microgrid resilience during grid disconnection events, and show how microgrids can provide functionality formerly provided by synchronous machinery, such as the black-start process.
Keywords: microgrid, grid-forming inverters, droop control, islanding operation
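A minimal sketch of the outer P-f / Q-V droop law in Python; the gains and set-points are illustrative, and the tested controller wraps inner voltage and current loops around this:

```python
def droop_setpoints(P, Q, f0=50.0, V0=400.0, kp=5e-5, kq=4e-3,
                    P0=0.0, Q0=0.0):
    """Grid-forming droop: frequency sags with active power, voltage
    with reactive power. Units: W, var, Hz, V (line-to-line RMS)."""
    f = f0 - kp * (P - P0)
    V = V0 - kq * (Q - Q0)
    return f, V

print(droop_setpoints(P=8000, Q=2000))   # -> (49.6, 392.0)
```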
Fault Analysis of Ship Power System Comprising Parallel Generators and Variable Frequency Drive
Authors: Umair Ashraf, Kjetil Uhlen, Sverre Eriksen, Nadeem Jelani
Abstract:
Although advancements in technology have increased the reliability and ease of work in ship power systems, these advancements also add complexity. Ever-increasing nonlinear loads, such as power electronics (PE) devices, affect the stability of the system. Frequent load variations and complex load dynamics arise from the frequency converters and motor drives, and these problems are more prominent when the system is connected to a weak grid. In the ship power system, the major consumers are the thruster motors for propulsion. Variable frequency drives (VFDs) are used for the control of these motors and mostly operate at the nominal voltage of the system; consumers that operate at a lower voltage are supplied through step-down transformers. In this paper, a vector control scheme is used for the control of both the rectifier and the inverter, and the parallel operation of the synchronous generators is also demonstrated. Simulations have been performed with an induction motor as the load on the VFD and a parallel RLC load. Fault analysis was performed first for the system without the VFD and then for the system with the VFD. Three-phase-to-ground and single-phase-to-ground faults were implemented, and the behavior of the system was observed in both cases.
Keywords: non-linear load, power electronics, parallel operating generators, pulse width modulation, variable frequency drives, voltage source converters, weak grid
Use of Cloud-Based Virtual Classroom in Connectivism Learning Process to Enhance Information Literacy and Self-Efficacy for Undergraduate Students
Authors: Kulachai Kultawanich, Prakob Koraneekij, Jaitip Na-Songkhla
Abstract:
The way of learning has changed into a new paradigm with the improvement of network and communication technology, and learners now have to interact with massive amounts of information. Information literacy has thus become a critical set of abilities required by every college and university in the world. Connectivism is considered an alternative way to design an information literacy course in an online learning environment such as a virtual classroom (VC). With this change in learning pedagogy, a VC is employed to improve social capability by integrating cloud-based technology. This paper studies the use of a cloud-based virtual classroom (CBVC) in a Connectivism learning process to enhance the information literacy and self-efficacy of twenty-one undergraduate students who registered for an e-publishing course at Chulalongkorn University. The data were gathered over the six weeks of the study using the following instruments: (1) an information literacy test, (2) information literacy rubrics, (3) Information Literacy Self-Efficacy (ILSE) scales, and (4) a questionnaire. The results indicate that students' information literacy and self-efficacy post-test mean scores were higher than their pre-test mean scores at the .05 level of significance after using the CBVC in the Connectivism learning process. Additionally, the study found that the Connectivism learning process proved useful for developing an information-rich environment and a sense of community, and that the CBVC proved useful for developing social connection.
Keywords: cloud-based, virtual classroom, connectivism, information literacy
Low-Cost Robotic-Assisted Laparoscope
Authors: Ege Can Onal, Enver Ersen, Meltem Elitas
Abstract:
Laparoscopy is a surgical operation well known as keyhole surgery. The operation is performed through small holes; hence, the patient's scars are much smaller, recovery is faster, and the hospital stay is shorter than for open surgery. Several tools are used in laparoscopic operations; among them, the laparoscope has a crucial role, as it provides the vision during the operation, and it is the main focus here. Since the operation area is very small, the motion of the surgical tools may be more limited in laparoscopic operations than in traditional surgery. To overcome this limitation, most laparoscopic tools have become more precise, dexterous, multi-functional, or automated. Here, we present a robotic-assisted laparoscope that is controlled by the surgeon directly with pedals. The movement of the laparoscope can thus be controlled better, there is no need to recalibrate the camera during the operation, and the need for an assistant who controls the movement of the laparoscope is eliminated. The duration of the laparoscopic operation may also be shorter, since the surgeon operates the camera directly.
Keywords: laparoscope, laparoscopy, low-cost, minimally invasive surgery, robotic-assisted surgery
Sensor Data Analysis for a Large Mining Major
Authors: Sudipto Shanker Dasgupta
Abstract:
One of the largest mining companies wanted to look at health analytics for its driverless trucks, which were the key to its supply chain logistics. The automated trucks had multi-level sub-assemblies that sent out sensor information. The use case was to capture the sensor signals from the truck subcomponents and analyze the health of the trucks from a repair-and-replacement point of view. Open source software was used to stream the data into a clustered Hadoop setup in the Amazon Web Services cloud, and Apache Spark SQL was used to analyze the data. Using a 10-node Amazon setup with 32 cores and 64 GB RAM, real-time analytics was achieved on 300 million records. To check the scalability of the system, the cluster was increased to a 100-node setup. This talk highlights how open source software was used to achieve the above use case, and offers insights on the high data throughput of a cloud setup.
Keywords: streaming analytics, data science, big data, Hadoop, high throughput, sensor data
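A minimal PySpark sketch of the kind of Spark SQL health query described; the schema and threshold are guesses at the sort of fields involved, not the company's actual feed:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("truck-health").getOrCreate()

# Toy stand-in for the streamed sub-assembly sensor records.
df = spark.createDataFrame(
    [("T1", "hydraulics", 96.5), ("T1", "engine", 71.0),
     ("T2", "hydraulics", 99.2)],
    ["truck_id", "subsystem", "temp_c"],
)
df.createOrReplaceTempView("sensor_readings")

# Flag sub-assemblies running hot as repair/replacement candidates.
spark.sql("""
    SELECT truck_id, subsystem, MAX(temp_c) AS max_temp
    FROM sensor_readings
    GROUP BY truck_id, subsystem
    HAVING MAX(temp_c) > 95
""").show()
```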