Search results for: cloud inference
334 Parkinson’s Disease Hand-Eye Coordination and Dexterity Evaluation System
Authors: Wann-Yun Shieh, Chin-Man Wang, Ya-Cheng Shieh
Abstract:
This study aims to develop an objective scoring system to evaluate hand-eye coordination and hand dexterity for Parkinson’s disease. The system contains three boards, each implemented with sensors to sense a user’s finger operations. The operations include the peg test, the block test, and the blind block test. A user has to use visual, auditory, and tactile abilities to finish these operations, and the board records the results automatically. These results can help physicians evaluate a user’s reaction, coordination, and dexterity functions. The results are collected in a cloud database for further analysis and statistics. A researcher can use this system to obtain systematic, graphic reports for an individual or a group of users. In particular, a deep learning model is developed to learn the features of the data from different users. This model will help physicians assess Parkinson’s disease symptoms with a more intelligent algorithm.
Keywords: deep learning, hand-eye coordination, reaction, hand dexterity
Procedia PDF Downloads 64
333 Bayesian Semiparametric Geoadditive Modelling of Underweight Malnutrition of Children under 5 Years in Ethiopia
Authors: Endeshaw Assefa Derso, Maria Gabriella Campolo, Angela Alibrandi
Abstract:
Objectives: Early childhood malnutrition can have long-term and irreversible effects on a child's health and development. This study uses the Bayesian method with spatial variation to investigate the flexible trends of metrical covariates and to identify communities at high risk. Methods: Cross-sectional data on underweight are taken from the 2016 Ethiopian Demographic and Health Survey (EDHS). A Bayesian geoadditive model is fitted. Appropriate prior distributions were provided for the scale parameters in the models, and the inference is entirely Bayesian, using Markov chain Monte Carlo (MCMC) simulation. Results: The results show that metrical covariates such as child age, maternal body mass index (BMI), and maternal age affect a child's underweight status non-linearly. Both lower and higher maternal BMI appear to have a significant impact on a child’s underweight. There was also significant spatial heterogeneity, and based on IDW interpolation of predicted values, the western, central, and eastern parts of the country are hotspot areas. Conclusion: Socio-demographic and community-based programme development should be considered comprehensively in Ethiopian policy to combat childhood underweight malnutrition.
Keywords: BayesX, Ethiopia, malnutrition, MCMC, semiparametric Bayesian analysis, spatial distribution, P-splines
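The abstract describes a standard Bayesian geoadditive (structured additive regression) setup; the predictor below is a generic sketch, with P-spline smooths for the metrical covariates and a spatial effect. The exact specification and notation are assumptions for illustration, not taken from the study.

```latex
% Sketch of a structured additive predictor for child i (illustrative notation)
\eta_i = f_1(\mathrm{age}_i) + f_2(\mathrm{BMI}_i) + f_3(\mathrm{mage}_i)
       + f_{\mathrm{spat}}(s_i) + \mathbf{x}_i^{\top}\boldsymbol{\beta},
\qquad
f_j(z) = \sum_{k=1}^{K} \gamma_{jk}\, B_k(z)
```

Here mage denotes maternal age, the B_k are B-spline basis functions whose coefficients carry random-walk (P-spline) priors, f_spat(s_i) is the structured spatial effect for region s_i, and MCMC draws from the joint posterior.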
Procedia PDF Downloads 86
332 Risk Tolerance and Individual Worthiness Based on Simultaneous Analysis of the Cognitive Performance and Emotional Response to a Multivariate Situational Risk Assessment
Authors: Frederic Jumelle, Kelvin So, Didan Deng
Abstract:
A method and system for neuropsychological performance testing, comprising a mobile terminal used to interact with a cloud server that stores user information and is logged into by the user through the terminal device. The user information is accessed directly through the terminal device and processed by an artificial neural network; it comprises the user's facial emotions, performance test answers, and chronometrics. The assessment is used to evaluate the cognitive performance and emotional response of the subject to a series of dichotomous questions describing various situations of daily life and challenging the user's knowledge, values, ethics, and principles. In industrial applications, the timing of this assessment will depend on the user's need to obtain a service from a provider, such as opening a bank account, getting a mortgage or an insurance policy, authenticating clearance at work, or securing online payments.
Keywords: artificial intelligence, neurofinance, neuropsychology, risk management
Procedia PDF Downloads 136
331 Growth Performance and Critical Supersaturation of Heterogeneous Condensation for High Concentration of Insoluble Sub-Micron Particles
Abstract:
Measuring the growth performance and critical supersaturation of a particle group has high reference value for constructing a supersaturated water vapor environment that can improve the removal efficiency of high-concentration particle groups. The critical supersaturation and the variation of growth performance with supersaturation for high-concentration particles were measured in a flow cloud chamber. The findings suggest that the influence of particle concentration on growth performance diminishes as supersaturation increases. Reducing residence time and increasing particle concentration have similar effects on the growth performance of a high-concentration particle group. Increasing particle concentration and shortening residence time both increase the critical supersaturation of the particle group. The critical supersaturation required to activate a high-concentration particle group is lower than that of a single particle when the minimum particle size in the group equals that of the single particle.
Keywords: sub-micron particles, heterogeneous condensation, critical supersaturation, nucleation
Procedia PDF Downloads 153
330 Information Technologies in Human Resources Management - Selected Examples
Authors: A. Karasek
Abstract:
The rapid growth of Information Technologies (IT) has had a huge influence on enterprises and has contributed to their promotion and increasingly extensive use in enterprises. Information Technologies have to a large extent determined the processes taking place in an enterprise; moreover, IT development has brought the need to adopt a brand new approach to human resources management in an enterprise. The use of IT in Human Resource Management (HRM) is of high importance due to the growing role of information and information technologies. The aim of this paper is to evaluate the use of information technologies in human resources management in enterprises. These practices are presented in the following areas: recruitment and selection, development and training, employee assessment, motivation, talent management, and personnel service. Results of the conducted survey show a diversity of solutions applied in particular areas of human resource management. In the future, further development in this area should be expected, as well as integration of individual HRM areas, growing mobile-enabled HR processes, and their transfer into the cloud. The IT solutions applied in HRM presented here are highly innovative, which is of great significance due to their possible implementation in other enterprises.
Keywords: e-HR, human resources management, HRM practices, HRMS, information technologies
Procedia PDF Downloads 350
329 Use of Thermosonication to Obtain Minimally Processed Mosambi Juice
Authors: Ruby Siwach, Manish Kumar, Raman Seth
Abstract:
The extent of inactivation of pectin methylesterase (PME) in mosambi juice during thermal and thermosonication treatments was studied to obtain a minimally processed product. The effect of both treatments on cloud value, pH, titratable acidity, °Brix, and sensory attributes (flavour and taste) was studied. Thermal treatments (HT) were carried out at three temperatures, 60, 70, and 80°C, in a serological water bath for 5, 10, 15, and 20 min at each temperature. Thermosonication treatments (TS) were also given for the same time-temperature combinations in the water bath of a thermosonicator. Treated samples were stored in a deep freezer at -18°C for the PME assay. The PME activity of the untreated sample was also assayed, and the residual PME activity and percentage loss in PME activity were calculated for each time-temperature combination. The extent of inactivation of PME increased with increasing treatment temperature and duration. Thermosonication treatments were found far more effective than thermal treatments of the same time-temperature combination in PME inactivation and retention of sensory attributes.
Keywords: pectin methylesterase, heat inactivation kinetics, thermosonication, thermal treatment
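The keyword "heat inactivation kinetics" points to the usual first-order model for enzyme inactivation. The relations below are the standard textbook form, an assumption for illustration since the abstract does not state the model explicitly.

```latex
\mathrm{RA}(\%) = \frac{A_t}{A_0}\times 100,
\qquad
\ln\!\left(\frac{A_t}{A_0}\right) = -k\,t,
\qquad
D = \frac{2.303}{k}
```

Here A_0 and A_t are the PME activities before treatment and after holding time t, k is the first-order inactivation rate constant at a given temperature, and D is the decimal reduction time (the time for a 90% activity loss).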
Procedia PDF Downloads 430
328 Retrospective Data Analysis of Penetrating Injuries Admitted to Jigme Dorji Wangchuck National Referral Hospital (JDWNRH), Thimphu, Bhutan, Due to Traditional Sports over a Period of 3 Years
Authors: Sonam Kelzang
Abstract:
Background: Penetrating injuries as a result of traditional sports (archery and Khuru) are commonly seen in Bhutan. To our knowledge, no study has examined data on penetrating injuries due to traditional sports. Aim: This is a retrospective analysis of cases of penetrating injuries resulting from traditional sports admitted to JDWNRH over the last 3 years, to draw inferences on the pattern of injury and the associated morbidity and mortality. Method: Data on penetrating injuries related to traditional sports (archery and Khuru) were collected and reviewed over a period of 3 years. Assault cases were excluded. For each year we analysed age, sex, parts of the body affected, agent of injury, and whether admission was required. Results: There were a total of 44 victims of penetrating injury by traditional sports (archery and Khuru) between 2013 and 2015, an average of 15 cases of penetrating injuries per year. Eighty-five percent were male and 15% were female. Their ages ranged from 4 to 62 years. Sixty-one percent of the victims were in the working age group of 19-58 years; 30% were referred from various district hospitals; 38% needed admission; 42% suffered injury to the head; and 54% of the injuries were caused by Khuru. Conclusion: Penetrating injuries due to traditional sports admitted to JDWNRH, Thimphu, remained constant over the three-year period despite safety regulations being in place. Although there were no deaths during the last three years, morbidity remains high.
Keywords: archery, Bhutan, Khuru, darts
Procedia PDF Downloads 165
327 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models
Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton
Abstract:
Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high dimensional spatio-temporal data based upon DSTMs, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the spatial continuous process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets
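A minimal sketch of the kind of hierarchical state space model described, with observations projected onto wavelet coefficients that carry a Laplace prior. The notation is illustrative, not taken from the talk.

```latex
y_t = F_t\,\theta_t + \varepsilon_t,\quad \varepsilon_t \sim N(0,V_t),
\qquad
\theta_t = G\,\theta_{t-1} + \omega_t,\quad \omega_t \sim N(0,W_t),
\qquad
p(\theta_k) \propto \exp\!\left(-\lvert\theta_k\rvert/\tau\right)
```

Here F_t maps the vector of wavelet coefficients θ_t to the noisy observations y_t, G encodes the autoregressive temporal evolution, and the Laplace prior shrinks low-contribution coefficients toward zero; a particle filter then targets the filtering distribution p(θ_t | y_{1:t}) online.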
Procedia PDF Downloads 424
326 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research
Authors: Ms Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner
Abstract:
Propensity score (PS) methods have recently become a standard tool for causal inference in observational studies where exposure is not randomly assigned and confounding can therefore bias the estimation of the treatment effect on the outcome. For a binary outcome, the effect of treatment on the outcome can be estimated by odds ratios, relative risks, or risk differences. However, different PS methods may give different estimates of the treatment effect. The main PS analysis methods include matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Because of the dangers of discretizing continuous variables (exposure, covariates), the focus of this paper is on how variation in cut-points or boundaries affects the average treatment effect (ATE) under the stratification PS method. We therefore avoid choosing arbitrary cut-points; instead, we continuously discretize the PS and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods: stratification and covariate adjustment on the PS. We then show how this can be observed in analyses of data from a case-control study of breast cancer, the Polish Women’s Health Study.
Keywords: average treatment effect, propensity score, stratification, covariate adjustment, Monte Carlo estimation, breast cancer, case-control study
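Under PS stratification, the ATE estimator for one choice of cut-points takes the standard form below; the paper's proposal amounts to accumulating such estimates over a continuum of cut-point choices rather than one arbitrary grid. This rendering is a sketch of the standard estimator, not the authors' exact construction.

```latex
\widehat{\mathrm{ATE}}(c_1,\dots,c_{K-1})
= \sum_{k=1}^{K} \frac{n_k}{N}\,\bigl(\bar{y}_{1k} - \bar{y}_{0k}\bigr)
```

Stratum k collects the n_k subjects whose estimated PS falls between cut-points c_{k-1} and c_k, and ȳ_{1k}, ȳ_{0k} are the mean outcomes of the treated and untreated subjects within that stratum.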
Procedia PDF Downloads 105
325 Data Driven Infrastructure Planning for Offshore Wind Farms
Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree
Abstract:
The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which is then converted into times-to-repair and times-to-failure time series. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.
Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data
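A sketch of the distribution-fitting step, contrasting a two-parameter Weibull fit with the exponential (constant failure rate) assumption the abstract criticises. The data values are illustrative stand-ins for SCADA-derived times-to-failure, not the study's data.

```python
# Fit a two-parameter Weibull and an exponential to times-to-failure data by
# maximum likelihood, then compare log-likelihoods.
import numpy as np
from scipy import stats

ttf_hours = np.array([310.0, 1200.5, 455.2, 89.9, 2210.0, 640.3, 150.7])  # hypothetical

# Two-parameter Weibull: fix the location at 0 so only shape and scale are free.
shape, loc, scale = stats.weibull_min.fit(ttf_hours, floc=0)
ll_weibull = np.sum(stats.weibull_min.logpdf(ttf_hours, shape, loc, scale))

# Exponential: the constant-failure-rate simplification.
eloc, escale = stats.expon.fit(ttf_hours, floc=0)
ll_expon = np.sum(stats.expon.logpdf(ttf_hours, eloc, escale))

print(f"Weibull shape={shape:.2f} scale={scale:.1f} logL={ll_weibull:.2f}")
print(f"Exponential mean={escale:.1f} logL={ll_expon:.2f}")
# A fitted Weibull shape near 1 collapses to the exponential, consistent with
# the abstract's observation that the two fits were almost identical.
```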
Procedia PDF Downloads 85
324 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification
Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro
Abstract:
Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of artificial intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that can classify user responses as inputs for an interactive voice response system. A dataset of Wolof-language words for ‘yes’ and ‘no’ was collected as audio recordings. A two-stage data augmentation approach is adopted to enhance the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-frequency cepstral coefficients are implemented. Convolutional neural networks (CNNs) have proven to be very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. To perform voice response classification, the recordings are transformed into sound frequency feature spectra, and an image classification methodology is then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms.
Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification
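A minimal sketch of the described pipeline, MFCC spectra fed to a small binary CNN. File paths, shapes, and hyperparameters are illustrative assumptions, not the paper's configuration.

```python
# MFCC extraction plus a small CNN for binary (yes/no) audio classification,
# assuming recordings are padded/trimmed to a fixed length.
import numpy as np
import librosa
import tensorflow as tf

def wav_to_mfcc(path, sr=16000, n_mfcc=40, frames=100):
    y, _ = librosa.load(path, sr=sr)
    m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, t)
    m = librosa.util.fix_length(m, size=frames, axis=1)   # pad/trim time axis
    return m[..., np.newaxis]                             # (n_mfcc, frames, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(40, 100, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # P(word == "yes")
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=20)
```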
Procedia PDF Downloads 115
323 The Intersection/Union Region Computation for Drosophila Brain Images Using Encoding Schemes Based on Multi-Core CPUs
Authors: Ming-Yang Guo, Cheng-Xian Wu, Wei-Xiang Chen, Chun-Yuan Lin, Yen-Jen Lin, Ann-Shyn Chiang
Abstract:
With more and more Drosophila driver and neuron images, finding the similarity relationships among them is important work for functional inference. A general problem is how to find a Drosophila driver image that can cover a set of Drosophila driver/neuron images. To solve this problem, the intersection/union region for a set of images must first be computed; a comparison step is then used to calculate the similarities between the region and other images. In this paper, three encoding schemes, namely Integer, Boolean, and Decimal, are proposed to encode each image as a one-dimensional structure. The intersection/union region of these images can then be computed using compare operations, Boolean operators, and a lookup table method. Finally, the comparison step is done on the union region computation, and the similarity score is calculated according to the definition of the Tanimoto coefficient. The above region computation methods are also implemented in a multi-core CPU environment with OpenMP. The experimental results show that, in the encoding phase, the Boolean scheme performs best; in the region computation phase, the Decimal scheme performs best when the number of images is large. The speedup ratio reaches 12 on 16 CPUs. This work was supported by the Ministry of Science and Technology under grant MOST 106-2221-E-182-070.
Keywords: Drosophila driver image, Drosophila neuron images, intersection/union computation, parallel processing, OpenMP
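A minimal sketch of the Boolean-style encoding and the Tanimoto score, assuming binarized images; NumPy bit operations stand in for the paper's multi-core OpenMP implementation, and thresholds and shapes are illustrative.

```python
# Encode binarized images as packed bit arrays, compute intersection/union
# via bitwise operations, and score similarity with the Tanimoto coefficient.
import numpy as np

def encode(image, threshold=0):
    """Flatten an image to a packed bit array (the 'Boolean' scheme)."""
    return np.packbits(image.ravel() > threshold)

def tanimoto(a_bits, b_bits):
    inter = np.unpackbits(np.bitwise_and(a_bits, b_bits)).sum()
    union = np.unpackbits(np.bitwise_or(a_bits, b_bits)).sum()
    return inter / union if union else 0.0

imgs = [np.random.rand(64, 64) > 0.7 for _ in range(4)]   # stand-ins for drivers
codes = [encode(i) for i in imgs]

# Union region across the whole set, then score it against a query image.
region = codes[0]
for c in codes[1:]:
    region = np.bitwise_or(region, c)
query = encode(np.random.rand(64, 64) > 0.7)
print("Tanimoto:", tanimoto(region, query))
```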
Procedia PDF Downloads 236
322 Increasing the System Availability of Data Centers by Using Virtualization Technologies
Authors: Chris Ewe, Naoum Jamous, Holger Schrödl
Abstract:
Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply continued existence on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections and considers whether the identified effects are positive or negative. The work thereby also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.
Keywords: availability, cloud computing, IT service, quality of service, service level agreement, virtualization
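For readers outside the field, the textbook availability identities such a model builds on are shown below. These are standard reliability engineering relations, not formulas taken from the paper.

```latex
A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}},
\qquad
A_{\mathrm{series}} = \prod_i A_i,
\qquad
A_{\mathrm{redundant}} = 1 - \prod_i \bigl(1 - A_i\bigr)
```

Virtualization functions such as live migration and automatic failover effectively shift components from the series case toward the redundant case (while adding the hypervisor as a new component in series), which is exactly the kind of trade-off the proposed model has to capture.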
Procedia PDF Downloads 534
321 Exploring the Quest for Centralized Identity in Mohsin Hamid's "The Last White Man": Post-Apocalyptic Transformations and Societal Reconfigurations
Authors: Kashifa Khalid, Eesham Fatima
Abstract:
This study aims to analyze the loss of identity and its impact on one’s life in ‘The Last White Man’ by Mohsin Hamid. Bertolt Brecht’s theory of the alienation effect has been applied to the text, as Hamid offers readers a unique perspective, alluding to significant themes like identity, race, and death. The aspects of defamiliarization align impeccably with the plot, as existence and the corresponding concept of identity seem to have dissolved into utter chaos. This extends from the unexplained transformation to the way the entire world unravels from its general norm into dystopian mayhem. The characters, starting with the protagonist Anders, have lost their center. One’s own self transforms into the ‘other,’ and the struggle is to become refamiliarized with one’s own self. Alienation and isolation only rise as the constructs of race and identity are taken apart brick by brick, ironically at their own pace, as many new realities are blown to bits. The inseparable relationship between identity and grief under the ever-looming cloud of ‘death’ is studied in detail. The theoretical framework and thematic aspects harmonize with the writing style put forth by Hamid, tying all the loose ends together.
Keywords: alienation, chaos, identity, transformation
Procedia PDF Downloads 43
320 The Implantable MEMS Blood Pressure Sensor Model With Wireless Powering And Data Transmission
Authors: Vitaliy Petrov, Natalia Shusharina, Vitaliy Kasymov, Maksim Patrushev, Evgeny Bogdanov
Abstract:
The leading causes of death worldwide are ischemic heart disease and other cardiovascular illnesses, for which the common symptom is high blood pressure. Long-term blood pressure monitoring is very important for prophylaxis, correct diagnosis, and timely therapy. Non-invasive methods based on Korotkoff sounds cannot be applied frequently or over long periods. Implantable devices can combine long-term monitoring with high measurement accuracy. The main purpose of this work is to create a real-time monitoring system to decrease the death rate from cardiovascular diseases. Implantable electronic devices have begun to play an important role in medicine. An implantable device usually consists of a transmitter, a powering element (which can be wireless, with a specially made battery), and a measurement circuit. Common problems in making implantable devices are the short lifetime of the battery, large size, and biocompatibility. In this work, blood pressure measurement is the focus because high blood pressure is one of the main symptoms of cardiovascular disease. Our device consists of three parts: the implantable pressure sensor, an external transmitter, and an automated workstation in a hospital. The implantable pressure sensor can be based on piezoresistive or capacitive technologies; both have advantages and limitations. The developed circuit is based on a small capacitive sensor made with microelectromechanical systems (MEMS) technology. The capacitive sensor provides high sensitivity, low power consumption, and minimal hysteresis compared to a piezoresistive sensor. For this device, an oscillator-based circuit was selected, in which the frequency depends on the capacitance of the sensor, so pressure can be calculated from the capacitance. The external device (transmitter) is used for wireless charging and signal transmission. Some implantable devices for such applications are passive: the external device sends a radio-wave signal to an internal LC circuit, receives the reflected signal from the implant, and calculates the change of capacitance, and hence blood pressure, from the change of frequency. However, this method has disadvantages, such as dependence on the patient’s position and static operation. The developed implantable device does not have these disadvantages and sends blood pressure data to the external part in real time. The external device continuously sends information about blood pressure to a hospital cloud service for analysis by a physician. The doctor’s automated workstation at the hospital also acts as a dashboard, which displays up-to-date medical data of patients requiring attention and stores it in the cloud service. Critical heart conditions usually occur a few hours before a heart attack, and the device is able to send an alarm signal to the hospital for early action by the medical service. The system was tested with wireless charging and data transmission. These results can be used for the ASIC design of the MEMS pressure sensor.
Keywords: MEMS sensor, RF power, wireless data, oscillator-based circuit
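One common realisation of such an oscillator-based readout is an LC tank in which the MEMS capacitance sets the frequency. The abstract does not specify the topology, so the relation below is an assumption for illustration only.

```latex
f = \frac{1}{2\pi\sqrt{L\,C(p)}}
\quad\Longrightarrow\quad
C(p) = \frac{1}{(2\pi f)^2\,L}
```

Here C(p) is the pressure-dependent sensor capacitance and L the tank inductance; measuring the oscillation frequency f recovers C, and a calibration curve for C(p) then yields the blood pressure p.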
Procedia PDF Downloads 587
319 Next-Gen Solutions: How Generative AI Will Reshape Businesses
Authors: Aishwarya Rai
Abstract:
This study explores the transformative influence of generative AI on startups, businesses, and industries. We explore how large businesses can benefit in the area of customer operations, where AI-powered chatbots can improve self-service and agent effectiveness, greatly increasing efficiency. In marketing and sales, generative AI could transform businesses by automating content development, data utilization, and personalization, resulting in a substantial increase in marketing and sales productivity. In software-engineering-focused startups, generative AI can streamline activities, significantly impacting coding processes and work experiences. It can be extremely useful in product R&D for market analysis, virtual design, simulations, and test preparation, altering old workflows and increasing efficiency. Zooming into the retail and CPG industry, industry findings suggest a 1-2% increase in annual revenues, equating to $400 billion to $660 billion. By automating customer service, marketing, sales, and supply chain management, generative AI can streamline operations, optimizing personalized offerings and presenting itself as a disruptive force. While celebrating this economic potential, we acknowledge challenges such as external inference and adversarial attacks. Human involvement remains crucial for quality control and security in the era of generative-AI-driven transformative innovation. This talk provides a comprehensive exploration of generative AI’s pivotal role in reshaping businesses, recognizing its strategic impact on customer interactions, productivity, and operational efficiency.
Keywords: generative AI, digital transformation, LLM, artificial intelligence, startups, businesses
Procedia PDF Downloads 75
318 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture
Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán
Abstract:
Time-sensitive services are the base of the cloud services industry. Keeping service saturation low is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling; however, reactive auto-scaling has received few in-depth studies. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-K architecture. Queuing theory can compute different properties of static services but lacks some parameters related to the transition between models. Our model uses queuing theory parameters to relate the transitions between models. It associates the MAPE-K-related times, the sampling frequency, the cooldown period, the number of requests that an instance can handle per unit of time, the number of incoming requests at a given instant, and a function that describes the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, reevaluating the model’s parameters periodically to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests to keep the response time constrained; business benefits determine such limits. The solution can add a dynamic number of instances and remains valid under different system sizes. The study includes performance recommendations to improve results according to the incoming load shape and business benefits. The proposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request-relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds. Both microservices contain a load balancer that assigns requests to the least-loaded instance and preemptively discards requests if they cannot be finished in time, to prevent resource saturation. When load decreases, instances with lower load are kept in a backlog where no more requests are assigned. If the load grows and an instance in the backlog is required, it returns to the running state, but if it finishes computing all its requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenario for reactive systems; the following scenarios test response times, resource consumption, and business costs. The first scenario is a burst-load scenario: all methodologies will discard requests if the rapidity of the burst is high enough. This scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts, to observe how the methodology behaves when releasing resources that are later required. The third scenario contains diverse growth accelerations in the number of incoming requests, to observe how approaches that add different numbers of instances can handle the load at a lower business cost. The proposed methodology is compared against a multiple-threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.
Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing
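As a concrete illustration of the Plan step of such a MAPE-K loop, a minimal reactive scaling rule can be written from the quantities the model associates. The function below is a sketch under assumed parameter names and values, not the paper's algorithm.

```python
# Plan step of a reactive auto-scaler: choose the number of instances from
# the sampled arrival rate and a (business-bounded) growth acceleration.
import math

def plan_instances(arrival_rate, accel, mu_per_instance, horizon_s, headroom=0.8):
    """
    arrival_rate     requests/s observed at this sampling instant
    accel            growth in requests/s^2, capped by the business limit
    mu_per_instance  requests/s one instance can serve within the deadline
    horizon_s        sampling period plus MAPE-K actuation delay, in seconds
    headroom         target utilisation that keeps saturation (latency) low
    """
    projected = arrival_rate + accel * horizon_s      # load when scaling lands
    needed = projected / (mu_per_instance * headroom)
    return max(1, math.ceil(needed))

# One sampling tick: 120 req/s now, growing at 4 req/s^2, 30 req/s per
# instance, and a 10 s gap before new instances actually absorb load.
print(plan_instances(120, 4, 30, horizon_s=10))  # -> 7 instances
```

A cooldown period would additionally suppress deallocation decisions for some time after each scaling action, which is one of the times the model ties into the queuing parameters.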
Procedia PDF Downloads 93
317 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGs)
Authors: Perpetual Nwakaego Ibe
Abstract:
The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the adverse situations in which citizens of third-world countries may find themselves. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions, and civil society. This paper first looks at political views on the MDGs and relates them to the current electoral situation around the country by underlining the drastic changes over the past few months. The second part of the paper presents ICT in tertiary institutions as one of the solutions for the success of the MDGs. ICT is vital in all phases of the educational process and its development, and cloud connectivity is an added advantage of Information and Communication Technology (ICT) for sharing a common data bank for research purposes among UNICEF, RED CROSS, NPS, INEC, NMIC, and WHO. Finally, the paper concludes with areas that need tweaking and recommendations for tertiary institutions committed to delivering an ambitious set of goals. A combination of observation and document materials for data gathering was employed as the methodology for carrying out this research.
Keywords: MDG, ICT, data bank, database
Procedia PDF Downloads 199
316 Double Encrypted Data Communication Using Cryptography and Steganography
Authors: Adine Barett, Jermel Watson, Anteneh Girma, Kacem Thabet
Abstract:
In information security, secure communication of data across networks has always been a problem at the forefront. Transfer of information across networks is susceptible to exploitation by attackers engaging in malicious activity. In this paper, we leverage steganography and cryptography to create a layered security solution to protect the information being transmitted. The first layer of security leverages cryptographic techniques to scramble the information so that it cannot be deciphered even if the steganography-based layer is compromised. The second layer of security relies on steganography to disguise the encrypted information so that it cannot be seen. We consider three cipher methods in the cryptography layer, namely the Playfair cipher, the Blowfish cipher, and the Hill cipher. The encrypted message is then passed through a least significant bit (LSB) steganography algorithm for further concealment. Both approaches are combined efficiently to help secure information in transit over a network. This multi-layered encryption is a solution that will benefit cloud platforms, social media platforms, and networks that regularly transfer private information, such as banks and insurance companies.
Keywords: cryptography, steganography, layered security, cipher, encryption
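A sketch of the second (steganography) layer, assuming the message has already been produced by the cryptography layer; the cipher stage is abstracted as a byte string here, and the cover image is a random stand-in.

```python
# Embed already-encrypted bytes into the least significant bits of an image.
import numpy as np

def lsb_embed(cover, ciphertext):
    bits = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
    flat = cover.ravel().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def lsb_extract(stego, n_bytes):
    bits = stego.ravel()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
secret = b"\x93\x1f\xa2\x07\x4c"   # output of the cryptography layer (illustrative)
stego = lsb_embed(cover, secret)
assert lsb_extract(stego, len(secret)) == secret
```

Because only the lowest bit of each pixel changes, the stego image is visually indistinguishable from the cover, while the payload stays unintelligible even if extracted, thanks to the cipher layer.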
Procedia PDF Downloads 82
315 A Framework for Automated Nuclear Waste Classification
Authors: Seonaid Hume, Gordon Dobie, Graeme West
Abstract:
Detecting and localizing radioactive sources is a necessity for the safe and secure decommissioning of nuclear facilities. An important aspect of managing the sort-and-segregation process is establishing the spatial distributions and quantities of the waste radionuclides, their type, corresponding activity, and ultimately their classification for disposal. The data received from surveys directly informs decommissioning plans, on-site incident management strategies, and the approach needed for a new cell, as well as protecting the workforce and the public. Manual classification of nuclear waste from a nuclear cell is time-consuming, expensive, and requires significant expertise to make the classification judgment call. In-cell decommissioning is also still in its relative infancy, and few techniques are well developed. As with any repetitive and routine task, there is an opportunity to improve the classification of nuclear waste using autonomous systems. Hence, this paper proposes a new framework for the automatic classification of nuclear waste. The framework consists of five main stages: 3D spatial mapping and object detection, object classification, radiological mapping, source localisation based on gathered evidence, and finally waste classification. The first stage of the framework, 3D visual mapping, involves object detection from point cloud data. A review of related applications in other industries is provided, and recommendations for approaches to waste classification are made. Object detection focuses initially on cylindrical objects, since pipework is significant in nuclear cells and indeed on any industrial site; the approach can be extended to other commonly occurring primitives such as spheres and cubes. This prepares for stage two: characterizing the point cloud data and estimating the dimensions, material, degradation, and mass of the detected objects in order to feature-match them to an inventory of possible items found in that nuclear cell. Many items in nuclear cells are one-offs, have limited or poor drawings available, or have been modified since installation, and have complex interiors, which often and inadvertently pose difficulties when accessing certain zones and identifying waste remotely; hence, feature matching may require expert input. The third stage, radiological mapping, similarly characterizes the nuclear cell in terms of radiation fields, including the type of radiation, its activity, and its location within the cell. The fourth stage of the framework takes the visual map from stage 1, the object characterization from stage 2, and the radiation map from stage 3 and fuses them together, providing a more detailed scene of the nuclear cell by identifying the location of radioactive materials in three dimensions. The last stage combines the evidence from the fused data sets to produce the classification of the waste in Bq/kg, thus enabling better decision making and monitoring for in-cell decommissioning. The presentation of the framework is supported by representative case study data drawn from a decommissioning application at a UK nuclear facility. This framework utilises recent advancements in detecting and mapping complex radiation fields in three dimensions to make the process of classifying nuclear waste faster, more reliable, more cost-effective, and safer.
Keywords: nuclear decommissioning, radiation detection, object detection, waste classification
Procedia PDF Downloads 200
314 Development of Fault Diagnosis Technology for Power System Based on Smart Meter
Authors: Chih-Chieh Yang, Chung-Neng Huang
Abstract:
In power systems, improving fault diagnosis technology for transmission lines has always been a primary goal of grid operators. In recent years, due to the rise of green energy, the addition of various kinds of distributed generation has also affected the stability of the power system. Smart meters offer data recording and bidirectional transmission, and the adaptive neuro-fuzzy inference system (ANFIS) is an artificial intelligence technique with learning and estimation capabilities. To avoid misjudging the fault type and location in the transmission network due to the input of these unstable power sources, this study combines the above advantages of smart meters and ANFIS and proposes a method for identifying fault types and fault locations. In ANFIS training, the bus voltage and current information collected by smart meters is trained through the ANFIS tool in MATLAB to generate fault codes that identify different types of faults and their locations. In addition, given the uncertainty of distributed generation, a wind power system is added to the transmission network to verify the diagnostic correctness of the approach. Simulation results show that the proposed method can identify the fault type and location more efficiently and can handle the interference caused by the addition of unstable power sources.
Keywords: ANFIS, fault diagnosis, power system, smart meter
Procedia PDF Downloads 135
313 Sustainable Separation of Nicotine from Its Aqueous Solutions
Authors: Zoran Visak, Joana Lopes, Vesna Najdanovic-Visak
Abstract:
Within this study, the separation of nicotine from its aqueous solutions was carried out using the inorganic salt sodium chloride or the ionic liquid (molten salt) ECOENG212® as salting-out media. Liquid-liquid equilibria of the ternary solutions (nicotine + water + NaCl) and (nicotine + water + ECOENG212®) were determined at ambient pressure, 0.1 MPa, at three temperatures. The related phase diagrams were constructed in two manners: by adding the determined cloud points and by chemical analysis of the phases in equilibrium (tie-line data). The latter were used to calculate two important separation parameters: partition coefficients of nicotine and separation factors. The impacts of the initial compositions of the mother solutions and of temperature on the liquid-liquid phase separation and partition coefficients were analyzed and discussed. The results obtained clearly showed that both investigated salts are good salting-out media for the efficient and sustainable separation of nicotine from its aqueous solutions. However, sodium chloride exhibited much better separation performance than the ionic liquid.
Keywords: nicotine, liquid-liquid separation, inorganic salt, ionic liquid
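The two separation parameters have their usual definitions, shown below in standard form; the phase labels are generic, since the abstract does not name the paper's exact convention.

```latex
K_{\mathrm{nic}} = \frac{w_{\mathrm{nic}}^{\mathrm{top}}}{w_{\mathrm{nic}}^{\mathrm{bottom}}},
\qquad
S = \frac{K_{\mathrm{nic}}}{K_{\mathrm{water}}}
```

Here w denotes the mass fraction of a component in the two phases connected by a tie-line (nicotine-rich and water-rich); a separation factor S much greater than 1 indicates efficient salting-out of nicotine from the aqueous phase.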
Procedia PDF Downloads 309
312 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development
Authors: Jiahui Yang, John Quigley, Lesley Walls
Abstract:
In this paper, the authors develop a stochastic model of investment in supplier delivery performance development from a buyer’s perspective. The authors propose a multivariate model using a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and the lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights into supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for the optimal investment level and overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
Keywords: decision making, empirical Bayesian, portfolio optimization, supplier development, supply chain management
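The Multinomial-Dirichlet machinery rests on standard conjugacy; a sketch with K delivery statuses is shown below. This is the textbook result, not the paper's notation.

```latex
\mathbf{p} \sim \mathrm{Dirichlet}(\alpha_1,\dots,\alpha_K),
\quad
\mathbf{n}\mid\mathbf{p} \sim \mathrm{Multinomial}(N,\mathbf{p})
\;\Longrightarrow\;
\mathbf{p}\mid\mathbf{n} \sim \mathrm{Dirichlet}(\alpha_1+n_1,\dots,\alpha_K+n_K)
```

The posterior mean probability of status k is then (α_k + n_k)/(Σ_j α_j + N): the α's carry the epistemic uncertainty and can be set empirically from historical deliveries, while the multinomial counts n carry the aleatory variation in observed deliveries.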
Procedia PDF Downloads 287
311 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem that can occur in recommendation systems, arising when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson sampling, variational inference
Procedia PDF Downloads 106
310 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics
Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan
Abstract:
The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities persists as a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and no, or only limited, historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects of IoT forensics. The model analyses the effectiveness of the forensic investigation process versus the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).
Keywords: cloud forensics, data protection laws, GDPR, IoT forensics, machine learning
Procedia PDF Downloads 149
309 Reduction of Energy Consumption Using Smart Home Techniques in the Household Sector
Authors: Ahmed Al-Adaileh, Souheil Khaddaj
Abstract:
The consequences of the exhaustion of natural resources have begun to affect every living being on this planet, and energy is an essential factor in this respect. To bring the situation back onto the appropriate track, all attempts must focus on two fundamental branches: producing electricity from clean and renewable reserves, and decreasing overall unnecessary energy consumption. The focal point of this paper is lessening power consumption in the household sector. This paper is an attempt to give a clear understanding of a framework called Reduction of Energy Consumption in Household Sector (RECHS) and how it helps householders reduce their power consumption by substituting their household appliances, turning appliances off when standby mode is detected, and scheduling their operation periods. Technically, the framework relies on Z-Wave-compatible plug-ins connected to ordinary household devices to measure and control them remotely and semi-automatically. The suggested framework underpins numerous quality characteristics, for example integrability, scalability, security, and adaptability.
Keywords: smart energy management systems, internet of things, wireless mesh networks, microservices, cloud computing, big data
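A sketch of the standby-detection rule described above. The SmartPlug interface, thresholds, and polling interval are hypothetical stand-ins, since the abstract does not publish an API.

```python
# Switch an appliance off once its power draw has stayed in the standby band
# long enough, as reported by a (hypothetical) Z-Wave smart plug.
import time

STANDBY_WATTS = 5.0      # below this the appliance is considered on standby
STANDBY_SECONDS = 600    # must persist this long before switching off

def monitor(plug, poll_s=30):
    standby_since = None
    while True:
        watts = plug.read_power()          # hypothetical Z-Wave meter read
        if 0.0 < watts < STANDBY_WATTS:
            standby_since = standby_since or time.monotonic()
            if time.monotonic() - standby_since >= STANDBY_SECONDS:
                plug.switch_off()          # hypothetical actuator call
                standby_since = None
        else:
            standby_since = None           # active again: reset the timer
        time.sleep(poll_s)
```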
Procedia PDF Downloads 193
308 The Incessant Subversion of the Judiciary by African Political Leaders
Authors: Joy Olayemi Gbala, Fatai Olatokunbo, Philip Cloud
Abstract:
Catastrophic dictatorship has been found to be the major leadership challenge in Africa, orchestrating a stagnated and contrasted economy with dysfunctional democracy through the willful misappropriation of resources and egregious subversion of the rule of law. Almost invariably, most African leaders inexplicably become power-drunk and addicted, which usually leads to abuse of state power, abdication of constitutional duties, unjust withdrawal of business licenses of operation, human rights violations, election malpractice, disruption of policies of democratic government transition, financial corruption, annulment of free and fair elections, disruption of legal electoral procedures, unachievable dividends of democracy, and many more. Owing to this, most African nations have gone, and still go, through political unrest and insurgencies, leading to loss of lives and property, violent protests, detention of detractors and political activists, and massive human displacement. This research work is concerned with, and investigates, the causes, menace, consequences, and impacts of subverting the rule of law in Africa on the economy and the development of the continent, with a suggested practical solution to these plights.
Keywords: corruption, law, leadership, violation
Procedia PDF Downloads 153
307 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language where tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, under this concept data are still embedded inside objects, as variable-value couples, but object methods are expressed in the form of logical propositions (‘conditions on data’ or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine runs and examines objects one after another, collecting all the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left of the ‘=>’ sign) is the premise and the right part is the conclusion. Premises are evaluated and conclusions are fired; conclusions modify the variable-value couples of the object, and the engine moves on to examine the next object, as in the sketch below. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, the paper also presents several hints for implementing a simple mechanism able to process this ‘COD language’. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical and long-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
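A minimal sketch of such an engine, using assumed Python syntax for CODs (the paper's own environment and notation are not shown in the abstract). The single rule encodes the abstract's example proposition.

```python
# Objects hold variable-value couples plus premise => conclusion rules;
# a central engine fires conclusions repeatedly until nothing changes.
class CodObject:
    def __init__(self, data, rules):
        self.data = data        # variable-value couples
        self.rules = rules      # list of (premise, conclusion) callables

# Rule from the paper's example (the concrete values are illustrative):
#   variable1 = value1 AND variable2 > value2 => variable3 = value3
rules = [(
    lambda d: d["variable1"] == "value1" and d["variable2"] > 10,
    lambda d: d.update(variable3="value3"),
)]

def run(objects, max_passes=100):
    for _ in range(max_passes):
        changed = False
        for obj in objects:                      # engine examines each object
            for premise, conclusion in obj.rules:
                before = dict(obj.data)
                if premise(obj.data):
                    conclusion(obj.data)         # fire: modify the couples
                    changed |= obj.data != before
        if not changed:                          # fixed point reached
            return

obj = CodObject({"variable1": "value1", "variable2": 42}, rules)
run([obj])
print(obj.data)   # {'variable1': 'value1', 'variable2': 42, 'variable3': 'value3'}
```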
Procedia PDF Downloads 220
306 Application of RS and GIS Technique for Identifying Groundwater Potential Zone in Gomukhi Nadhi Sub Basin, South India
Authors: Punitha Periyasamy, Mahalingam Sudalaimuthu, Sachikanta Nanda, Arasu Sundaram
Abstract:
India holds 17.5% of the world’s population but only 2% of the world’s total geographical area, and 27.35% of its area is categorized as wasteland due to a lack of, or limited, groundwater. There is therefore a demand for more groundwater for agricultural and non-agricultural activities to balance the growth rate. With this in mind, an attempt is made to find the groundwater potential zones in the Gomukhi river sub-basin of the Vellar river basin, Tamil Nadu, India, covering an area of 1146.6 sq. km and consisting of 9 blocks from Peddanaickanpalayam to Villupuram. Thematic maps of geology, geomorphology, lineament, land use and land cover, and drainage were prepared for the study area using IRS P6 data. The collateral data include rainfall, water level, and soil maps, collected for analysis and inference. A digital elevation model (DEM) was generated using Shuttle Radar Topography Mission (SRTM) data, and the slope of the study area was obtained. ArcGIS 10.1 acts as a powerful spatial analysis tool to find the groundwater potential zones in the study area by means of weighted overlay analysis. The individual parameters of the thematic maps are ranked and weighted in accordance with their influence on raising the groundwater level. The potential zones in the study area are classified as Very Good, Good, Moderate, and Poor, with areal extents of 15.67, 381.06, 575.38, and 174.49 sq. km, respectively.
Keywords: ArcGIS, DEM, groundwater, recharge, weighted overlay
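A sketch of the weighted overlay step, assuming each thematic layer has been rasterised and ranked on a common grid. The weights, ranks, and grid size below are illustrative, not the study's calibrated values.

```python
# Weighted overlay of ranked thematic rasters into a potential-zone map.
import numpy as np

shape = (200, 200)                    # common raster grid
layers = {                            # rank rasters (1 = poor ... 4 = very good)
    "geomorphology": np.random.randint(1, 5, shape),
    "geology":       np.random.randint(1, 5, shape),
    "lineament":     np.random.randint(1, 5, shape),
    "landuse":       np.random.randint(1, 5, shape),
    "drainage":      np.random.randint(1, 5, shape),
    "slope":         np.random.randint(1, 5, shape),
}
weights = {"geomorphology": 0.25, "geology": 0.20, "lineament": 0.15,
           "landuse": 0.15, "drainage": 0.15, "slope": 0.10}

score = sum(weights[k] * layers[k] for k in layers)

# Classify the composite score into the four potential zones.
bins = np.quantile(score, [0.25, 0.5, 0.75])
zone = np.digitize(score, bins)       # 0 = poor ... 3 = very good
print({z: int((zone == z).sum()) for z in range(4)})
```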
Procedia PDF Downloads 443
305 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques; specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution, and direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves applying visible Red-Green-Blue (RGB) and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The representations of mining/industrial facilities thus obtained can be used for inspection activities.
Keywords: aerial thermography, data processing, drone, low-cost, point cloud
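A sketch of the projection step, colouring RGB-derived geometry with thermal pixels via a pinhole camera model. The intrinsics, pose, resolutions, and point cloud are illustrative placeholders, not the methodology's calibrated values.

```python
# Project 3D points into a registered thermal image and sample temperatures.
import numpy as np

K = np.array([[400.0, 0, 320], [0, 400.0, 256], [0, 0, 1]])  # thermal intrinsics
R, t = np.eye(3), np.zeros(3)        # thermal-camera pose (assumed registered)
thermal = np.random.uniform(10, 60, (512, 640))              # degrees C per pixel

points = np.random.uniform(-2, 2, (1000, 3)) + [0, 0, 5]     # point cloud (m)

cam = (R @ points.T).T + t           # world -> camera frame
uv = (K @ cam.T).T
uv = uv[:, :2] / uv[:, 2:3]          # perspective divide -> pixel coordinates
u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
ok = (cam[:, 2] > 0) & (u >= 0) & (u < 640) & (v >= 0) & (v < 512)

temps = np.full(len(points), np.nan)
temps[ok] = thermal[v[ok], u[ok]]    # per-point surface temperature
```

Repeating this for every thermal frame and averaging the samples per point is one simple way to obtain the temperature-textured model the methodology describes.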
Procedia PDF Downloads 142