Search results for: data interpolating empirical orthogonal function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29462

25922 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm

Authors: Ghada Badr, Arwa Alturki

Abstract:

The biological function of an RNA molecule depends on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing the common functionalities of two RNA structures allows a better understanding of them and the discovery of further relationships between them. In addition, identifying non-coding RNAs (those that are not translated into proteins) is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed, but most of them perform partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignment. Efficient RNA structure representations have received little attention in the literature, and full structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N²) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and N is the maximum number of components in the two structures. The proposed algorithm compares two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments on different real and simulated datasets illustrate the efficiency of the CompPSA algorithm compared to other approaches. The CompPSA algorithm provides an accurate similarity measure between components and gives the user the flexibility to align two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm is scalable and efficient in time and memory.
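As an illustration of the component-based idea (a sketch under assumed feature definitions, not the authors' implementation), components can be compared by a weighted feature similarity inside an O(N²) dynamic programme:

```python
from dataclasses import dataclass

@dataclass
class Component:
    # hypothetical numeric features named in the abstract
    position: float
    full_length: float
    stem_length: float

def component_similarity(a: Component, b: Component, w=(1.0, 1.0, 1.0)) -> float:
    # weighted feature distance mapped into a (0, 1] similarity
    d = (w[0] * abs(a.position - b.position)
         + w[1] * abs(a.full_length - b.full_length)
         + w[2] * abs(a.stem_length - b.stem_length))
    return 1.0 / (1.0 + d)

def align(s1, s2, gap=-0.2, w=(1.0, 1.0, 1.0)) -> float:
    # O(N^2) Needleman-Wunsch-style dynamic programme over components
    n, m = len(s1), len(s2)
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(dp[i - 1][j - 1] + component_similarity(s1[i - 1], s2[j - 1], w),
                           dp[i - 1][j] + gap,   # component in s1 left unmatched
                           dp[i][j - 1] + gap)   # component in s2 left unmatched
    return dp[n][m]
```

Identical structures score N (one unit per matched component), and the weight vector w mirrors the user-controlled emphasis on position, full length, or stem length described above.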

Keywords: alignment, RNA secondary structure, pairwise, component-based, data mining

Procedia PDF Downloads 444
25921 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameters retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the previous model’s speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real-time or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 156
25920 Estimating Tree Height and Forest Classification from Multi-Temporal RISAT-1 HH and HV Polarized Synthetic Aperture Radar Interferometric Phase Data

Authors: Saurav Kumar Suman, P. Karthigayani

Abstract:

In this paper, the height of trees is estimated and forest types are classified from multi-temporal RISAT-1 Horizontal-Horizontal (HH) and Horizontal-Vertical (HV) polarised Synthetic Aperture Radar (SAR) data. The novelty of the proposed work is the combined use of the backscattering coefficients (sigma naught) and the coherence, based on the Water Cloud Model (WCM). The approach has three main steps: (a) extraction of the forest parameter data from the Product.xml, BAND-META, and Grid-xxx.txt files that accompany the HH- and HV-polarised data from ISRO (Indian Space Research Organisation); these files contain the parameters required for height estimation; (b) calculation of the vegetation and ground backscattering, coherence, and other forest parameters; and (c) classification of forest types using the ENVI 5.0 tool and ROI (Region of Interest) calculation.
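The Water Cloud Model referred to above is commonly written in the Attema-Ulaby form, splitting the backscatter into a vegetation volume term and an attenuated ground term; a minimal sketch, with illustrative parameter values (A and B are empirical coefficients not given in the abstract):

```python
import math

def water_cloud_model(A, B, V, sigma0_soil, theta_deg):
    """Attema-Ulaby Water Cloud Model (linear power units).
    A, B: empirical vegetation parameters; V: vegetation descriptor
    (e.g. water content); sigma0_soil: bare-soil backscatter;
    theta_deg: incidence angle in degrees."""
    cos_t = math.cos(math.radians(theta_deg))
    gamma2 = math.exp(-2.0 * B * V / cos_t)        # two-way canopy attenuation
    sigma_veg = A * V * cos_t * (1.0 - gamma2)     # vegetation volume scattering
    return sigma_veg + gamma2 * sigma0_soil        # total observed backscatter
```

With V = 0 the canopy terms vanish and the model returns the bare-soil backscatter, which is a quick sanity check on any implementation.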

Keywords: RISAT-1, classification, forest, SAR data

Procedia PDF Downloads 393
25919 Wind Power Potential in Selected Algerian Sahara Regions

Authors: M. Dahbi, M. Sellam, A. Benatiallah, A. Harrouz

Abstract:

Wind energy is one of the most significant and rapidly developing renewable energy sources in the world, providing a clean energy resource that is a promising short-term alternative in Algeria. The main purpose of this paper is to compare and discuss the wind power potential of three sites located in the Sahara of Algeria (the south-west of the country) and to investigate the wind power potential of the Algerian desert. In this comparison, wind speed frequency distribution data obtained from the web site SODA.com are used to calculate the average wind speed and the available wind power. The Weibull density function has been used to estimate the monthly wind power density and to determine the monthly Weibull parameters for these three sites. The annual energy produced by the BWC XL.1 1 kW wind machine is obtained and compared. The analysis shows that in the south-west of Algeria, at a height of 10 m, the available wind power varies between 136.59 W/m² and 231.04 W/m². The highest wind power potential was found at Adrar, with 21 h of operation per day and a mean wind speed above 6 m/s. The annual wind energy generated by that machine lies between 512 kWh and 1643.2 kWh. The wind resource therefore appears suitable for power production in the Sahara and could provide a viable substitute for diesel oil in irrigation pumps and rural electricity generation.
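The Weibull analysis admits closed forms: mean wind speed c·Γ(1+1/k) and mean power density ½ρc³Γ(1+3/k). A minimal sketch (the scale c and shape k used below are illustrative, not the sites' fitted parameters):

```python
import math

def weibull_mean_speed(c, k):
    # mean wind speed (m/s) for Weibull scale c (m/s) and shape k
    return c * math.gamma(1.0 + 1.0 / k)

def weibull_power_density(c, k, rho=1.225):
    # mean available wind power per unit swept area (W/m^2);
    # rho is air density in kg/m^3 (sea-level default)
    return 0.5 * rho * c**3 * math.gamma(1.0 + 3.0 / k)
```

For example, c ≈ 6.77 m/s with k = 2 corresponds to a mean speed of about 6 m/s, the threshold quoted for Adrar above.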

Keywords: Weibull distribution, Weibull parameters, wind energy, wind turbine, operating hours

Procedia PDF Downloads 484
25918 Presenting a Model for Predicting the State of Being Accident-Prone of Passages According to Neural Network and Spatial Data Analysis

Authors: Hamd Rezaeifar, Hamid Reza Sahriari

Abstract:

Accidents are considered one of the challenges of modern life. Since the number of accident victims and the volume of internal transportation in Iran increase day by day, studying the factors that affect accidents and identifying suitable models and parameters is essential. The main purpose of this research is to study the factors and spatial data affecting accidents in Mashhad during 2007-2008. By overlaying spatial layers and relating them to accident locations, accident landmarks and special fields recording the presence or absence of phenomena that influence accidents were added, completing the existing accident information banks. In the next step, data mining tools and neural network analysis were used to evaluate the relationships among these data and to design a logical model for predicting accident-prone spots with minimum error. The resulting model predicts low-accident spots very accurately, but it has larger errors in accident-prone regions due to the lack of primary data.

Keywords: accident, data mining, neural network, GIS

Procedia PDF Downloads 35
25917 Methodology of Turkey’s National Geographic Information System Integration Project

Authors: Buse A. Ataç, Doğan K. Cenan, Arda Çetinkaya, Naz D. Şahin, Köksal Sanlı, Zeynep Koç, Akın Kısa

Abstract:

With their spatial data reliability, interpretation, and querying capabilities, Geographic Information Systems (GIS) make significant contributions to the work of scientists, planners, and practitioners. GIS has received great attention in today's digital world and is growing rapidly in both adoption and efficiency of use. Access to current and accurate geographic data, the most important component of a GIS, has become a necessity rather than a convenience for sustainable and economic development. This project aims to enable the sharing of data collected by public institutions and organizations on a web-based platform. Within the scope of the project, the INSPIRE (Infrastructure for Spatial Information in the European Community) data specifications are taken as a road map. In this context, Turkey's National Geographic Information System (TUCBS) Integration Project supports the sharing of spatial data by 61 pilot public institutions in compliance with defined national standards. In this paper, prepared by members of the TUCBS Integration Project team, the technical process is explained with a detailed methodology. The main technical processes of the project consist of geographic data analysis, geographic data harmonization (standardization), web service creation (WMS, WFS), and metadata creation and publication. The paper conveys, with a detailed methodology, the integration process carried out so that the data produced by the 61 institutions can be shared through the National Geographic Data Portal (GEOPORTAL).

Keywords: data specification, geoportal, GIS, INSPIRE, TUCBS, Turkey's national geographic information system

Procedia PDF Downloads 131
25916 Secure Content Centric Network

Authors: Syed Umair Aziz, Muhammad Faheem, Sameer Hussain, Faraz Idris

Abstract:

A content centric network is a network based on the mechanism of sending and receiving data according to interest, with data requests directed to the specified node that has cached the data. In this network, security is bound to the content rather than to the host, making it host-independent and secure. Security is applied by taking the content's MAC (message authentication code) and encrypting it with the public key of the receiver. On the receiving end, the message is first verified, and after verification the message is saved and decrypted using the receiver's private key.
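A sketch of the verify-then-open flow described above, using an HMAC as the message authentication code; the public-key encryption and decryption layer is only indicated in comments, and the key handling is hypothetical:

```python
import hmac
import hashlib

def protect(content: bytes, mac_key: bytes) -> bytes:
    # the MAC binds security to the content itself, not to any host
    tag = hmac.new(mac_key, content, hashlib.sha256).digest()
    return tag + content  # in the full scheme this bundle would then be
                          # encrypted with the receiver's public key

def verify_and_open(bundle: bytes, mac_key: bytes) -> bytes:
    # in the full scheme the bundle would first be decrypted with the
    # receiver's private key; only then is the MAC checked
    tag, content = bundle[:32], bundle[32:]
    expected = hmac.new(mac_key, content, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("content authentication failed")
    return content
```

Using a constant-time comparison (`hmac.compare_digest`) avoids leaking tag information through timing, which matters once content can be served from any caching node.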

Keywords: content centric network, client-server, host security threats, message authentication code, named data network, network caching, peer-to-peer

Procedia PDF Downloads 630
25915 Simulation on Influence of Environmental Conditions on Part Distortion in Fused Deposition Modelling

Authors: Anto Antony Samy, Atefeh Golbang, Edward Archer, Alistair McIlhagger

Abstract:

Fused deposition modelling (FDM) is one of the additive manufacturing techniques that has become highly attractive in the industrial and academic sectors. However, parts fabricated through FDM are highly susceptible to geometrical defects such as warpage, shrinkage, and delamination that can severely affect their function. Among the thermoplastic polymer feedstocks for FDM, semi-crystalline polymers are highly prone to part distortion due to polymer crystallization. In this study, the influence of FDM processing conditions such as chamber temperature and print bed temperature on the induced thermal residual stress and resulting warpage is investigated using a 3D transient thermal model for a semi-crystalline polymer. The thermo-mechanical properties and the viscoelasticity of the polymer, as well as the crystallization physics, which considers the crystallinity of the polymer, are coupled with the evolving temperature gradient of the print model. From the results, it was observed that increasing the chamber temperature from 25°C to 75°C led to a 1.5% decrease in residual stress, while decreasing the bed temperature from 100°C to 60°C resulted in a 33% increase in residual stress and a significant 138% rise in warpage. The simulated warpage data are validated by comparison with the measured warpage values of the samples obtained by 3D scanning.

Keywords: finite element analysis, fused deposition modelling, residual stress, warpage

Procedia PDF Downloads 173
25914 Modelling of Portable CO2 (Carbon Dioxide) and CO (Carbon Monoxide) Filters in Motor Vehicle Exhaust with Chitosan Absorbent

Authors: Yuandanis Wahyu Salam, Irfi Panrepi, Nuraeni

Abstract:

The increase of greenhouse gases, notably CO2 (carbon dioxide), in the atmosphere induces a rise in the earth's average surface temperature. One of the largest contributors of greenhouse gases is motor vehicles. The smoke emitted by a motor vehicle's exhaust contains gases such as CO2 (carbon dioxide) and CO (carbon monoxide). Chemically, chitosan is a cellulose-like plant fiber that has the ability to bind like an absorbent foam. Chitosan is a natural antacid that absorbs toxins: when spread over the surface of water, it is able to absorb fats, oils, heavy metals, and other toxic substances. Given that chitosan can absorb various toxic substances, it is expected that it can also filter exhaust emissions from motor vehicles. This study designs a carbon dioxide filter for the exhaust of motor vehicles using chitosan as the absorbent; the filter aims to reduce the CO2 and CO in the exhaust gases before they are emitted. The research combines a literature study with experimental fabrication of the device. Data were collected through documentary studies of books, magazines, theses, internet searches, and other relevant references. The study produces a portable filter whose main function is to remove the CO2 and CO emissions generated by a vehicle's exhaust.

Keywords: filter, carbon, carbondioxide, exhaust, chitosan

Procedia PDF Downloads 341
25913 Fuel Inventory/ Depletion Analysis for a Thorium-Uranium Dioxide (Th-U) O2 Pin Cell Benchmark Using Monte Carlo and Deterministic Codes with New Version VIII.0 of the Evaluated Nuclear Data File (ENDF/B) Nuclear Data Library

Authors: Jamal Al-Zain, O. El Hajjaji, T. El Bardouni

Abstract:

A (Th-U)O2 fuel pin benchmark made up of 25 w/o U and 75 w/o Th was used in order to analyze the depletion and inventory of the fuel for a pressurized water reactor pin-cell model. The new version VIII.0 of the ENDF/B nuclear data library was used to create a data set in ACE format at various temperatures; the data were processed with the MAKXSF6.2 and NJOY2016 programs at those temperatures in order to conduct this study and analyze the cross-section data. The infinite multiplication factor, the concentrations and activities of the main fission products, the actinide radionuclides accumulated in the pin cell, and the total radioactivity were estimated and compared using the Monte Carlo N-Particle 6 (MCNP6.2) and DRAGON5 codes. Additionally, the burn-up (BU)-dependent behavior of the pressurized water reactor (PWR) thorium pin cell was validated and compared with reference data obtained using the Massachusetts Institute of Technology (MIT-MOCUP), Idaho National Engineering and Environmental Laboratory (INEEL-MOCUP), and CASMO-4 codes. The results of this study indicate that all of the codes examined are in good agreement.

Keywords: PWR thorium pin cell, ENDF/B-VIII.0, MAKXSF6.2, NJOY2016, MCNP6.2, DRAGON5, fuel burn-up

Procedia PDF Downloads 79
25912 Natural Language News Generation from Big Data

Authors: Bastian Haarmann, Likas Sikorski

Abstract:

In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The fully automatically generated stories closely resemble the style in which a human writer would draw up a news story. Topics may include soccer games, stock exchange market reports, weather forecasts, and many more. The generation of the texts follows human language production, and each generated text is unique. Ready-to-publish stories written by a computer application can help humans quickly grasp the outcomes of big data analyses, save time-consuming pre-formulation work for journalists, and cater to rather small audiences by offering stories that would otherwise not exist.
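The simplest form of such generation is slot filling over structured data; a toy sketch with hypothetical field names (a production system like the one described adds lexical variation, aggregation, and style control):

```python
def soccer_report(data: dict) -> str:
    # minimal slot-filling generator for one match record;
    # field names (home, away, home_goals, away_goals, crowd) are hypothetical
    if data["home_goals"] == data["away_goals"]:
        return "{home} and {away} drew {home_goals}-{away_goals}.".format(**data)
    if data["home_goals"] > data["away_goals"]:
        winner, loser = data["home"], data["away"]
        hi, lo = data["home_goals"], data["away_goals"]
    else:
        winner, loser = data["away"], data["home"]
        hi, lo = data["away_goals"], data["home_goals"]
    return f"{winner} beat {loser} {hi}-{lo} in front of {data['crowd']} fans."
```

Real systems layer synonym choice and sentence aggregation on top of such templates so that repeated runs over similar data do not produce identical prose.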

Keywords: big data, natural language generation, publishing, robotic journalism

Procedia PDF Downloads 419
25911 Performance Evaluation of the Classic seq2seq Model versus a Proposed Semi-supervised Long Short-Term Memory Autoencoder for Time Series Data Forecasting

Authors: Aswathi Thrivikraman, S. Advaith

Abstract:

The study is aimed at designing encoders for deciphering intricacies in time series data by redescribing the dynamics operating on a lower-dimensional manifold. A semi-supervised LSTM autoencoder is devised and investigated to see if the latent representation of the time series data can better forecast the data. End-to-end training of the LSTM autoencoder, together with another LSTM network that is connected to the latent space, forces the hidden states of the encoder to represent the most meaningful latent variables relevant for forecasting. Furthermore, the study compares the predictions with those of a traditional seq2seq model.

Keywords: LSTM, autoencoder, forecasting, seq2seq model

Procedia PDF Downloads 140
25910 An Axiomatic Model for Development of the Allocated Architecture in Systems Engineering Process

Authors: Amir Sharahi, Reza Tehrani, Ali Mollajan

Abstract:

The final step to complete the “Analytical Systems Engineering Process” is the “Allocated Architecture”, in which all Functional Requirements (FRs) of an engineering system must be allocated to their corresponding Physical Components (PCs). At this step, any design for developing the system’s allocated architecture in which no clear pattern of assigning the exclusive “responsibility” of each PC for fulfilling the allocated FR(s) can be found is considered a poor design that may cause difficulties in determining the specific PC(s) which has (have) failed to satisfy a given FR successfully. The present study utilizes the principles of the Axiomatic Design method to mathematically address this problem and establishes an “Axiomatic Model” as a solution for reaching good alternatives for developing the allocated architecture. This study proposes a “Loss Function” as a quantitative criterion to monetarily compare non-ideal designs for developing the allocated architecture and choose the one which imposes a relatively lower cost on the system’s stakeholders. For the case study, we use the existing design of the U.S. electricity marketing subsystem, based on data provided by the U.S. Energy Information Administration (EIA). The result for 2012 shows the symptoms of a poor design and ineffectiveness due to coupling among the FRs of this subsystem.
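The notion of a "clear pattern of responsibility" can be made concrete through Axiomatic Design's classification of the FR-to-PC design matrix; a minimal sketch of that classification (illustrative only, not the paper's loss-function formulation):

```python
def classify_design(dm):
    """Classify a square FR-to-PC design matrix (rows = FRs, cols = PCs)
    per Axiomatic Design's Independence Axiom:
    diagonal -> 'uncoupled', triangular -> 'decoupled', else 'coupled'."""
    n = len(dm)
    # off-diagonal nonzero entries mark PCs affecting FRs they do not own
    offdiag = [(i, j) for i in range(n) for j in range(n) if i != j and dm[i][j]]
    if not offdiag:
        return "uncoupled"
    if all(i > j for i, j in offdiag) or all(i < j for i, j in offdiag):
        return "decoupled"
    return "coupled"
```

A coupled matrix is exactly the situation the abstract flags: a failed FR cannot be traced to a single responsible PC.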

Keywords: allocated architecture, analytical systems engineering process, functional requirements (FRs), physical components (PCs), responsibility of a physical component, system’s stakeholders

Procedia PDF Downloads 392
25909 The Analysis of Emergency Shutdown Valves Torque Data in Terms of Its Use as a Health Indicator for System Prognostics

Authors: Ewa M. Laskowska, Jorn Vatn

Abstract:

Industry 4.0 focuses on the digital optimization of industrial processes. The idea is to use extracted data to build a decision support model that enables the use of those data for real-time decision making. In terms of predictive maintenance, the desired decision support tool would be a model enabling prognostics of the system's health based on the current condition of the considered equipment. Within the area of system prognostics and health management, a commonly used health indicator is the Remaining Useful Lifetime (RUL) of a system. Because the RUL is a random variable, it has to be estimated based on available health indicators. Health indicators can be of different types and come from different sources: process variables, equipment performance variables, data on the number of experienced failures, etc. The aim of this study is the analysis of performance variables of emergency shutdown valves (ESVs) used in the oil and gas industry. ESVs are inspected periodically, and at each inspection the torque and time of valve operation are registered. The data will be analyzed by means of machine learning or statistical analysis. The purpose is to investigate whether the available data could be used as a health indicator for prognostic purposes. The second objective is to examine the most efficient way to incorporate the data into a predictive model: whether the data can be applied as explanatory variables in a Markov process, or whether other stochastic processes would be more convenient for building an RUL model based on the information coming from the registered data.
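As a deliberately simple illustration (not one of the stochastic-process models the study considers), a registered torque trend can be extrapolated to a failure threshold to yield an RUL point estimate:

```python
def estimate_rul(times, torques, threshold):
    """Least-squares linear trend on a degrading health indicator
    (e.g. ESV operating torque) and the time remaining until the trend
    crosses `threshold`. The linear-degradation assumption is illustrative."""
    n = len(times)
    tm = sum(times) / n
    ym = sum(torques) / n
    slope = (sum((t - tm) * (y - ym) for t, y in zip(times, torques))
             / sum((t - tm) ** 2 for t in times))
    intercept = ym - slope * tm
    if slope <= 0:
        return float("inf")  # no degradation trend observed
    t_fail = (threshold - intercept) / slope
    return max(0.0, t_fail - times[-1])
```

A stochastic treatment would replace the deterministic crossing time with a first-passage-time distribution, which is what makes the RUL a random variable as stated above.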

Keywords: emergency shutdown valves, health indicator, prognostics, remaining useful lifetime, RUL

Procedia PDF Downloads 75
25908 Artificial Intelligence Based Predictive Models for Short Term Global Horizontal Irradiation Prediction

Authors: Kudzanayi Chiteka, Wellington Makondo

Abstract:

The whole world is on the drive to go green owing to the negative effects of burning fossil fuels. Therefore, there is an immediate need to identify and utilise alternative renewable energy sources. Among these energy sources, solar energy is one of the most dominant in Zimbabwe. Solar power plants used to generate electricity are entirely dependent on solar radiation. For planning purposes, solar radiation values should be known in advance so that the necessary arrangements can be made to minimise the negative effects of the absence of solar radiation due to cloud cover and other naturally occurring phenomena. This research focused on the prediction of Global Horizontal Irradiation (GHI) values for the sixth day given values for the past five days. Artificial intelligence techniques were used. Three models were developed, based on Support Vector Machines, Radial Basis Function, and Feed-Forward Back-Propagation artificial neural networks. Results revealed that Support Vector Machines gives the best results of the three, with a mean absolute percentage error (MAPE) of 2%, a Mean Absolute Error (MAE) of 0.05 kWh/m²/day, a root mean square (RMS) error of 0.15 kWh/m²/day, and a coefficient of determination of 0.990. The other predictive models had prediction accuracies of MAPEs of 4.5% and 6%, respectively, for the Radial Basis Function and Feed-Forward Back-Propagation artificial neural networks; these two models also had coefficients of determination of 0.975 and 0.970, respectively. It was found that the prediction of GHI values for future days is possible using artificial-intelligence-based predictive models.
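For reference, the three error metrics used to rank the models above can be sketched with their standard definitions (the inputs below are hypothetical):

```python
import math

def mape(actual, pred):
    # mean absolute percentage error, in percent; actual values must be nonzero
    return 100.0 / len(actual) * sum(abs((a - p) / a) for a, p in zip(actual, pred))

def mae(actual, pred):
    # mean absolute error, in the units of the data (here kWh/m^2/day)
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    # root mean square error; penalises large deviations more than MAE
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))
```

Reporting all three together, as the abstract does, guards against a model that looks good on average but makes occasional large errors.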

Keywords: solar energy, global horizontal irradiation, artificial intelligence, predictive models

Procedia PDF Downloads 261
25907 Solar Radiation Time Series Prediction

Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs

Abstract:

A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location an hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Due to its ability as a universal function approximator, an artificial neural network was used to estimate the nonlinear pattern of solar radiation, using measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled DNI field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation field was preprocessed with a discrete wavelet transform with the aim of removing noise from the measurements. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model’s accuracy.
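The discrete-wavelet denoising step can be illustrated with a one-level Haar transform plus soft thresholding of the detail coefficients; this is a much-reduced stand-in, not the transform configuration used in the study:

```python
def haar_denoise(signal, threshold):
    # one-level Haar DWT: pairwise averages (approximation) and
    # half-differences (detail), soft-threshold the detail, then invert
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    d = [max(abs(c) - threshold, 0.0) * (1 if c >= 0 else -1) for c in d]
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]  # inverse Haar step
    return out
```

With threshold 0 the signal passes through unchanged; a large threshold flattens each pair to its mean, which is the noise-suppression effect sought before feeding the series to the network.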

Keywords: artificial neural networks, resilient propagation, solar radiation, time series forecasting

Procedia PDF Downloads 372
25906 Metformin Protects Cardiac Muscle against the Pro-Apoptotic Effects of Hyperglycaemia, Elevated Fatty Acid and Nicotine

Authors: Christopher R. Triggle, Hong Ding, Khaled Machaca, Gnanapragasam Arunachalam

Abstract:

The antidiabetic drug, metformin, has been in clinical use for over 50 years and remains the first-choice drug for the treatment of type 2 diabetes. In addition to its effectiveness as an oral anti-hyperglycaemic drug, metformin also possesses vasculoprotective effects that are assumed to be secondary to its ability to reduce insulin resistance and control glycated hemoglobin levels; however, recent data from our laboratory indicate that metformin also has direct vasoprotective effects that are mediated, at least in part, via the anti-ageing gene, SIRT1. Diabetes is a major risk factor for the development of cardiovascular disease (CVD), and it is also well established that tobacco use further enhances the risk of CVD; however, it is not known whether treatment with metformin can offset the negative effects of diabetes and tobacco use on cardiac function. The current study was therefore designed to investigate (1) the effects of hyperglycaemia (HG), either alone or in the presence of elevated fatty acids (palmitate) and nicotine, on the protein expression levels of the deacetylase sirtuin 1 (the protein product of SIRT1), anti-apoptotic Bcl-2, pro-apoptotic BIM, and the pro-apoptotic tumour suppressor protein, acetylated p53, in cardiomyocytes; and (2) the ability of metformin to prevent the detrimental effects of HG, palmitate and nicotine on cardiomyocyte survival. Cell culture protocols were designed using a rat cardiomyocyte cell line, H9c2, either under normoglycaemic (NG) conditions of 5.5 mM glucose or hyperglycaemic (HG) conditions of 25 mM glucose, with or without added palmitate (250 μM) or nicotine (1.0 mM), for 24 h. Immuno-blotting was used to detect the expression of sirtuin 1, Bcl-2, BIM, acetylated (Ac)-p53, and p53, with β-actin used as the reference protein.
Exposure to HG, palmitate, or nicotine alone significantly reduced the expression of sirtuin 1 and Bcl-2 and raised the expression levels of acetylated p53 and BIM; the combination of HG, palmitate and nicotine, however, acted synergistically to significantly suppress the expression levels of sirtuin 1 and Bcl-2 and further enhance the expression of Ac-p53 and BIM. The inclusion of 1000 μM, but not 50 μM, metformin in the H9c2 cell culture protocol prevented the effects of HG, palmitate and nicotine on the pro-apoptotic pathways. Collectively, these data indicate that metformin, in addition to its anti-hyperglycaemic and vasculoprotective properties, also has direct cardioprotective actions that offset the negative effects of hyperglycaemia, elevated free fatty acids and nicotine on cardiac cell survival. These data are of particular significance for the treatment of patients with diabetes who are also smokers, as the inclusion of metformin in their therapeutic treatment plan should help reduce cardiac-related morbidity and mortality.

Keywords: apoptosis, cardiac muscle, diabetes, metformin, nicotine

Procedia PDF Downloads 302
25905 Execution of Joinery in Large Scale Projects: Middle East Region as a Case Study

Authors: Arsany Philip Fawzy

Abstract:

This study addresses the hurdles of project management in the joinery field. It is divided into two sections. The first sheds light on how to execute large-scale projects, with a specific focus on the Middle East region, and raises major obstacles that may face the joinery team, from site clearance to coordination between the joinery and construction teams. The second section technically analyzes the commercial side of joinery and how to control the main cost of the project to avoid financial problems. It also suggests empirical solutions to monitor cost impact (e.g., variation of contract quantity and claims).

Keywords: clearance, quality, cost, variation, claim

Procedia PDF Downloads 82
25904 Assessing the Blood-Brain Barrier (BBB) Permeability in PEA-15 Mutant Cat Brain using Magnetization Transfer (MT) Effect at 7T

Authors: Sultan Z. Mahmud, Emily C. Graff, Adil Bashir

Abstract:

Phosphoprotein enriched in astrocytes 15 kDa (PEA-15) is a multifunctional adapter protein associated with the regulation of apoptotic cell death. Recently, it has been discovered that PEA-15 is crucial in the normal neurodevelopment of domestic cats, a gyrencephalic animal model, although the exact function of PEA-15 in neurodevelopment is unknown. This study investigates how PEA-15 affects blood-brain barrier (BBB) permeability in the cat brain, which can cause abnormalities in tissue metabolite and energy supplies. Severe polymicrogyria and microcephaly have been observed in cats with a loss-of-function PEA-15 mutation, affecting the normal neurodevelopment of the cat. This suggests that the vital role of PEA-15 in neurodevelopment is associated with gyrification. Neurodevelopment is a highly energy-demanding process, and the mammalian brain depends on glucose as its main energy source. PEA-15 plays a very important role in glucose uptake and utilization by interacting with phospholipase D1 (PLD1). Mitochondria also play a critical role in bioenergetics and are essential to supply the energy needed for neurodevelopment. Cerebral blood flow regulates the supply of metabolites, and recent findings show that blood plasma contains mitochondria as well, so the BBB can play a very important role in regulating metabolite and energy supply in the brain. In this study, the BBB permeability in the cat brain was measured using the MRI magnetization transfer (MT) effect on the perfusion signal. Perfusion is the tissue-mass-normalized supply of blood to the capillary bed; it accommodates the supply of oxygen and other metabolites to the tissue. A fraction of the arterial blood can diffuse into the tissue, depending on the BBB permeability; this fraction is known as the water extraction fraction (EF).
MT is a process of saturating the macromolecules, which affects the blood water that has diffused into the tissue while having minimal effect on intravascular blood water that has not exchanged with the tissue. Measurement of the perfusion signal with and without MT enables estimation of the microvascular blood flow, EF, and permeability-surface-area product (PS) in the brain. All experiments were performed on a Siemens 7T Magnetom with a 32-channel head coil. Three control cats and three PEA-15 mutant cats were used for the study. For the control cats, the average EF in white and gray matter was 0.9±0.1 and 0.86±0.15, respectively; perfusion in white and gray matter was 85±15 mL/100g/min and 97±20 mL/100g/min, respectively; and PS in white and gray matter was 201±25 mL/100g/min and 225±35 mL/100g/min, respectively. For the PEA-15 mutant cats, the average EF in white and gray matter was 0.81±0.15 and 0.77±0.2, respectively; perfusion in white and gray matter was 140±25 mL/100g/min and 165±18 mL/100g/min, respectively; and PS in white and gray matter was 240±30 mL/100g/min and 259±21 mL/100g/min, respectively. These results show that the BBB is compromised in the PEA-15 mutant cat brain: EF is decreased while perfusion and PS are increased in the mutant cats compared to the control cats. These findings may further explain the function of PEA-15 in neurodevelopment.
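The abstract does not state the permeability model, but the reported EF, perfusion, and PS values are approximately consistent with the standard Renkin-Crone relation PS = -F·ln(1 - EF); a sketch assuming that relation:

```python
import math

def ps_from_extraction(flow, ef):
    """Renkin-Crone relation: permeability-surface-area product (same units
    as `flow`, e.g. mL/100g/min) from perfusion `flow` and water
    extraction fraction `ef` (0 < ef < 1)."""
    return -flow * math.log(1.0 - ef)

def extraction_from_ps(flow, ps):
    # inverse relation: extraction fraction for a given PS and perfusion
    return 1.0 - math.exp(-ps / flow)
```

For example, the control white-matter values above (F = 85 mL/100g/min, EF = 0.9) give PS ≈ 196 mL/100g/min under this relation, close to the reported 201±25 mL/100g/min.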

Keywords: BBB, cat brain, magnetization transfer, PEA-15

Procedia PDF Downloads 122
25903 Consumer Over-Indebtedness in Germany: An Investigation of Key Determinants

Authors: Xiaojing Wang, Ann-Marie Ward, Tony Wall

Abstract:

The problem of over-indebtedness has grown since deregulation of the banking industry in the 1980s and has become a major problem for most countries in Europe, including Germany. Consumer debt issues have attracted the attention not only of academics but also of governments and debt counselling institutions. Overall, this research aims to address the knowledge gap regarding the causes of consumer over-indebtedness in Germany and to develop predictive models for assessing over-indebtedness risk at the consumer level. The situation is serious in Germany: the relatively high level of social welfare support suggests that consumer debt problems are caused by factors other than just over-spending and income volatility. Prior literature suggests that the overall stability of the economy and the level of welfare support from the structural environment contribute to consumers’ debt problems. In terms of cultural influence, the theory of conspicuous consumption in consumer behaviour suggests that consumers spend beyond their means in order to appear similar to consumers in a higher socio-economic class. This results in consumers taking on more debt than they can afford and eventually becoming over-indebted. Studies have also shown that financial literacy is negatively related to consumer over-indebtedness risk. Whilst prior literature has examined structural and cultural influences separately, no study has taken a collective approach. To address this gap, a model is developed to investigate the association between consumer over-indebtedness and proxies for structural and cultural influences based on the above theories. The model also controls for consumer demographic characteristics identified as influential in prior literature, such as gender and age, and for adverse shocks, such as divorce or bereavement in the household.
Benefiting from SOEP regional data, this study is able to conduct quantitative empirical analysis testing both structural and cultural influences at a localised level. Using German Socio-Economic Panel (SOEP) data from 2006 to 2016, this study finds that social benefits, financial literacy, and conspicuous consumption all contribute to becoming over-indebted. Generally speaking, the risk of over-indebtedness is high when consumers live in a low-welfare community, have little awareness of their own financial situation, and consistently over-spend. To tackle the problem of over-indebtedness, countermeasures can be taken, for example, increasing consumers’ financial awareness and the level of welfare support. By analysing the causes of consumer over-indebtedness in Germany, this study also provides new insights into the nature and underlying causes of consumer debt issues in Europe.

Keywords: consumer, debt, financial literacy, socio-economic

Procedia PDF Downloads 195
25902 Block Mining: Block Chain Enabled Process Mining Database

Authors: James Newman

Abstract:

Process mining is an emerging technology that serializes enterprise data into time-series form. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have focused on how best to build process mining on top of standard relational databases. This paper is a first pass at outlining a database custom-built for a minimal viable process mining product. We present Block Miner, a blockchain protocol for storing process mining data across a distributed network, and we demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.

Keywords: blockchain, process mining, memory optimization, protocol

Procedia PDF Downloads 81
25901 Ethnic-Racial Breakdown in Psychological Research among Latinx Populations in the U.S.

Authors: Madeline Phillips, Luis Mendez

Abstract:

The 21st century has seen an increase in the amount and variety of psychological research on Latinx individuals, the largest minority group in the U.S., with great variability from cultural origin (e.g., ethnicity) to region (e.g., nationality). We were interested in exploring how scientists recruit, conduct, and report research on Latinx samples. Ethnicity and race are important components of identity and should be addressed to capture a broader and deeper understanding of psychological research findings. To explore Latinx/Hispanic work, the Journal of Latinx Psychology (JLP) and the Hispanic Journal of Behavioral Sciences (HJBS) were analyzed for (1) measures of ethnicity and race in empirical studies, (2) nationalities represented, and (3) how researchers reported ethnic-racial demographics. The analysis included publications from 2013-2018 and revealed two common themes in the reporting of ethnicity and race: overrepresentation/underrepresentation and overgeneralization. There is currently no systematic way of reporting ethnicity and race in Latinx/Hispanic research, creating a vague sense of what role ethnicity/race plays in the lives of participants. Second, studies used the terms Hispanic and Latinx interchangeably and inconsistently across publications. For the purpose of this project, we were only interested in publications with Latinx samples in the U.S.; therefore, studies conducted outside of the U.S. and non-empirical studies were excluded. JLP went from N = 118 articles to N = 94, and HJBS went from N = 174 to N = 154. For this project, we developed a coding rubric for ethnicity/race that reflected the different ways researchers reported ethnicity and race and was compatible with the U.S. census. We coded which ethnicity/race was identified as the largest ethnic group in each sample, using the ethnic-racial breakdown numbers or percentages if provided.
Some studies simply did not report the ethnic composition beyond Hispanic or Latinx. We found that in 80% of the samples, Mexicans were overrepresented compared to the population statistics of Latinx individuals in the U.S. We examined all the ethnic-racial breakdowns, demonstrating the overrepresentation of Mexican samples and the underrepresentation or lack of representation of certain ethnicities (e.g., Chilean, Guatemalan). Our results showed an overgeneralization in studies that cluster their participants as Latinx/Hispanic: 23 for JLP and 63 for HJBS. The authors discuss the importance of transparency from researchers in reporting the context of the sample, including country, state, neighborhood, and demographic variables relevant to the goals of the project, except when an issue of privacy and/or confidentiality is involved. In addition, the authors discuss the importance of recognizing the variability within the Latinx population and how it is reflected in the scientific discourse.

Keywords: Latinx, Hispanic, race and ethnicity, diversity

Procedia PDF Downloads 102
25900 Aggregation of Electric Vehicles for Emergency Frequency Regulation of Two-Area Interconnected Grid

Authors: S. Agheb, G. Ledwich, G. Walker, Z. Tong

Abstract:

Frequency control has become more of a concern for the reliable operation of interconnected power systems due to the integration of volatile, low-inertia renewable energy sources into the grid. Also, in case of a sudden fault, the system has less time to recover before widespread blackouts occur. Electric vehicles (EVs) have the potential to cooperate in Emergency Frequency Regulation (EFR) through nonlinear control of the power system in case of large disturbances. There is not enough time to communicate with each individual EV in emergency cases, so an aggregate model is necessary for a quick response that prevents large frequency deviations and blackouts. In this work, the EVs in each area are aggregated and modelled as one big virtual battery, with uncertain aspects such as the number of connected EVs and their initial State of Charge (SOC) treated as stochastic variables. A control law was proposed and applied to the aggregate model using a Lyapunov energy function to maximize the rate of reduction of total kinetic energy in a two-area network after the occurrence of a fault. The control methods are primarily based on charging/discharging control of the available EVs as shunt capacity in the distribution system. Three different cases were studied to examine the locational aspect of the model, with the virtual EV either in the center of the two areas or in the corners. The simulation results showed that EVs can help the generators shed their excess kinetic energy in a short time after a contingency. Early estimation of the possible contribution of EVs can help the supervisory control level transmit a prompt control signal to subsystems such as the aggregator agents and the grid. Thus, characterizing the percentage contribution of EVs to EFR will be the future goal of this study.
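The idea of an aggregate EV "virtual battery" damping post-fault frequency swings can be sketched with a toy two-area swing model. The parameter values and the simple proportional feedback below are illustrative assumptions, not the authors' Lyapunov-derived control law:

```python
import numpy as np

# Toy two-area swing model: M dw/dt = -D w + P_tie + P_ev, d(delta)/dt = w.
# The aggregate EV fleet in each area injects power proportional to the
# local frequency deviation (charging when frequency is high, discharging
# when low) -- a crude stand-in for the Lyapunov-based control.
M, D, T = 10.0, 1.0, 2.0      # inertia, damping, tie-line stiffness (assumed)
k_ev = 5.0                    # aggregate EV feedback gain (assumed)
dt, steps = 0.01, 2000        # 20 s of simulated time, forward Euler
w = np.array([0.5, 0.0])      # post-fault frequency deviations (rad/s)
delta = np.zeros(2)           # rotor-angle deviations

for _ in range(steps):
    p_tie = T * (delta[::-1] - delta)   # tie-line power exchange
    p_ev = -k_ev * w                    # EV charge/discharge control
    w = w + dt * (-D * w + p_tie + p_ev) / M
    delta = delta + dt * w

kinetic = 0.5 * M * (w @ w)  # residual kinetic-energy proxy, decays toward zero
```

With the EV gain set to zero the same loop rings for far longer, which is the qualitative point of the aggregate control.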

Keywords: emergency frequency regulation, electric vehicle, EV, aggregation, Lyapunov energy function

Procedia PDF Downloads 89
25899 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings

Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi

Abstract:

Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer and is classified by IARC as a category one carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96–103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden such that 21st Century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.

Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden

Procedia PDF Downloads 74
25898 Vulnerability of Groundwater to Pollution in Akwa Ibom State, Southern Nigeria, using the DRASTIC Model and Geographic Information System (GIS)

Authors: Aniedi A. Udo, Magnus U. Igboekwe, Rasaaq Bello, Francis D. Eyenaka, Michael C. Ohakwere-Eze

Abstract:

Groundwater vulnerability to pollution was assessed in Akwa Ibom State, Southern Nigeria, with the aim of locating areas with high potential for resource contamination, especially due to anthropogenic influence. The electrical resistivity method was utilized to collect the initial field data. Additional inputs, including depth to static water level, drilled-well log data, aquifer recharge data, percentage slope, and soil information, were obtained from secondary sources. The initial field data were interpreted both manually and with computer modeling to provide information on the geoelectric properties of the subsurface. The interpreted results, together with the secondary data, were used to develop the DRASTIC thematic maps. A vulnerability assessment was performed using the DRASTIC model in a GIS environment, and areas with high vulnerability needing immediate attention were clearly mapped out and presented in an aquifer vulnerability map. The model was subjected to validation, and its validity rate was 73% within the study area.
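The DRASTIC index underlying such maps is a fixed-weight sum of seven rated hydrogeologic parameters. A minimal sketch of the standard computation (the ratings below are hypothetical, not values from this study):

```python
# Standard DRASTIC weights for the seven parameters: Depth to water,
# net Recharge, Aquifer media, Soil media, Topography, Impact of the
# vadose zone, and hydraulic Conductivity.
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """Weighted sum of per-parameter ratings (each typically 1-10);
    a higher index means higher aquifer vulnerability to pollution."""
    return sum(WEIGHTS[p] * r for p, r in ratings.items())

# Hypothetical ratings for one grid cell of the vulnerability map
cell = {"D": 7, "R": 6, "A": 8, "S": 5, "T": 10, "I": 8, "C": 4}
print(drastic_index(cell))  # 35 + 24 + 24 + 10 + 10 + 40 + 12 = 155
```

In a GIS workflow this sum is evaluated per raster cell, and the resulting index surface is classified into the vulnerability zones shown on the final map.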

Keywords: groundwater, vulnerability, DRASTIC model, pollution

Procedia PDF Downloads 196
25897 A Review Paper on Data Security in Precision Agriculture Using Internet of Things

Authors: Tonderai Muchenje, Xolani Mkhwanazi

Abstract:

Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important to building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, its security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used on the traditional internet may not be suitable for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.

Keywords: precision agriculture, security, IoT, EIDE

Procedia PDF Downloads 78
25896 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model

Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou

Abstract:

The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data and ferocious competition requiring more accurate pricing. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) on a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to apply pricing to several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the larger split of the data to determine model parameters; the remaining data were used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model. For long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We will discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we will discuss the regulations in place in the insurance industry. Finally, we will discuss the maintenance of the model, the fact that new data do not arrive constantly, and that some metrics can take a long time to become meaningful.
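The Gini coefficient used for model evaluation is usually computed in its normalized form: policies are ranked by predicted loss, and the ranking's cumulative capture of actual losses is compared against a random ordering. A sketch under that assumption (not necessarily the authors' exact implementation):

```python
import numpy as np

def gini(actual, pred):
    """Unnormalized Gini: area between the cumulative-capture curve
    (observations sorted by descending prediction) and the diagonal."""
    order = np.argsort(-np.asarray(pred, dtype=float))
    a = np.asarray(actual, dtype=float)[order]
    cum = np.cumsum(a) / a.sum()
    n = len(a)
    return cum.sum() / n - (n + 1) / (2 * n)

def normalized_gini(actual, pred):
    """Model Gini divided by the Gini of a perfect ranking;
    1.0 is a perfect ordering, 0 is no better than random."""
    return gini(actual, pred) / gini(actual, actual)

# Hypothetical losses and model predictions for four policies
losses = np.array([0.0, 0.0, 10.0, 5.0])
preds = np.array([0.1, 0.2, 9.0, 6.0])
print(round(normalized_gini(losses, preds), 3))
```

Because the Gini depends only on the ordering of predictions, it complements magnitude-sensitive monitoring metrics such as RMSE and MAE.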

Keywords: insurance, data science, modeling, monitoring, regulation, processes

Procedia PDF Downloads 67
25895 Moving from Computer Assisted Learning Language to Mobile Assisted Learning Language Edutainment: A Trend for Teaching and Learning

Authors: Ahmad Almohana

Abstract:

Technology has led to rapid changes in the world, and most importantly in education, particularly in the 21st century. Technology has enhanced teachers’ potential and has resulted in greater interaction and choice for learners. In addition, technology is helping to improve individuals’ learning experiences and build their capacity to read, listen, speak, search, analyse, memorise, and encode languages, as well as bringing learners together and creating a sense of greater involvement. This paper is organised as follows: the first section reviews the literature on the implementation of CALL (computer-assisted language learning), explaining CALL and its phases and attempting to highlight and analyse Warschauer’s article. The second section describes the move from CALL to mobilised systems of edutainment, which challenge existing forms of teaching and learning; it also addresses the role of the teacher and the curriculum content, and how these are affected by the computerisation of learning that is taking place. Finally, an empirical study was conducted to collect data from teachers in Saudi Arabia using quantitative and qualitative methods. Connections are made between the area of study and the personal experience of the researcher, with a methodological reflection on the challenges faced by teachers of this same system. The major finding was that, despite the circumstances in which students and lecturers are currently working, the participants revealed themselves to be highly intelligent and articulate individuals who were constrained from revealing this criticality and creativity by the system of learning and teaching operant in most schools.

Keywords: CALL, computer assisted learning language, EFL, English as a foreign language, ELT, English language teaching, ETL, enhanced technology learning, MALL, mobile assisted learning language

Procedia PDF Downloads 156
25894 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)

Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean

Abstract:

The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used as an indicator of the evaporation of lakes and reservoirs around the world due to the ease of interpreting its data. In this research, intelligent models were investigated for estimating pan evaporation on a daily basis. Shahroud and Mayamey, two cities located in Semnan province, Iran, were considered as study sites. Both cities have dry weather conditions with high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models used in this study are the Artificial Neural Network (ANN), Least Squares Support Vector Machine (LSSVM), and M5 tree models. The meteorological parameters of minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA), and relative humidity (RH) were selected as input data, and pan evaporation (EP) was taken as the output. 70% of the data was used for training and 30% for testing. The models were evaluated with the coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE). The results for the Shahroud and Mayamey stations showed that all three models performed reasonably well.
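The three evaluation metrics named above, together with the 70/30 split, are straightforward to set up. A minimal sketch with synthetic data standing in for the station records (which are not reproduced here):

```python
import numpy as np

def r2(y, yhat):
    """Coefficient of determination."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

def rmse(y, yhat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((np.asarray(y) - np.asarray(yhat)) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y) - np.asarray(yhat))))

# 70/30 chronological split of a synthetic daily series: six meteorological
# inputs (Tmin, Tmax, WS, SH, PA, RH) and the pan-evaporation target (EP)
rng = np.random.default_rng(0)
X, y = rng.random((365, 6)), rng.random(365)
n_train = int(0.7 * len(y))
X_train, X_test = X[:n_train], X[n_train:]
y_train, y_test = y[:n_train], y[n_train:]
```

Any of the three model families (ANN, LSSVM, M5) would then be fitted on the training split and scored on the held-out 30% with the same three functions.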

Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey

Procedia PDF Downloads 63
25893 Predicting Ecological Impacts of Sea-Level Change on Coastal Conservation Areas in India

Authors: Mohammad Zafar-ul Islam, Shaily Menon, Xingong Li, A. Townsend Peterson

Abstract:

In addition to the mounting empirical data on the direct implications of climate change for natural and human systems, evidence is increasing for other, indirect climate change phenomena such as sea-level rise. Rising sea levels and the associated marine intrusion into terrestrial environments are predicted to be among the most serious eventual consequences of climate change. The many complex and interacting factors affecting sea levels create considerable uncertainty in sea-level rise projections: conservative estimates are on the order of 0.5–1.0 m globally, while other estimates are much higher, approaching 6 m. Marine intrusion associated with 1–6 m of sea-level rise will severely impact species and habitats in coastal ecosystems. Examining the areas most vulnerable to such impacts may allow the design of appropriate adaptation and mitigation strategies. We present an overview of the potential effects of 1 m and 6 m of sea-level rise on coastal conservation areas in the Indian Subcontinent. In particular, we examine the projected magnitude of areal losses in relevant biogeographic zones, ecoregions, protected areas (PAs), and Important Bird Areas (IBAs). In addition, we provide a more detailed and quantitative analysis of the likely effects of marine intrusion on 22 coastal PAs and IBAs that provide critical habitat for birds in the form of breeding areas, migratory stopover sites, and overwintering habitats. Several coastal PAs and IBAs are predicted to experience losses of more than 50% to marine intrusion. We explore the consequences of such inundation levels on the species and habitat in these areas.

Keywords: sea-level change, coastal inundation, marine intrusion, biogeographic zones, ecoregions, protected areas, important bird areas, adaptation, mitigation

Procedia PDF Downloads 241