Search results for: 2d and 3d data conversion
23251 Systematic NIR of Internal Disorder and Quality Detection of Apple Fruit
Authors: Eid Alharbi, Yaser Miaji, Saeed Alzahrani
Abstract:
The importance of fruit quality and freshness is well recognized in today's life. Recent studies show that an automatic online sorting system for fresh apple fruit, based on internal disorder, has been developed using near infrared (NIR) spectroscopic technology. An automatic conveyor belt system along with a sorting mechanism was constructed. To check the internal quality of the apple fruit, apples were exposed to NIR radiation in the range 650-1300 nm and the data were collected in the form of absorption spectra. The collected data were compared to a reference (data of a known sample) and analyzed, and an electronic signal was passed to the sorting system. The sorting system separated the apple fruit samples according to the electronic signal passed to it. It was found that absorption of NIR radiation in the range 930-950 nm was higher in internally defective samples than in healthy samples. On the basis of this high absorption of NIR radiation in the 930-950 nm region, the online sorting system was constructed.
Keywords: mechatronics design, NIR, fruit quality, spectroscopic technology
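As an illustration of the 930-950 nm decision rule described above, here is a minimal classification sketch in Python; the wavelength band comes from the abstract, while the spectra, array names, and threshold value are placeholder assumptions:

```python
import numpy as np

# Hypothetical absorbance spectra: rows are apple samples,
# columns are wavelengths on a 650-1300 nm grid.
wavelengths = np.arange(650, 1301, 2)           # nm
spectra = np.random.rand(10, wavelengths.size)  # placeholder data

# Mean absorbance in the 930-950 nm band, where internally
# defective apples absorb more NIR than healthy ones.
band = (wavelengths >= 930) & (wavelengths <= 950)
band_absorbance = spectra[:, band].mean(axis=1)

# Illustrative threshold calibrated from reference (known) samples.
THRESHOLD = 0.6
is_defective = band_absorbance > THRESHOLD  # boolean sort signal per apple
print(is_defective)
```

In the described system this boolean signal would be what drives the conveyor-belt sorting mechanism.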
Procedia PDF Downloads 500
23250 The Accuracy of Parkinson's Disease Diagnosis Using [123I]-FP-CIT Brain SPECT Data with Machine Learning Techniques: A Survey
Authors: Lavanya Madhuri Bollipo, K. V. Kadambari
Abstract:
Objective: To discuss key issues in the diagnosis of Parkinson's disease (PD), features influencing PD progression, the importance of brain SPECT data in PD diagnosis, and the essentiality of machine learning techniques in the early diagnosis of PD. An accurate and early diagnosis of PD remains a challenge, as clinical symptoms in PD arise only when there is more than 60% loss of dopaminergic neurons. So far there are no laboratory tests for the diagnosis of PD, causing a high rate of misdiagnosis, especially when the disease is in its early stages. Recent neuroimaging studies with brain SPECT using 123I-Ioflupane (DaTSCAN) as a radiotracer have shown it to be widely used to assist the diagnosis of PD even in its early stages. Machine learning techniques can be used in combination with image analysis procedures to develop computer-aided diagnosis (CAD) systems for PD. This paper addresses recent studies involving the diagnosis of PD in its early stages using brain SPECT data with machine learning techniques.
Keywords: Parkinson disease (PD), dopamine transporter, single-photon emission computed tomography (SPECT), support vector machine (SVM)
Procedia PDF Downloads 402
23249 Effect of Synthetic L-Lysine and DL-Methionine Amino Acids on Performance of Broiler Chickens
Authors: S. M. Ali, S. I. Mohamed
Abstract:
Reduction of feed cost for broiler production is of utmost importance in decreasing the cost of production. The objectives of this study were to evaluate the use of synthetic amino acids (L-lysine and DL-methionine) instead of super concentrate, and groundnut cake versus meat powder as protein sources. A total of 180 male broiler chicks (Cobb strain) at 15 days of age (DOA) were selected according to their average body weight (380 g) from a broiler chick flock at Elbashair Farm. The chicks were randomly divided into six groups of 30 chicks. Each group was further subdivided into three replicates of 10 birds. Six experimental diets were formulated. The first diet contained groundnut cake and super concentrate as the control (GNC + C); in the second diet, meat powder and super concentrate (MP + C) were used. The third diet contained groundnut cake and amino acids (GNC + AA); the fourth diet contained meat powder and amino acids (MP + AA). The fifth diet contained groundnut cake, meat powder and super concentrate (GNC + MP + C), and the sixth diet contained groundnut cake, meat powder and amino acids (GNC + MP + AA). The formulated rations were randomly assigned to the different subgroups in a completely randomized design of six treatments and three replicates. Weekly feed intake, body weight and mortality were recorded, and body weight gain and feed conversion ratio (FCR) were calculated. At the end of the experiment (49 DOA), nine birds from each treatment were slaughtered. Live body weight, carcass weight, head, shank, and some internal organ (gizzard, heart, liver, small intestine, and abdominal fat pad) weights were taken. For the overall experimental period, the (GNC + MP + C) group consumed significantly (P≤0.01) the most cumulative feed, while the (MP + AA) group consumed the least. The (GNC + C) and (GNC + AA) groups had the heaviest live body weight, while (MP + AA) had the lowest. The overall FCR was significantly (P≤0.01) the best for the (GNC + AA) group, while the (MP + AA) group showed the worst FCR. The (GNC + AA) group also had significantly (P≤0.01) the lowest abdominal fat pad weight. The (GNC + MP + C) group had the highest dressing % while the (MP + AA) group had the lowest. It is concluded that amino acids can be used instead of super concentrate in broiler feeding with good performance and lower cost, and that meat powder is not advisable to be used with amino acids.
Keywords: broiler chickens, DL-lysine, methionine, performance
Procedia PDF Downloads 271
23248 Experimental and Theoretical Mass Transfer Studies of Pure Carbon Dioxide Absorption in Sodium Hydroxide in Millichannels
Authors: A. Durgadevi, S. Pushpavanam
Abstract:
For the past several decades, CO2 levels have been dramatically increasing in the atmosphere due to man-made emissions such as those from fossil fuel-fired power plants. With the increase in CO2 emissions, the CO2 concentration in the atmosphere has increased, resulting in global warming. This shows the need to study different ways to capture the emitted CO2 directly from the exhausts of power plants or from the atmosphere. There are several ways to remove CO2, such as absorption into a liquid solvent, adsorption onto a solid, cryogenic separation, permeation through membranes and photochemical conversion. In most industries, the absorption of CO2 in chemical solvents (in absorption towers) is used for CO2 capture. In these towers, mass transfer along with chemical reactions takes place between the gas and liquid phases. This helps in the separation of CO2 from other gases. It is important to understand these processes in detail. The relevant flow patterns are difficult to maintain in large-scale industrial absorbers, so to obtain accurate information, controlled gas-liquid absorption experiments are carried out in milli-channels in this work under a controlled atmosphere. The absorption experiments of CO2 in varying concentrations of sodium hydroxide solution are carried out in T-junction glass milli-channels with a circular cross section (inner diameter of 2 mm). The gas and liquid flow rates are controlled by a mass flow controller (MFC) and a Harvard syringe pump, respectively. The slug flow in the channel is recorded using a camera and the videos are analysed. The gas slug of pure CO2 is found to decrease in size along the length of the channel due to absorption of the gas into the liquid. This is also captured by the model developed, and the mass transfer characteristics are studied. The pressure drop across the channel is determined as the sum of the pressure drops from the gas slugs and the liquid plugs. A dimensionless correlation for the mass transfer coefficient is developed in terms of the Sherwood number and compared with existing correlations in the literature. They are found to be in close agreement with each other. In this case, due to the presence of the chemical reaction, an enhancement of mass transfer is obtained. This is quantified with the help of an enhancement factor.
Keywords: absorption, enhancement factor, mass transfer coefficient, Sherwood number
Procedia PDF Downloads 178
23247 Secure Network Coding against Content Pollution Attacks in Named Data Network
Authors: Tao Feng, Xiaomei Ma, Xian Guo, Jing Wang
Abstract:
Named Data Networking (NDN) is one of the future Internet architectures, in which all nodes (i.e., hosts, routers) are allowed to have a local cache used to satisfy incoming requests for content. However, reliance on caching allows an adversary to perform attacks that are very effective and relatively easy to implement, such as the content pollution attack. In this paper, we use a method of secure network coding based on a homomorphic signature system to solve this problem. Firstly, we use a dynamic public key technique, so that our scheme authenticates each generation without updating the initial secret key used. Secondly, employing the homomorphism of the hash function, intermediate nodes and the destination node verify the signature of the received message. In addition, when the network topology of NDN is simple and fixed, the code coefficients in our scheme are generated by a pseudorandom number generator in each node, so distribution of the coefficients is also avoided. In short, our scheme not only efficiently prevents intra/inter-generation pollution attacks (GPAs), but also defends against the content poisoning attack in NDN.
Keywords: named data networking, content pollution attack, network coding signature, internet architecture
Procedia PDF Downloads 339
23246 Investigating Seasonal Changes of Urban Land Cover with High Spatio-Temporal Resolution Satellite Data via Image Fusion
Authors: Hantian Wu, Bo Huang, Yuan Zeng
Abstract:
Divisions between wealthy and poor, private and public landscapes are propagated by the increasing economic inequality of cities. While these are the spatial reflections of larger social issues and problems, urban design can at least employ spatial techniques that promote more inclusive rather than exclusive, overlapping rather than segregated, interlinked rather than disconnected landscapes. Indeed, the type of edge or border between urban landscapes plays a critical role in the way the environment is perceived. China is experiencing rapid urbanization, which poses unpredictable environmental challenges. The urban green cover and water bodies are undergoing changes, which are highly relevant to resident wealth and happiness. However, very limited knowledge and data on their rapid changes are available. High-resolution remote sensing data have been widely applied to urban management in China, and a 10-meter-resolution urban land use map for the whole of China in 2018 has been published. However, such research focuses on large-scale, high-resolution land use and not specifically on the seasonal change of urban covers. High-resolution remote sensing satellites have a long revisit cycle (e.g., Landsat 8 requires 16 days for the same location), which cannot satisfy the requirement of monitoring urban-landscape changes. On the other hand, aerial remote sensing or unmanned aerial vehicle (UAV) sensing is limited by aviation regulations and cost, and is hardly applied widely in mega-cities. Moreover, these data are limited by climate and weather conditions (e.g., cloud, fog), so capturing spatial and temporal dynamics is always a challenge for the remote sensing community. In particular, during the rainy season, no data are available even from Sentinel satellites with a 5-day revisit interval. Many natural events and/or human activities drive the changes of urban covers. In this case, enhancing the monitoring of the urban landscape with high-frequency methods, evaluating and estimating the impacts of urban landscape changes, and understanding the driving forces and mechanisms of those changes can be a significant contribution to urban planning and study. This project aims to use high spatiotemporal fusion of remote sensing data to create short-cycle, high-resolution remote sensing data sets for exploring high-frequency urban cover changes. This research will enhance the applicability of high spatiotemporal fusion of remote sensing data for long-term monitoring of the urban landscape, optimizing the management of landscape borders and promoting inclusiveness of the urban landscape for all communities.
Keywords: urban land cover changes, remote sensing, high spatiotemporal fusion, urban management
Procedia PDF Downloads 131
23245 Statistical Time-Series and Neural Architecture of Malaria Patients Records in Lagos, Nigeria
Authors: Akinbo Razak Yinka, Adesanya Kehinde Kazeem, Oladokun Oluwagbenga Peter
Abstract:
Time series data are sequences of observations collected over a period of time. Such data can be used to predict health outcomes, such as disease progression, mortality, hospitalization, etc. The statistical approach is based on mathematical models that capture the patterns and trends of the data, such as autocorrelation, seasonality, and noise, while neural methods are based on artificial neural networks, which are computational models that mimic the structure and function of biological neurons. This paper compared both parametric and non-parametric time series models of patients treated for malaria in Maternal and Child Health Centres in Lagos State, Nigeria. The forecasting methods considered were linear regression, integrated moving average, ARIMA and SARIMA modelling for the parametric approach, while a Multilayer Perceptron (MLP) and a Long Short-Term Memory (LSTM) network were used for the non-parametric models. The performance of each method was evaluated using the Mean Absolute Error (MAE), R-squared (R2) and Root Mean Square Error (RMSE) as criteria to determine the accuracy of each model. The study revealed that the best performance in terms of error was found in the MLP, followed by the LSTM and ARIMA models. In addition, the bootstrap aggregating technique was used to make robust forecasts when there are uncertainties in the data.
Keywords: ARIMA, bootstrap aggregation, MLP, LSTM, SARIMA, time-series analysis
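A minimal sketch of the parametric half of such a comparison, assuming a univariate monthly count series; statsmodels' SARIMAX stands in for whatever implementation the authors used, and the model order and data are illustrative:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Hypothetical monthly malaria case counts; replace with the real series.
y = np.random.poisson(lam=200, size=72).astype(float)
train, test = y[:60], y[60:]

# Illustrative SARIMA(1,1,1)(1,1,1,12) fit; order selection is data-driven.
model = SARIMAX(train, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
forecast = model.forecast(steps=len(test))

# The same criteria used in the abstract: MAE, RMSE and R-squared.
print("MAE :", mean_absolute_error(test, forecast))
print("RMSE:", np.sqrt(mean_squared_error(test, forecast)))
print("R2  :", r2_score(test, forecast))
```

The MLP and LSTM half of the comparison would be scored on the same held-out window with the same three criteria.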
Procedia PDF Downloads 83
23244 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data
Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer
Abstract:
This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed, where BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data on the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
Keywords: non-stationary, BINARMA(1,1) model, Poisson innovations, conditional maximum likelihood, CML
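A minimal simulation sketch of the building blocks named above: the binomial thinning operator and correlated Poisson innovations (here via a common-shock construction). The recursion is a generic BINARMA(1,1)-type specification for illustration; the paper's exact parameterization may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

def thin(x, prob):
    """Binomial thinning operator: prob ∘ x = Binomial(x, prob)."""
    return rng.binomial(x, prob)

# Correlated Poisson innovations via a common shock:
# R1 = Z0 + Z1, R2 = Z0 + Z2 with independent Poisson Z's.
def innovations(lam_common, lam1, lam2, n):
    z0 = rng.poisson(lam_common, n)
    return z0 + rng.poisson(lam1, n), z0 + rng.poisson(lam2, n)

# Simulate a BINARMA(1,1)-type recursion for each series:
# X_t = a ∘ X_{t-1} + R_t + b ∘ R_{t-1}
n, a, b = 500, (0.4, 0.3), (0.2, 0.25)
r1, r2 = innovations(1.0, 2.0, 1.5, n)
x = np.zeros((n, 2), dtype=int)
for t in range(1, n):
    x[t, 0] = thin(x[t - 1, 0], a[0]) + r1[t] + thin(r1[t - 1], b[0])
    x[t, 1] = thin(x[t - 1, 1], a[1]) + r2[t] + thin(r2[t - 1], b[1])

# Cross-correlation induced by the innovation terms only, as assumed.
print("sample cross-correlation:", np.corrcoef(x[:, 0], x[:, 1])[0, 1])
```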
Procedia PDF Downloads 131
23243 Usage of Crude Glycerol for Biological Hydrogen Production, Experiments and Analysis
Authors: Ilze Dimanta, Zane Rutkovska, Vizma Nikolajeva, Janis Kleperis, Indrikis Muiznieks
Abstract:
The majority of the world's steadily increasing energy consumption is provided by non-renewable fossil resources. The need to find an alternative energy resource is essential for further socio-economic development. Hydrogen is a renewable, clean energy carrier with high energy density (142 MJ/kg, compared with 42 MJ/kg for oil). Biological hydrogen production is an alternative way to produce hydrogen from renewable resources, e.g., using fermentation of organic waste materials, which facilitates recycling of sewage and is environmentally benign. Hydrogen gas is produced during the fermentation process of bacteria in anaerobic conditions. Bacteria produce hydrogen in the liquid phase, and when thermodynamic equilibrium is reached, hydrogen diffuses from the liquid to the gaseous phase. Because of the large quantities of available crude glycerol and the highly reduced nature of the carbon in glycerol per se, its microbial conversion seems to be an economically and environmentally viable possibility. An industrial organic waste product such as crude glycerol is thus a promising feedstock for hydrogen-producing bacteria. The process of biodiesel production results in 41% (w/w) of crude glycerol. The developed lab-scale test system (experimental bioreactor) with a hydrogen micro-electrode (Unisense, Denmark) was used to determine the hydrogen production yield and rate in the liquid phase. For hydrogen analysis in the gas phase, the RGAPro-100 mass spectrometer connected to the experimental test system was used. Fermentative bacteria strains were tested for hydrogen gas production rates. The presence of hydrogen in the gaseous phase was measured using the mass spectrometer, but the registered concentrations were comparatively small. To decrease the hydrogen partial pressure in the liquid phase, a reactor with a system for continuous bubbling with inert gas was developed. The H2 production rate for the best producer reached 0.40 mmol H2/l in the liquid phase and 1.32 mmol H2/l in the gaseous phase. The hydrogen production rate is time dependent: it is higher at the beginning of the fermentation process, when the concentration increases, but decreases after three hours of fermentation.
Keywords: bio-hydrogen, fermentation, experimental bioreactor, crude glycerol
Procedia PDF Downloads 525
23242 Library on the Cloud: Universalizing Libraries Based on Virtual Space
Authors: S. Vanaja, P. Panneerselvam, S. Santhanakarthikeyan
Abstract:
Cloud computing is a recent trend in libraries. By adopting cloud services, librarians can adapt to present-day information handling and satisfy the needs of the knowledge society. Libraries are now on the path to universalizing all their information for users, and they look towards clouds, which give the easiest access to data and applications. Cloud computing is a highly scalable platform promising quick access to hardware and software over the internet, in addition to easy management and access by non-expert users. In this paper, we discuss the cloud's features and its potential applications in libraries and information centers; how cloud computing actually works and how it can be implemented are also illustrated in this communication. The paper discusses the needs that motivate a move to the cloud and the process of migration to the cloud. In addition, it assesses the practical problems encountered during migration in libraries, the advantages of the migration process, and the measures that libraries should follow during migration to the cloud. The paper highlights the benefits, as well as some concerns regarding data ownership and data security, of cloud computing.
Keywords: cloud computing, cloud-service, cloud based-ILS, cloud-providers, discovery service, IaaS, PaaS, SaaS, virtualization, Web scale access
Procedia PDF Downloads 666
23241 Deliberation of Daily Evapotranspiration and Evaporative Fraction Based on Remote Sensing Data
Authors: J. Bahrawi, M. Elhag
Abstract:
Estimation of evapotranspiration is always a major component in water resources management. Traditional techniques for calculating daily evapotranspiration based on field measurements are valid only at local scales. Earth observation satellite sensors are thus used to overcome the difficulties in obtaining daily evapotranspiration measurements at regional scale. The Surface Energy Balance System (SEBS) model was adopted to estimate daily evapotranspiration and relative evaporation, along with other land surface energy fluxes. The model requires agro-climatic data that improve the model outputs. Advanced Along-Track Scanning Radiometer (AATSR) and Medium Resolution Imaging Spectrometer (MERIS) imagery were used to estimate the daily evapotranspiration and relative evaporation over the entire Nile Delta region in Egypt, supported by meteorological data collected from six different weather stations located within the study area. Daily evapotranspiration maps derived from the SEBS model show a strong agreement with ground-truth data taken from 92 points uniformly distributed over the study area. Moreover, daily evapotranspiration and relative evaporation are strongly correlated. The reliable estimation of daily evapotranspiration helps decision makers review the current land use practices in terms of water management, while enabling them to propose proper land use changes.
Keywords: daily evapotranspiration, relative evaporation, SEBS, AATSR, MERIS, Nile Delta
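For reference, the daily up-scaling in SEBS rests on the evaporative fraction, commonly assumed constant over the day; the relations below are the generic SEBS-style formulation under that assumption, not equations reproduced from this paper:

```latex
\Lambda = \frac{\lambda E}{R_n - G_0},
\qquad
E_{\mathrm{daily}} = \Lambda \cdot
\frac{8.64\times 10^{7}\,\left(\overline{R_n} - \overline{G_0}\right)}
     {\lambda\,\rho_w}
\quad \left[\mathrm{mm\,day^{-1}}\right]
```

Here λE is the instantaneous latent heat flux, R_n the net radiation, G_0 the soil heat flux, λ the latent heat of vaporization, and ρ_w the density of water; overbars denote daily means.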
Procedia PDF Downloads 263
23240 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography
Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway
Abstract:
This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Crops at certain predefined secret coordinates are extracted from the cover image. The secret text message is divided into sections, the number of which equals the number of image crops. Each section of the secret text message is embedded into an image crop in a secret sequence using the LSB technique. The embedding is done using the cover image color channels. The stego image is given by reassembling the image and the stego crops. The results of the technique are compared to other state-of-the-art techniques. Evaluation is based on visual inspection to detect any degradation of the stego image, the difficulty of extracting the embedded data by any unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results ensure that the proposed technique is more secure than the other traditional techniques.
Keywords: steganography, stego, LSB, crop
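A minimal sketch of the LSB embed/extract step on a single grayscale crop; the full scheme splits the message across several crops at secret coordinates and uses the color channels, which is omitted here for brevity:

```python
import numpy as np

def embed_lsb(crop: np.ndarray, message: bytes) -> np.ndarray:
    """Embed message bits into the least significant bits of a crop."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = crop.flatten()
    if bits.size > flat.size:
        raise ValueError("crop too small for this message section")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(crop.shape)

def extract_lsb(crop: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes from the least significant bits of a crop."""
    bits = crop.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

crop = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in crop
stego = embed_lsb(crop, b"secret section")
print(extract_lsb(stego, len(b"secret section")))  # b'secret section'
```

Because only the lowest bit of each pixel changes, the visual degradation is negligible, which is what the PSNR evaluation quantifies.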
Procedia PDF Downloads 271
23239 A Usability Framework to Influence the Intention to Use Mobile Fitness Applications in South Africa
Authors: Bulelani Ngamntwini, Liezel Cilliers
Abstract:
South Africa has one of the highest prevalences of obesity on the African continent: forty-six percent of adults in South Africa are physically inactive. Fitness applications can be used to increase physical activity. However, the uptake of mobile fitness applications in South Africa has been found to be poor due to usability challenges with the technology. This study developed a usability framework to influence the intention to use mobile fitness applications in South Africa. The study made use of a positivistic approach to collect data. A questionnaire was used to collect quantitative data from 377 respondents who had used mobile fitness applications in the past; a response rate of 80.90% was recorded. To analyse the data, Pearson correlation was used to test the relationships posited by the various hypotheses. Four usability factors (efficiency, effectiveness, satisfaction, and learnability) contribute to users' intention to use mobile fitness applications. The study therefore recommends that, for a mobile fitness application to be successful, these four factors be considered and incorporated by developers when designing the applications.
Keywords: obese, overweight, physical inactivity, mobile fitness application, usability factors
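A minimal sketch of the correlation tests described, assuming aggregated questionnaire scores per respondent; all scores here are synthetic placeholders:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 377  # respondents, as in the study

# Hypothetical aggregated questionnaire scores per respondent.
intention = rng.normal(4.0, 0.6, n)
factors = {
    "efficiency":    intention + rng.normal(0, 0.5, n),
    "effectiveness": intention + rng.normal(0, 0.6, n),
    "satisfaction":  intention + rng.normal(0, 0.7, n),
    "learnability":  intention + rng.normal(0, 0.8, n),
}

# One Pearson correlation per hypothesised usability factor.
for name, scores in factors.items():
    r, p = pearsonr(scores, intention)
    print(f"{name:13s} r={r:+.3f} p={p:.4f}")
```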
Procedia PDF Downloads 166
23238 Non-Signaling Chemokine Receptor CCRL1 and Its Active Counterpart CCR7 in Prostate Cancer
Authors: Yiding Qu, Svetlana V. Komarova
Abstract:
Chemokines acting through their cognate chemokine receptors guide the directional migration of cells along the chemokine gradient. Several chemokine receptors were recently identified as non-signaling (decoy), based on their ability to bind the chemokine but produce no measurable signal in the cell. The function of these decoy receptors is not well understood. We examined the expression of a decoy receptor, CCRL1, and a signaling receptor that binds the same ligands, CCR7, in prostate cancer, using publicly available microarray data (www.oncomine.org). The expression of both CCRL1 and CCR7 increased in approximately half of prostate carcinoma samples and in the majority of metastatic cancer samples compared to normal prostate. Moreover, the expression of CCRL1 positively correlated with the expression of CCR7. These data suggest that CCR7 and CCRL1 can be used as clinical markers for the early detection of transformation from carcinoma to metastatic cancer. In addition, these data support our hypothesis that the non-signaling chemokine receptors actively stimulate cell migration.
Keywords: bioinformatics, cell migration, decoy receptor, meta-analysis, prostate cancer
Procedia PDF Downloads 475
23237 Developing NAND Flash-Memory SSD-Based File System Design
Authors: Jaechun No
Abstract:
This paper focuses on I/O optimizations of N-hybrid (New-Form of hybrid), which provides a hybrid file system space constructed on SSD and HDD. Although the promising potential of SSDs, such as the absence of mechanical moving overhead and high random I/O throughput, has drawn a lot of attention from IT enterprises, their high cost/capacity ratio makes it less desirable to build a large-scale data storage subsystem composed only of SSDs. In this paper, we present N-hybrid, which attempts to integrate the strengths of SSD and HDD to offer a single, large hybrid file system space. Several experiments were conducted to verify the performance of N-hybrid.
Keywords: SSD, data section, I/O optimizations, hybrid system
Procedia PDF Downloads 422
23236 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus
Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo
Abstract:
The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to data from sensors about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of buildings by using the collected data to help monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the GAM (Generalised Additive Model) for anomaly detection in Air Handling Unit (AHU) power consumption patterns. There is ample research on the use of GAMs for the prediction of power consumption at the office-building and nation-wide levels. However, there is limited illustration of their anomaly detection capabilities, prescriptive analytics case studies, and their integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical AHU power consumption and cooling load data of a building, collected between Jan 2018 and Aug 2019 from an education campus in Singapore, to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM model by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to inform and identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real time to help determine the next course of action for the facilities manager. The performance of the GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns, illustrated with real-world use cases.
Keywords: anomaly detection, digital twin, generalised additive model, GAM, power consumption, supervised learning
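A minimal sketch of GAM-based anomaly flagging in the spirit described above, assuming the pygam library; the features (cooling load and hour of day), the synthetic data, and the 95% interval width are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(42)

# Hypothetical hourly records: cooling load (kW) and hour of day
# as predictors of AHU power consumption.
n = 2000
cooling_load = rng.uniform(50, 300, n)
hour = rng.integers(0, 24, n)
power = 0.4 * cooling_load + 5 * np.sin(hour / 24 * 2 * np.pi) + rng.normal(0, 4, n)

X = np.column_stack([cooling_load, hour])
gam = LinearGAM(s(0) + s(1)).fit(X, power)

# Flag observations outside the 95% prediction interval as anomalies,
# mirroring the forward-predicted-range rule in the abstract.
lower, upper = gam.prediction_intervals(X, width=0.95).T
anomalous = (power < lower) | (power > upper)
print("flagged:", anomalous.sum(), "of", n)
```

In production, the flagged points would feed the rule-based conditions that decide the facilities manager's next action.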
Procedia PDF Downloads 159
23235 Sentiment Analysis: An Enhancement of Ontological-Based Features Extraction Techniques and Word Equations
Authors: Mohd Ridzwan Yaakub, Muhammad Iqbal Abu Latiffi
Abstract:
Online business has become popular recently due to the massive amount of information and media available on the Internet. This has resulted in a huge number of reviews in which consumers share their opinions, criticisms, and satisfaction regarding the products they have purchased on websites or social media such as Facebook and Twitter. Analyzing customer behavior has thus become very important for organizations seeking new market trends and insights. The reviews from websites or social media consist of structured and unstructured data that require a sentiment analysis approach. In this article, the techniques used in sentiment analysis are outlined, and the ontology and its possible usage in sentiment analysis are described. This leads to empirical research related to mobile phone reviews and the ontology used in the experiment. The researchers also explore the role of data preprocessing and feature selection methodology. As a result, an ontology-based approach to sentiment analysis can help achieve high accuracy in the classification task.
Keywords: feature selection, ontology, opinion, preprocessing data, sentiment analysis
Procedia PDF Downloads 201
23234 Construction of the Large Scale Biological Networks from Microarrays
Authors: Fadhl Alakwaa
Abstract:
One of the enduring goals of systems biology is understanding gene-gene interactions. Hence, gene regulatory networks (GRNs) need to be constructed for understanding disease ontology and reducing the cost of drug development. To construct a gene regulatory network from gene expression data, we need to overcome many challenges, such as data denoising and dimensionality. In this paper, we develop an integrated system to reduce the data dimension and remove the noise. The network generated by our system was validated against available interaction databases and was compared to previous methods. The results revealed the performance of our proposed method.
Keywords: gene regulatory network, biclustering, denoising, system biology
Procedia PDF Downloads 242
23233 Assessment of Soil Salinity through Remote Sensing Technique in the Coastal Region of Bangladesh
Abstract:
Soil salinity is a major problem for the coastal region of Bangladesh and has been increasing for the last four decades. Determination of soil salinity is essential for proper land use planning for agricultural crop production. The aim of the research is to estimate and monitor soil salinity in the study area. Remote sensing can be an effective tool for detecting soil salinity in data-scarce conditions. In this research, Landsat 8 imagery is used, after the required atmospheric and radiometric corrections, and nine soil salinity indices are applied to develop a soil salinity map. Ground soil salinity data, i.e., EC values, are collected as a printed map, which is then scanned and digitized to develop a point shapefile. Linear regression is performed between the satellite-derived salinity map and the ground soil salinity data (EC values). The results show that the maximum R² value, 0.022, is found for salinity index SI7 = G*R/B. This minimal R² value indicates a negligible relationship between the ground EC values and the values generated by the salinity index. Hence, these indices are not appropriate for assessing soil salinity in this setting, even though many studies have used them successfully. Therefore, further research is necessary to formulate a model for determining soil salinity in the coastal region of Bangladesh.
Keywords: soil salinity, EC, Landsat 8, salinity indices, linear regression, remote sensing
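A minimal sketch of the index-versus-EC regression, assuming corrected reflectance values already sampled at the ground points; SI7 = G*R/B is taken from the abstract, while the data are synthetic placeholders:

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(7)

# Hypothetical surface reflectance at the ground sampling points
# (Landsat 8 blue, green, red bands after correction).
blue, green, red = (rng.uniform(0.05, 0.3, 50) for _ in range(3))
ec = rng.uniform(0.5, 12.0, 50)  # ground EC values (digitized points)

si7 = green * red / blue  # salinity index SI7 = G*R/B from the abstract

fit = linregress(si7, ec)
print(f"R^2 = {fit.rvalue**2:.3f}")  # near zero here, as reported (0.022)
```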
Procedia PDF Downloads 352
23232 Despiking of Turbulent Flow Data in Gravel Bed Stream
Authors: Ratul Das
Abstract:
The present experimental study provides insight into the decontamination of instantaneous velocity fluctuations captured by an Acoustic Doppler Velocimeter (ADV) in gravel-bed streams, to ascertain near-bed turbulence at low Reynolds number. The interference between incident and reflected pulses produces spikes in the ADV data, especially in the near-bed flow zone, and therefore filtering the data is essential. Nortek's Vectrino four-receiver ADV probe was used to capture the instantaneous three-dimensional velocity fluctuations over a non-cohesive bed. A spike removal algorithm based on the acceleration threshold method was applied to assess the bed roughness and its influence on velocity fluctuations and velocity power spectra in the carrier fluid. A best combination of velocity threshold (VT) and acceleration threshold (AT) is proposed, for which the velocity power spectra of the despiked signals show a satisfactory fit with the Kolmogorov "-5/3 scaling law" in the inertial subrange. Also, velocity distributions below the roughness crest level fairly follow a third-degree polynomial series.
Keywords: acoustic doppler velocimeter, gravel-bed, spike removal, reynolds shear stress, near-bed turbulence, velocity power spectra
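A minimal sketch of an acceleration-threshold despiking pass on a uniformly sampled velocity record; the |du/dt| > k·g form of the threshold and the linear-interpolation replacement are common choices in the literature and are illustrative here, not the paper's exact parameterization:

```python
import numpy as np

def despike_accel(u, fs, k=1.5, g=9.81):
    """Flag samples adjacent to point-to-point accelerations above k*g,
    then replace them by linear interpolation over good neighbours."""
    big = np.abs(np.diff(u) * fs) > k * g    # |du/dt| threshold
    spikes = np.zeros(u.size, dtype=bool)
    spikes[1:] |= big                         # large jump into a sample
    spikes[:-1] |= big                        # large jump out of a sample
    good = ~spikes
    u_clean = u.copy()
    u_clean[spikes] = np.interp(np.flatnonzero(spikes),
                                np.flatnonzero(good), u[good])
    return u_clean, spikes

fs = 100.0                                    # ADV sampling rate (Hz), illustrative
u = 0.3 + 0.05 * np.random.randn(1000)        # synthetic streamwise velocity (m/s)
u[::200] += 2.0                               # inject artificial spikes
clean, flagged = despike_accel(u, fs)
print("samples replaced:", flagged.sum())
```

A velocity threshold (VT) pass would be applied analogously on u itself, and the two masks combined before interpolation.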
Procedia PDF Downloads 305
23231 Without the Labs, You’re Only Guessing: Why Laboratory Data Is a Baseline for Water and Wastewater Treatment
Authors: Sadikia Thomas Caldarazzo
Abstract:
Water municipalities are crucial to public and environmental health and safety. Historically, support labs have acted as a system of checks and balances for water and wastewater treatment plants. However, their contributions extend far beyond this role and often go unrecognized. The City of Baltimore Department of Public Works operates four labs: two for water treatment and two for wastewater treatment. Each lab supports its designated plant by employing subject matter experts (SMEs) in chemistry, biology, and quality control. These experts produce valid and precise data in a timely manner, reducing data uncertainty for both routine monitoring and special sampling. Beyond the plants, Baltimore City labs also analyze samples and produce data for several inter-agency divisions, including utility maintenance, solid waste, stormwater, the office of research management, sanitary pre-treatment, and special sampling requested by the Mayor, City Council, or consumers within the distribution area. Municipalities may not always fully appreciate the integral role labs play in urban water cycle management. As operations continually adjust their processes to maintain compliance, support labs must also adapt to these changes. High-ranking lab managers should be consulted for scientific advice in major utility changes or decisions, similar to consulting lawyers or other experts. Lab managers and scientific analysts are first responders in analyzing data trends and sample integrity. They provide analytical insights into biological and chemical changes in the processes, aiding in decision-making and problem-solving for operations. Engaging lab personnel at various levels to address impediments and discrepancies leads to effective solutions. Effective communication and consultation are imperative. Comprehensive sharing of pertinent information increases awareness and acts as a catalyst for optimal utility management. Fully utilizing lab management for scientific guidance and data analysis builds resilience across the utility's operations. The data produced by the labs, validated by their SMEs, forms the basis for regulatory reports that plant operations and other divisions submit to their regulators for permit purposes. Labs are on the front line, along with operations! This collaboration also helps personnel outside the labs understand outliers or trend changes in data without being forced to delve outside their areas of expertise.
Keywords: water, wastewater, wastewater treatment, water treatment
Procedia PDF Downloads 10
23230 RS Based SCADA System for Longer Distance Powered Devices
Authors: Harkishen Singh, Gavin Mangeni
Abstract:
This project aims at building an efficient and automatic power monitoring SCADA system, capable of monitoring the electrical parameters of high-voltage powered devices in real time, for example, RMS voltage and current, frequency, energy consumed, power factor, etc. The system uses the RS-485 serial communication interface to transfer data over longer distances. Embedded C programming is the platform used to develop two hardware modules, namely the RTU and Master Station modules, both of which use the CC2540 BLE 4.0 microcontroller configured in slave/master mode. The Si8900 galvanically isolated microchip is used to perform the ADC externally. The hardware communicates via a UART port and sends data to the user PC using the USB port. LabVIEW software is used to design a user interface that displays the current state of the power loads being monitored, as well as logs data to an Excel spreadsheet file. An understanding of the Si8900's auto baud rate process is key to successful implementation of this project.
Keywords: SCADA, RS485, CC2540, labview, Si8900
Procedia PDF Downloads 306
23229 Blockchain Technology for Secure and Transparent Oil and Gas Supply Chain Management
Authors: Gaurav Kumar Sinha
Abstract:
The oil and gas industry, characterized by its complex and global supply chains, faces significant challenges in ensuring security, transparency, and efficiency. Blockchain technology, with its decentralized and immutable ledger, offers a transformative solution to these issues. This paper explores the application of blockchain technology in the oil and gas supply chain, highlighting its potential to enhance data security, improve transparency, and streamline operations. By leveraging smart contracts, blockchain can automate and secure transactions, reducing the risk of fraud and errors. Additionally, the integration of blockchain with IoT devices enables real-time tracking and monitoring of assets, ensuring data accuracy and integrity throughout the supply chain. Case studies and pilot projects within the industry demonstrate the practical benefits and challenges of implementing blockchain solutions. The findings suggest that blockchain technology can significantly improve trust and collaboration among supply chain participants, ultimately leading to more efficient and resilient operations. This study provides valuable insights for industry stakeholders considering the adoption of blockchain technology to address their supply chain management challenges.
Keywords: blockchain technology, oil and gas supply chain, data security, transparency, smart contracts, IoT integration, real-time tracking, asset monitoring, fraud reduction, supply chain efficiency, data integrity, case studies, industry implementation, trust, collaboration
Procedia PDF Downloads 39
23228 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project
Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen
Abstract:
This article is a first step in researching and outlining the structure of the geotechnical database for the geological survey of a power project; in this context, the database has been created for the Bac Ai pumped storage hydropower project. For the purpose of providing a method of organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used to bring optimal efficiency to the process of exploiting, using, and analyzing data in service of the design work in power engineering consulting. Three-dimensional (3D) geotechnical models are created from the survey data, covering stratigraphy, lithology, porosity, etc. The 3D geotechnical model for the Bac Ai pumped storage hydropower project includes six closely stacked stratigraphic formations built with the Horizons method, whereas modeling of engineering geological parameters is performed by geostatistical methods. The accuracy and reliability assessments are tested through error statistics, empirical evaluation, and expert methods. The three-dimensional model analysis allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of power pipelines, and calculation of on-site construction material reserves. In general, the application of engineering geological modeling makes the design work more intuitive and comprehensive, helping construction designers better identify and offer optimal design solutions for the project. The database ensures updating and synchronization, and enables 3D modeling of geological and topographic data that integrates with the design data according to building information modeling (BIM). This is also the base platform for BIM and GIS integration.
Keywords: database, engineering geology, 3D model, RockWorks, Bac Ai pumped storage hydropower project
Procedia PDF Downloads 174
23227 The Inequality Effects of Natural Disasters: Evidence from Thailand
Authors: Annop Jaewisorn
Abstract:
This study explores the relationship between natural disasters and inequalities (both income and expenditure inequality) at the micro level in Thailand, as the first study of this nature for the country. The analysis uses a unique panel and remote-sensing dataset constructed for the purpose of this research. It contains provincial inequality measures and other economic and social indicators based on the Thailand Household Survey during the period between 1992 and 2019, while the data on natural disasters are remote-sensing data received from several official geophysical and meteorological databases. Employing panel fixed effects, the results show that natural disasters significantly reduce household income and expenditure inequality as measured by the Gini index, implying that rich people in Thailand bear a higher cost of natural disasters than poor people. The effect on income inequality is mainly driven by droughts, while the effect on expenditure inequality is mainly driven by flood events. The results are robust to sample heterogeneity, lagged effects, outliers, and an alternative inequality measure.
Keywords: inequality, natural disasters, remote-sensing data, Thailand
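A minimal sketch of a two-way fixed-effects regression of provincial inequality on disaster exposure, assuming a long-format panel; statsmodels with entity and year dummies stands in for the author's estimator, and every variable here is a synthetic placeholder:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Hypothetical province-year panel: Gini index and disaster indicators.
provinces, years = range(20), range(1992, 2020)
df = pd.DataFrame([(p, y) for p in provinces for y in years],
                  columns=["province", "year"])
df["drought"] = rng.integers(0, 2, len(df))
df["flood"] = rng.integers(0, 2, len(df))
df["gini"] = (0.45 - 0.01 * df["drought"] - 0.005 * df["flood"]
              + rng.normal(0, 0.02, len(df)))

# Two-way fixed effects via province and year dummies,
# with standard errors clustered at the province level.
fe = smf.ols("gini ~ drought + flood + C(province) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["province"]})
print(fe.params[["drought", "flood"]])  # negative signs mirror the finding
```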
Procedia PDF Downloads 129
23226 Non-Local Simultaneous Sparse Unmixing for Hyperspectral Data
Authors: Fanqiang Kong, Chending Bian
Abstract:
Sparse unmixing is a promising semisupervised approach that assumes the observed pixels of a hyperspectral image can be expressed as linear combinations of only a few pure spectral signatures (endmembers) from an available spectral library. However, finding the optimal subset of endmembers for the observed data from a large standard spectral library, without considering spatial information, remains a great challenge for sparse unmixing. Under such circumstances, a sparse unmixing algorithm termed non-local simultaneous sparse unmixing (NLSSU) is presented. In NLSSU, the non-local simultaneous sparse representation method for endmember selection is used to find the optimal subset of endmembers for each set of similar image patches in the hyperspectral image. Then, the non-local means method, serving as a regularizer for abundance estimation, is used to exploit the non-local self-similarity of the abundance image. Experimental results on both simulated and real data demonstrate that NLSSU outperforms the other algorithms, with better spectral unmixing accuracy.
Keywords: hyperspectral unmixing, simultaneous sparse representation, sparse regression, non-local means
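A minimal sketch of the per-pixel sparse regression at the core of sparse unmixing (without the non-local extensions), assuming a spectral library with endmember signatures as columns; scikit-learn's positive Lasso stands in for whatever solver the paper uses:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)

# Spectral library: 200 bands x 50 candidate endmember signatures.
A = rng.uniform(0, 1, (200, 50))
true_abund = np.zeros(50)
true_abund[[3, 17, 42]] = [0.5, 0.3, 0.2]       # only 3 active endmembers
y = A @ true_abund + rng.normal(0, 0.001, 200)  # observed pixel spectrum

# Sparse, non-negative abundance estimate: min ||y - A x||^2 + lam ||x||_1
lasso = Lasso(alpha=1e-4, positive=True, max_iter=50000).fit(A, y)
support = np.flatnonzero(lasso.coef_ > 1e-3)
print("selected endmembers:", support)          # ideally {3, 17, 42}
```

NLSSU's contribution is to solve this jointly over sets of similar image patches and to add the non-local-means regularizer on the abundances.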
Procedia PDF Downloads 253
23225 Human Resource Management Practices, Person-Environment Fit and Financial Performance in Brazilian Publicly Traded Companies
Authors: Bruno Henrique Rocha Fernandes, Amir Rezaee, Jucelia Appio
Abstract:
The relation between Human Resource Management (HRM) practices and organizational performance remains the subject of substantial literature. Though many studies have demonstrated a positive relationship, the major influencing variables are still not clear. This study considers Person-Environment Fit (PE Fit) and its components, Person-Supervisor (PS), Person-Group (PG), Person-Organization (PO) and Person-Job (PJ) Fit, as possible explanatory variables. We analyzed PE Fit as a moderator between HRM practices and financial performance in the "best companies to work for" in Brazil. Data on HRM practices were classified through the High Performance Working Systems (HPWS) construct, and data on PE Fit were obtained through surveys among employees. Financial data, consisting of return on invested capital (ROIC) and the price-earnings ratio (PER), were collected for the publicly traded best companies to work for. Findings show that PO Fit and PJ Fit play a significant moderator role for PER but not for ROIC.
Keywords: financial performance, human resource management, high performance working systems, person-environment fit
Procedia PDF Downloads 168
23224 Flow Duration Curves and Recession Curves Connection through a Mathematical Link
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study helps Public Water Bureaus give reliable answers to water concession requests. Rapidly increasing water requests can be supported provided that further uses of a river course are not totally compromised and environmental features are protected as well. Strictly speaking, a water concession can be considered a continuous drawing from the source and causes a mean annual streamflow reduction. Therefore, deciding whether a water concession is appropriate seems to be easily solved by comparing the generic demand to the mean annual streamflow available. Still, the immediate shortcoming of such a comparison is that streamflow data are available only for a few catchments and, most often, limited to specific sites. Moreover, comparing the generic water demand to the mean daily discharge is far from satisfactory, since the mean daily streamflow is greater than the water withdrawal for a long period of the year. Consequently, such a comparison appears to be of little significance for preserving the quality and the quantity of the river. To overcome this limit, this study completes the information provided by flow duration curves by introducing a link between Flow Duration Curves (FDCs) and recession curves, and aims to show the chronological sequence of flows, with a particular focus on low-flow data. The analysis is carried out on 25 catchments located in North-Eastern Italy for which daily data are provided. The results identify groups of catchments as hydrologically homogeneous, having the lower part of the FDCs (the streamflow interval between Q(300) and Q(335)) smoothly reproduced by a common recession curve. In conclusion, the results are useful to provide more reliable answers to water requests, especially for those catchments which show a similar hydrological response, and can be used for a focused regionalization approach on low-flow data. A mathematical link between flow duration curves and recession curves is herein provided, thus furnishing flow duration curve information with a temporal sequence of data. In such a way, by introducing assumptions on recession curves, the chronological sequence of low-flow data can also be attributed to FDCs, which are known to lack this information by nature.
Keywords: chronological sequence of discharges, recession curves, streamflow duration curves, water concession
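A minimal sketch of how a flow duration curve and the low-flow quantiles Q(300) and Q(335) are read off a daily record; the mathematical link to recession curves is the paper's contribution and is not reproduced here:

```python
import numpy as np

# Hypothetical one year of daily streamflow (m^3/s) for one catchment.
rng = np.random.default_rng(11)
q = rng.lognormal(mean=1.0, sigma=0.8, size=365)

# Flow duration curve: flows sorted in decreasing order against the
# number of days on which each flow is equalled or exceeded.
q_sorted = np.sort(q)[::-1]

def Q(d):
    """Flow equalled or exceeded on d days per year."""
    return q_sorted[d - 1]

# Lower part of the FDC used in the study to group catchments.
print("Q(300) =", Q(300), " Q(335) =", Q(335))
```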
Procedia PDF Downloads 196
23223 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data
Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu
Abstract:
Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as a uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide bias correction steps based on biological considerations, such as GC content, applied to single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived from the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.
Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq
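A minimal sketch of the alternating idea behind the bilinear model y ≈ Xβ with both factors unknown, using plain alternating least squares across samples; XAEM's actual updates are EM steps on read-count likelihoods, so this is only the structural skeleton:

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated multi-sample data: sample s observes y_s = X beta_s + noise,
# with the design X shared across samples but unknown.
n_eq, n_iso, n_samples = 30, 5, 40
X_true = np.abs(rng.normal(1, 0.3, (n_eq, n_iso)))
betas = np.abs(rng.normal(5, 2, (n_iso, n_samples)))
Y = X_true @ betas + rng.normal(0, 0.1, (n_eq, n_samples))

# Alternating least squares: fix X, solve for all beta_s; then fix the
# betas, solve for X. Joint estimation works because all samples share
# one X, mirroring XAEM's multi-sample trick.
X = np.abs(rng.normal(1, 0.5, (n_eq, n_iso)))  # random init
for _ in range(200):
    B = np.linalg.lstsq(X, Y, rcond=None)[0]        # update all betas
    X = np.linalg.lstsq(B.T, Y.T, rcond=None)[0].T  # update shared X
print("residual:", np.linalg.norm(Y - X @ B))
```

Note the factorization is identifiable only up to an invertible mixing; XAEM resolves this with additional structure from the read-count model.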
Procedia PDF Downloads 146
23222 A New Distribution and Application on the Lifetime Data
Authors: Gamze Ozel, Selen Cakmakyapan
Abstract:
We introduce a new model called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has both increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a real-life data set.
Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood
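A sketch under the usual Marshall-Olkin construction, which is assumed here to be the one used: with Rayleigh survival F̄(x) = exp(-x²/(2σ²)), the transformed survival is Ḡ(x) = αF̄(x)/(1-(1-α)F̄(x)), and the density below follows by differentiation. The maximum-likelihood fit mirrors the estimation route in the abstract; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def mor_logpdf(x, alpha, sigma):
    """Log-density of the Marshall-Olkin Rayleigh distribution:
    g(x) = alpha * f(x) / (1 - (1-alpha) * Fbar(x))^2,
    with Rayleigh density f and survival Fbar(x) = exp(-x^2/(2 sigma^2))."""
    log_fbar = -x**2 / (2 * sigma**2)
    log_f = np.log(x / sigma**2) + log_fbar
    return np.log(alpha) + log_f - 2 * np.log1p(-(1 - alpha) * np.exp(log_fbar))

def fit_mle(data):
    nll = lambda p: -np.sum(mor_logpdf(data, p[0], p[1]))
    res = minimize(nll, x0=[1.0, data.std()],
                   bounds=[(1e-6, None), (1e-6, None)], method="L-BFGS-B")
    return res.x

x = np.random.default_rng(2).rayleigh(scale=2.0, size=500)  # alpha=1 case
alpha_hat, sigma_hat = fit_mle(x)
print(f"alpha={alpha_hat:.2f} sigma={sigma_hat:.2f}")  # expect ~1 and ~2
```

Setting α = 1 recovers the plain Rayleigh distribution, which is why the simulated check above should return estimates near (1, 2).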
Procedia PDF Downloads 504