Search results for: 3D plant data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27251


24581 Biochemical and Antiviral Study of Peptides Isolated from Amaranthus hypochondriacus on Tomato Yellow Leaf Curl Virus Replication

Authors: José Silvestre Mendoza Figueroa, Anders Kvarnheden, Jesús Méndez Lozano, Edgar Antonio Rodríguez Negrete, Manuel Soriano García

Abstract:

Agroindustrial plants such as cereals and pseudocereals offer a substantial source of biomacromolecules, as they contain large amounts of proteins, polysaccharides and lipids per gram of tissue in comparison with other plants. In particular, Amaranthus hypochondriacus seeds have high levels of proteins in comparison with other cereal and pseudocereal species, which makes the plant a good source of bioactive molecules such as peptides. Geminiviruses are one principal class of pathogens that causes important economic losses in crops, directly affecting the development and production of the plant. One such virus is Tomato yellow leaf curl virus (TYLCV), which mainly affects plants of the Solanaceae family such as tomato species. The symptoms of the disease are leaf curling, chlorosis, dwarfing and floral abortion. The aim of this work was to obtain peptides derived from enzymatic hydrolysis of globulins and albumins from amaranth seeds with specific recognition of the replication origin (OR) in the TYLCV genome, and to test their antiviral activity on host plants with the aim of generating a direct control of this viral infection. Globulins and albumins from amaranth were extracted, the fraction was enzymatically digested with papain, and the aromatic peptide fraction was selected for further purification. Six peptides were tested against the replication origin using affinity assays, surface plasmon resonance and fluorescence titration; two of these peptides showed high affinity for the replication origin of the virus, and the calculated dissociation constants showed a specific interaction between the peptide AmPep1 and the OR. In an in vitro replication test of total TYLCV DNA, the peptide AmPep1 was added at different concentrations to the reaction system, resulting in a decrease in viral DNA synthesis as the peptide concentration increased.
We also showed that the peptide decreased synthesis of the complementary DNA strand of the virus in Nicotiana benthamiana leaves, confirming that the peptide binds to the OR and that its expected mechanism of action is to decrease the replication rate of the viral genome. In an infection assay, N. benthamiana plants were agroinfected with TYLCV-Israel and TYLCV-Guasave. After systemic infection was confirmed, the peptide was infiltrated into newly infected leaves, and the plants treated with the peptide showed a decrease in virus symptoms and viral titer. To confirm the antiviral activity in a commercial crop, tomato plants were infected with TYLCV. After systemic infection was confirmed, plants were infiltrated with peptide solution as above, and symptom development was monitored 21 days after treatment, showing that tomato plants treated with peptides had lower symptom rates and viral titers. The peptide was also tested against another begomovirus, Pepper huasteco yellow vein virus (PHYVV-Guasave), showing a decrease in symptoms in infected N. benthamiana plants. The model of direct biochemical control of TYLCV infection shown in this work can be extrapolated to other begomovirus infections, and the methods reported here can be used for the design of antiviral agrochemicals against other plant virus infections.

Keywords: agrochemical screening, antiviral, begomovirus, geminivirus, peptides, plasmon, TYLCV

Procedia PDF Downloads 255
24580 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach to the simple linear data transformation, Y = bX, through an integration of multidimensional coordinate geometry, vector space theory and polygonal geometry. The scaling is performed by adding an additional 'Dummy Dimension' to the n-dimensional data, which makes it possible to plot two-dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'Dummy Axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'Dummy Axis', respectively. This result gives a geometrical interpretation of a linear data transformation and hence opens opportunities for a more informed choice of the factor b, based on a better choice of these coordinate values. The paper goes on to identify the effect of this transformation on certain popular distance metrics, many of which retain the same scaling factor as the features.
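The closing claim, that many distance metrics inherit the features' scaling factor, is easy to verify numerically. A minimal sketch (illustrative only, not the paper's derivation) using the Euclidean metric:

```python
import numpy as np

def linear_scale(X, b):
    """Apply the simple linear transformation Y = bX component-wise."""
    return b * np.asarray(X, dtype=float)

X = np.array([[1.0, 2.0],
              [4.0, 6.0]])   # two observations in two dimensions
b = 3.0
Y = linear_scale(X, b)

# The Euclidean distance between observations scales by the same factor b.
d_before = np.linalg.norm(X[0] - X[1])   # 5.0
d_after = np.linalg.norm(Y[0] - Y[1])    # 15.0
print(d_after / d_before)  # -> 3.0
```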

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 286
24579 Blockchain Platform Configuration for MyData Operator in Digital and Connected Health

Authors: Minna Pikkarainen, Yueqiang Xu

Abstract:

The integration of digital technology with existing healthcare processes has been painfully slow; a huge gap exists between the strictly regulated field of official medical care and the quickly moving field of health and wellness technology. We claim that the promises of preventive healthcare can only be fulfilled when this gap is closed and health care and self-care become a seamless continuum in which "correct information, in the correct hands, at the correct time" allows individuals and professionals to make better decisions, what we call the connected health approach. Currently, issues related to security, privacy, consumer consent and data sharing are hindering the implementation of this new paradigm of healthcare. This could be solved by following the MyData principles, which state that individuals should have the right and practical means to manage their data and privacy. MyData infrastructure enables decentralized management of personal data, improves interoperability, makes it easier for companies to comply with tightening data protection regulations, and allows individuals to change service providers without proprietary data lock-ins. This paper tackles today's unprecedented challenges of enabling and stimulating multiple healthcare data providers and stakeholders to participate more actively in the digital health ecosystem. First, the paper systematically proposes the MyData approach for a healthcare and preventive health data ecosystem. In this research, the work targets health and wellness ecosystems. Each ecosystem consists of key actors: 1) the individual (citizen or professional controlling/using the services), i.e. the data subject, 2) services providing personal data (e.g. startups providing data collection apps or devices), 3) health and wellness services utilizing the aforementioned data, and 4) services authorizing access to this data under the individual's explicit consent.
Second, the research extends the existing four archetypes of orchestrator-driven healthcare data business models for the healthcare industry and proposes a fifth type of healthcare data model, the MyData Blockchain Platform. This new architecture is developed using the Action Design Research approach, a prominent research methodology in the information systems domain. The key novelty of the paper is to expand the health data value chain architecture and design from centralization and pseudo-decentralization to full decentralization, enabled by blockchain: the MyData blockchain platform. The study not only broadens the healthcare informatics literature but also contributes to the theoretical development of the digital healthcare and blockchain research domains with a systemic approach.

Keywords: blockchain, health data, platform, action design

Procedia PDF Downloads 87
24578 Optimization of Solar Chimney Power Production

Authors: Olusola Bamisile, Oluwaseun Ayodele, Mustafa Dagbasi

Abstract:

The main objective of this research is to optimize the power produced by a solar chimney wind turbine. The cut-out speed and the maximum possible production are considered in the optimization. The solar chimney is one of the solar technologies that can be deployed cheaply in rural areas, over 50% of which still lack access to electricity. The OptimTool in MATLAB is used to maximize the power produced by the turbine subject to certain constraints. The results show that an optimized turbine produces about ten times the power of the unoptimized turbine, which produces 111 W. The rest of the paper discusses the solar chimney power plant and the optimization simulation used in this study in detail.
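The constrained maximization above was done with MATLAB's OptimTool; the same idea can be sketched with SciPy. The power model and all numbers below are illustrative assumptions, not the paper's data; only the cut-out-speed constraint mirrors the setup described in the abstract:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative turbine power model (an assumption, not the paper's model):
# P = 0.5 * rho * A * Cp * v^3
RHO, AREA, CP = 1.225, 10.0, 0.40   # air density, rotor area, power coefficient
V_CUT_OUT = 12.0                    # assumed cut-out speed in m/s

def neg_power(v):
    # Negate power so that a minimizer performs maximization.
    return -0.5 * RHO * AREA * CP * v[0] ** 3

# Maximize power subject to 0 <= v <= cut-out speed.
res = minimize(neg_power, x0=[5.0], bounds=[(0.0, V_CUT_OUT)])
print(res.x[0], -res.fun)  # the optimum sits at the cut-out speed
```

Because power grows monotonically with speed in this toy model, the optimizer pushes the solution to the cut-out-speed bound, which is exactly the kind of constraint-active optimum the abstract alludes to.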

Keywords: solar chimney, optimization, wind turbine, renewable energy systems

Procedia PDF Downloads 571
24577 Using Learning Apps in the Classroom

Authors: Janet C. Read

Abstract:

UCLan set up a collaboration with Lingokids to assess the Lingokids learning app's impact on learning outcomes in UK classrooms for children aged 3 to 5 years. Data gathered during the controlled study with 69 children includes attitudinal data, engagement, and learning scores. The data show that enjoyment while learning was higher among children using the game-based app than among children using other traditional methods. It is worth pointing out that, among older children, engagement when using the learning app was significantly higher than with traditional methods. According to the existing literature, there is a direct correlation between engagement, motivation, and learning. Therefore, this study provides relevant data points to conclude that the Lingokids learning app serves its purpose of encouraging learning through playful and interactive content. That being said, we believe that learning outcomes should be assessed with a wider range of methods in further studies. Likewise, it would be beneficial to assess the usability and playability of the app in order to evaluate it from other angles.

Keywords: learning app, learning outcomes, rapid test activity, Smileyometer, early childhood education, innovative pedagogy

Procedia PDF Downloads 58
24576 Road Safety in Great Britain: An Exploratory Data Analysis

Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari

Abstract:

Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and identify response measures for areas where the total damage caused by accidents can be reduced significantly and quickly. We perform an exploratory data analysis using STATS19 data. For the past 30 years, the UK has had a good record in reducing fatalities, ranking third based on the number of road deaths per million inhabitants. Around 165,000 accidents were reported in Great Britain in 2009, a figure that decreased every year to under 120,000 in 2019. The government continues to drive down road deaths by empowering responsible road users and by identifying and addressing the factors that make roads less safe.
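Trend summaries like the 165,000 to 120,000 figure come from grouping STATS19 records; a toy sketch of that kind of exploratory grouping in pandas (the records and column names here are invented stand-ins, not the real STATS19 schema):

```python
import pandas as pd

# Invented records shaped loosely like STATS19 fields.
df = pd.DataFrame({
    "year": [2009, 2009, 2014, 2019, 2019],
    "severity": ["slight", "fatal", "serious", "slight", "slight"],
})

# Accidents per year: the kind of summary behind the yearly trend.
per_year = df.groupby("year").size()

# Share of fatal accidents across the sample.
fatal_share = (df["severity"] == "fatal").mean()
print(per_year.to_dict(), fatal_share)  # -> {2009: 2, 2014: 1, 2019: 2} 0.2
```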

Keywords: road safety, data analysis, OpenStreetMap, feature expanding

Procedia PDF Downloads 120
24575 Intrusion Detection System Using Linear Discriminant Analysis

Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou

Abstract:

Most existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and less accurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension. Hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimensionality of the original data using principal component analysis (PCA) and then applies LDA. The second solution uses the pseudo-inverse to avoid the singularity of the within-class scatter matrix caused by the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. Results showed that the PCA+LDA method clearly outperforms the pseudo-inverse LDA method in classification accuracy when large training data are available.
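The first proposed solution (PCA for dimensionality reduction, then LDA, then KNN) can be sketched as a scikit-learn pipeline. For brevity this sketch uses the iris dataset as a stand-in for KDDcup99/NSL-KDD:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Stand-in data; in the paper the features are network traffic records.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCA first reduces the dimension so that LDA's within-class scatter matrix
# is no longer singular; KNN then classifies in the LDA subspace.
clf = make_pipeline(
    PCA(n_components=3),
    LinearDiscriminantAnalysis(),
    KNeighborsClassifier(n_neighbors=5),
)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```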

Keywords: LDA, Pseudoinverse, PCA, IDS, NSL-KDD, KDDcup99

Procedia PDF Downloads 216
24574 Studying the Effect of Reducing Thermal Processing over the Bioactive Composition of Non-Centrifugal Cane Sugar: Towards Natural Products with High Therapeutic Value

Authors: Laura Rueda-Gensini, Jader Rodríguez, Juan C. Cruz, Carolina Munoz-Camargo

Abstract:

There is an emerging interest in botanicals and plant extracts for medicinal practices due to their widely reported health benefits. A large variety of phytochemicals found in plants have been correlated with antioxidant, immunomodulatory, and analgesic properties, which makes plant-derived products promising candidates for modulating the progression and treatment of numerous diseases. Non-centrifugal cane sugar (NCS), in particular, has been known for its high antioxidant and nutritional value, but compositional variability due to changing environmental and processing conditions has considerably limited its use in the nutraceutical and biomedical fields. This work is therefore aimed at assessing the effect of thermal exposure during NCS production on its bioactive composition and, in turn, its therapeutic value. Accordingly, two modified dehydration methods are proposed that employ: (i) vacuum-aided evaporation, which reduces the temperatures necessary to dehydrate the sample, and (ii) refractance window evaporation, which reduces thermal exposure time. The biochemical composition of NCS produced under these two methods was compared to traditionally produced NCS by estimating total polyphenolic and protein content with Folin-Ciocalteu and Bradford assays, and by identifying the major phenolic compounds in each sample via HPLC-coupled mass spectrometry. Antioxidant activities were also compared, as measured by the scavenging of ABTS and DPPH radicals. Results show that the two modified production methods enhance polyphenolic and protein yields in the resulting NCS samples when compared to traditional production methods. In particular, reducing the employed temperatures with vacuum-aided evaporation proved superior at preserving polyphenolic compounds, as evidenced in both the total and individual polyphenol concentrations. However, antioxidant activities were not significantly different between the two.
Although additional studies should be performed to determine if the observed compositional differences affect other therapeutic activities (e.g., anti-inflammatory, analgesic, and immunoprotective), these results suggest that reducing thermal exposure holds great promise for the production of natural products with enhanced nutritional value.

Keywords: non-centrifugal cane sugar, polyphenolic compounds, thermal processing, antioxidant activity

Procedia PDF Downloads 77
24573 A Low-Cost and Easy-To-Operate Remediation Technology of Heavy Metals Contaminated Agricultural Soil

Authors: Xiao-Hua Zhu, Xin Yuan, Yi-Ran Zhao

Abstract:

High cadmium levels in rice are a serious problem in many parts of China. Many kinds of remediation technologies have been tested and applied on farmland. Because of the productive function of farmland, most technologies are inappropriate due to their destruction of the tillage soil layer, and the large labour and cost requirements of many technologies also restrict their application. The concept of a 'Root Micro-Geochemical Barrier' was proposed to reduce cadmium (Cd) bioavailability and the Cd concentration in rice. Remediation and mitigation techniques were demonstrated on contaminated farmland downstream of a mine. Following the pattern of rice growth, Cd is absorbed by the crop in every growth stage, and absorption efficiency is nearly highest at the beginning of the tillering stage. A protection method therefore needs to act from the early growth stages onwards. Several mineral materials with remediation properties were considered. Such a material creates a barrier that prevents Cd from being absorbed by the crop throughout the growing process, because it adsorbs soil Cd and deprives it of its migration activity; the materials should also be introduced into the crop-growing system cheaply and as early as possible. Each rice plant has a small root zone, with roots reaching about 15 cm deep and 15 cm wide. Such a small root zone makes it possible for nearly all the Cd approaching the roots to be adsorbed with a small amount of adsorbent. By mixing the remediation materials with the seed-raising soil and adding them to the tillage soil when transplanting seedlings, the soil Cd activity can be controlled within the root zone, reducing the amount of Cd absorbed by the crop. Of course, the mineral materials must have sufficient adsorptive capacity and introduce no additional pollution.
More than 3,000 square meters of farmland have been remediated. With the application of the root micro-geochemical barrier, the Cd concentration in rice and the remediation cost were decreased by 90% and 80%, respectively, with little extra labour for the farmers. The Cd concentrations in rice from remediated farmland have been kept below 0.1 ppm, and the remediation of one acre of contaminated cropland costs less than $100. The concept is particularly advantageous for the remediation of Cd-contaminated paddy fields, especially fields with outside pollution sources.

Keywords: cadmium pollution, growth stage, cost, root micro-geochemistry barrier

Procedia PDF Downloads 69
24572 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for development remains before STRIM can be applied to the analysis of real-world datasets. The first requirement is to determine the dataset size needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
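The core of STRIM is a statistical test of candidate rules. A toy sketch of that idea, testing whether a rule's accuracy is significantly above chance (the counts and chance rate below are invented, and STRIM's actual test statistic differs in detail):

```python
from scipy.stats import binomtest

# Hypothetical counts from a decision table: of 120 rows matching the
# rule's condition part, 84 carry the rule's decision class.
n_matches, n_correct = 120, 84
chance_rate = 0.5   # assumed marginal frequency of the decision class

# One-sided binomial test: is the rule's accuracy significantly above chance?
result = binomtest(n_correct, n_matches, chance_rate, alternative="greater")
significant = result.pvalue < 0.01
print(result.pvalue, significant)
```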

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 380
24571 Characterization of Soil Microbial Communities from Vineyard under a Spectrum of Drought Pressures in Sensitive Area of Mediterranean Region

Authors: Gianmaria Califano, Júlio Augusto Lucena Maciel, Olfa Zarrouk, Miguel Damasio, Jose Silvestre, Ana Margarida Fortes

Abstract:

Global warming, with rapid and sudden changes in meteorological conditions, is one of the major constraints on ensuring agricultural and crop resilience in the Mediterranean regions. Several strategies are being adopted to reduce the pressure of drought stress on grapevines at regional and local scales: improvements in irrigation systems, adoption of interline cover crops, and adaptation of pruning techniques. However, still more can be achieved if the microbial compartments associated with plants are also considered in crop management. It is known that the microbial community changes according to several factors, such as latitude, plant variety, age, rootstock, soil composition and agricultural management system. Considering the increasing pressure of biotic and abiotic stresses, it is essential to also evaluate the effects of drought on the microbiome associated with the grapevine, a commercially important crop worldwide. In this study, we characterize the diversity and structure of the microbial community under three long-term irrigation levels (100% ETc, 50% ETc and rain-fed) in Syrah, a drought-tolerant grapevine cultivar grown worldwide. To avoid the limitations of culture-dependent methods, amplicon sequencing with target primers for bacteria and fungi was applied to the same soil samples. The DNeasy PowerSoil (Qiagen) extraction kit required further optimization, with lytic enzymes and heating steps, to improve DNA yield and quality systematically across biological treatments. Target regions (16S rRNA and ITS genes) of our samples are being sequenced with Illumina technology. Bioinformatic pipelines will then make it possible to characterize the bacterial and fungal diversity, structure and composition.
Further, the microbial communities will be assessed for their functional activity, which remains an important metric considering the strong inter-kingdom interactions existing between plants and their associated microbiome. The results of this study will lay the basis for biotechnological applications: in combination with the establishment of a bacterial library, it will be possible to explore the possibility of testing synthetic microbial communities to support plant resistance to water scarcity.

Keywords: microbiome, metabarcoding, soil, grapevine, Syrah, global warming, crop sustainability

Procedia PDF Downloads 102
24570 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
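One of the listed steps, entity disambiguation, can be illustrated with a tiny similarity-based linker. Everything here (the candidate entities, their descriptions, and the TF-IDF approach) is an invented stand-in for the production models described above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented candidate entities with short textual descriptions.
candidates = {
    "Apple Inc.": "technology company consumer electronics iphone",
    "apple (fruit)": "fruit orchard tree food nutrition",
}
mention_context = "quarterly earnings of the technology company behind the iphone"

# Link the mention to the candidate whose description is most similar.
vec = TfidfVectorizer()
matrix = vec.fit_transform(list(candidates.values()) + [mention_context])
sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
best = list(candidates)[sims.argmax()]
print(best)  # -> Apple Inc.
```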

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 95
24569 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform

Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu

Abstract:

Given a fixed fund, purchasing fewer hosts of higher capability or, inversely, more hosts of lower capability is an unavoidable trade-off in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing involves SQL queries with aggregates, joins, and space-time condition selections executed over massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of candidate host clusters for the intended big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem: HDFS + Hive + Spark. A suitable metric was introduced to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on three kinds of typical SQL query tasks. Tests were conducted with respect to CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing.
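The empirical formula is essentially a regression from cluster configuration to measured performance. A minimal sketch with invented numbers (the paper's actual features, metric, and model form are not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented measurements: [CPU benchmark, memory in GB, physical hosts].
X = np.array([[100, 32, 4],
              [150, 64, 4],
              [100, 32, 8],
              [200, 128, 8],
              [150, 64, 12]], dtype=float)
# Invented measured performance (e.g., queries completed per minute).
y = np.array([10.0, 16.0, 19.0, 35.0, 33.0])

# Fit the empirical configuration-to-performance formula.
model = LinearRegression().fit(X, y)

# Score two candidate purchases that fit the same budget and pick the best.
candidates = np.array([[150, 64, 8], [100, 32, 12]], dtype=float)
predicted = model.predict(candidates)
best_config = candidates[predicted.argmax()]
```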

Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance predicting formula, typical SQL query tasks

Procedia PDF Downloads 216
24568 Model Predictive Controller for Pasteurization Process

Authors: Tesfaye Alamirew Dessie

Abstract:

Our study focuses on developing a Model Predictive Controller (MPC) for a pasteurization process and evaluating it against a traditional PID controller. The dynamics of the pasteurization process were identified from experimental data using system identification. The quality of several model structures was evaluated using best fit with data validation, residual analysis, and stability analysis. The validation data fit the auto-regressive with exogenous input (ARX322) model of the pasteurization process by roughly 80.37 percent. The ARX322 model structure was then used to design the MPC and PID control techniques. Comparing controller performance based on settling time, overshoot percentage, and stability analysis showed that the MPC controller outperforms PID on those metrics.
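The ARX identification step can be illustrated on a toy first-order system; this is a generic least-squares sketch, not the pasteurization model or the toolchain used in the study:

```python
import numpy as np

# Toy ARX system y[k] = a1*y[k-1] + b1*u[k-1] with known coefficients,
# identified from input/output data by least squares.
rng = np.random.default_rng(0)
u = rng.standard_normal(200)          # excitation input
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]   # simulated (noise-free) plant

# Regressor matrix: past output and past input.
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(theta)  # recovers approximately [0.8, 0.5]
```

With real (noisy) data, the same regression recovers the coefficients only approximately, and the fit percentage on validation data (80.37% in the study) quantifies how well the identified model reproduces the plant.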

Keywords: MPC, PID, ARX, pasteurization

Procedia PDF Downloads 145
24567 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data

Authors: Rana Rimawi, Ayman Baklizi

Abstract:

Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the Generalized Logistic Distribution, with its different types, has received considerable attention recently. In this study, based on progressively type-II censored data, we consider point estimation for the type II Generalized Logistic Distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLE), Bayes estimators and best linear unbiased estimators (BLUE). The estimators are compared by simulation using the criteria of bias and mean square error (MSE). An illustrative example with a real data set is given.
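For the complete-sample (uncensored) case, the maximum likelihood step can be sketched numerically. The density used below, f(x; a) = a e^(-ax) / (1 + e^(-x))^(a+1), is one common parameterization of the Type II GLD; the progressive-censoring terms of the actual likelihood are omitted here:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
alpha_true, n = 2.0, 2000

# Simulate Type II GLD data: if X follows a Type I GLD with shape alpha,
# then -X follows the Type II GLD with the same shape.
u = rng.uniform(size=n)
x_type1 = -np.log(u ** (-1.0 / alpha_true) - 1.0)   # Type I inverse CDF
z = -x_type1

def neg_loglik(a):
    # -log L for f(x; a) = a * exp(-a*x) / (1 + exp(-x))^(a+1),
    # with log(1 + e^{-z}) computed stably via logaddexp.
    return -(n * np.log(a) - a * z.sum() - (a + 1) * np.logaddexp(0.0, -z).sum())

res = minimize_scalar(neg_loglik, bounds=(0.05, 20.0), method="bounded")
alpha_hat = res.x   # should be close to alpha_true for large n
```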

Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation

Procedia PDF Downloads 186
24566 Restoration of Steppes in Algeria: Case of the Stipa tenacissima L. Steppe

Authors: H. Kadi-Hanifi, F. Amghar

Abstract:

Steppes in arid Mediterranean zones are deeply threatened by desertification. To stop or alleviate the ecological and economic problems associated with desertification, management actions have been implemented over the last three decades, and the struggle against desertification has become a national priority in many countries. In Algeria, several management techniques have been used to cope with desertification. This study investigates the effect of exclosure on floristic diversity and chemical soil properties after four years of implementation. A total of 167 phyto-ecological samples were studied, 122 inside the exclosure and 45 outside. Results showed that plant diversity, composition, vegetation cover, pastoral value and soil fertility were significantly higher in protected areas.

Keywords: Algeria, arid, desertification, pastoral management, soil fertility

Procedia PDF Downloads 178
24565 Omni: Data Science Platform for Evaluating the Performance of a LoRaWAN Network

Authors: Emanuele A. Solagna, Ricardo S. Tozetto, Roberto dos S. Rabello

Abstract:

Nowadays, physical processes are becoming digitized through the evolution of communication, sensing and storage technologies, which promote the development of smart cities. The evolution of this technology has generated multiple challenges related to the generation of big data and the active participation of electronic devices in society. Devices can send information that is captured and processed over large areas, but there is no guarantee that all the obtained data will be effectively stored and correctly persisted, because, depending on the technology used, certain parameters have a huge influence on the full delivery of information. This article characterizes a platform, currently under development, that uses data science to evaluate the performance and effectiveness of an industrial network implementing LoRaWAN technology, relating the configuration of its main parameters to information loss.
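The kind of parameter-versus-loss evaluation described can be sketched as a delivery-ratio summary over uplink records. The field names and records below are invented, not the Omni platform's schema:

```python
import pandas as pd

# Invented uplink records: did each transmission arrive at the network server?
uplinks = pd.DataFrame({
    "spreading_factor": [7, 7, 7, 12, 12],
    "delivered": [True, True, False, True, True],
})

# Delivery ratio per spreading factor; 1 - ratio is the information loss
# attributable to that parameter setting in this toy sample.
delivery_ratio = uplinks.groupby("spreading_factor")["delivered"].mean()
print(delivery_ratio.to_dict())
```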

Keywords: Internet of Things, LoRa, LoRaWAN, smart cities

Procedia PDF Downloads 132
24564 Effects of Heat Treatment on the Mechanical Properties of Kenaf Fiber

Authors: Paulo Teodoro De Luna Carada, Toru Fujii, Kazuya Okubo

Abstract:

Natural fibers have a wide variety of uses (e.g., rope, paper, and building materials). One specific application is in the field of composite materials (i.e., green composites). A huge amount of research is being done in this field due to rising concerns about the harmful effects of synthetic materials on the environment. Several natural fibers are used in this field, one of which can be extracted from a plant called kenaf (Hibiscus cannabinus L.). Kenaf fiber is regarded as a good alternative because the plant is easy to grow and the fiber is easy to extract; additionally, it has good properties. Treatments, which are classified as mechanical or chemical in nature, can be applied to improve the properties of the fiber. The aim of this study is to assess the effects of heat treatment on kenaf fiber, specifically on its tensile strength and modulus. Kenaf fiber bundles with an average diameter of at most 100 μm were used for this purpose. Heat treatment was done using a constant-temperature oven at (1) 160 °C, (2) 180 °C, and (3) 200 °C for a duration of one hour. As a basis for comparison, tensile tests were first done on kenaf fibers without any heat treatment. For every heating temperature, three groups of samples were prepared. Two groups were used for tensile testing (one group was tested right after heat treatment, while the other was kept for two days in a closed container at a relative humidity of at least 95%). The third group was used to observe how much moisture the treated fiber absorbs when enclosed in a high-moisture environment for two days. The results showed that kenaf fiber retains its tensile strength when heated up to a temperature of 160 °C. However, when heated at about 180 °C or higher, the tensile strength decreases significantly. The same behavior was observed for the tensile modulus of the fiber.
Additionally, the fibers that were stored for two days absorbed nearly the same amount of moisture (about 20% of the dried weight) regardless of the heating temperature. Heat treatment might have damaged the fiber in some way, so an additional test was done to see whether the damage due to heat treatment is attributable to changes in the viscoelastic properties of the fiber. The findings showed that kenaf fibers can be heated to at most 160°C while retaining good tensile strength and modulus, and that heating the fiber at high temperature (>180°C) changes its viscoelastic properties. The results of this study are significant for processes that require heat treatment, not only of kenaf fiber but possibly of natural fibers in general.
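The tensile strength reported above follows the usual definition of stress for a fiber of circular cross-section (peak load divided by cross-sectional area); a minimal sketch, with illustrative rather than measured values, is:

```python
import math

# Sketch of the stress computation behind the tensile tests above: tensile
# strength is peak load over cross-sectional area, assuming a circular fiber
# cross-section. The load and diameter here are illustrative, not measured.
def tensile_strength_mpa(peak_load_n, diameter_um):
    area_m2 = math.pi * (diameter_um * 1e-6) ** 2 / 4.0
    return peak_load_n / area_m2 / 1e6  # Pa -> MPa

# e.g. a 100 um fiber bundle failing at a peak load of 3.5 N
print(round(tensile_strength_mpa(3.5, 100.0)))  # 446 (MPa)
```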

Keywords: heat treatment, kenaf fiber, natural fiber, mechanical properties

Procedia PDF Downloads 338
24563 Cybervetting and Online Privacy in Job Recruitment – Perspectives on the Current and Future Legislative Framework Within the EU

Authors: Nicole Christiansen, Hanne Marie Motzfeldt

Abstract:

In recent years, more and more HR professionals have been using cybervetting in job recruitment in an effort to find the perfect match for the company. These practices are growing rapidly and access a vast amount of data from social networks, some of which is privileged and protected information. Thus, there is a risk that the right to privacy is becoming a duty to manage one's private data. This paper investigates to what degree a job applicant's fundamental rights are protected adequately in current and future legislation in the EU. This paper argues that current data protection regulations and forthcoming regulations on the use of AI ensure sufficient protection. However, even though the regulation on paper protects employees within the EU, the recruitment sector may not pay sufficient attention to it, as it is not specifically targeting this area. Therefore, the lack of specific labor and employment regulation is a concern that the social partners should attend to.

Keywords: AI, cyber vetting, data protection, job recruitment, online privacy

Procedia PDF Downloads 69
24562 Sequential Pattern Mining from Medical Record Data with the Sequential Pattern Discovery Using Equivalent Classes (SPADE) Algorithm (A Case Study: Bolo Primary Health Care, Bima)

Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim

Abstract:

This research was conducted at the Bolo Primary Health Care (PHC) in Bima Regency. The purpose of the research is to find the association patterns formed in the medical record database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC's medical record database. Sequential pattern mining is the method used for the analysis. Transaction data were generated from Patient_ID, Check_Date, and diagnosis. Sequential Pattern Discovery Using Equivalent Classes (SPADE) is one of the algorithms in sequential pattern mining; it finds frequent sequences in transaction data using a vertical database format and a sequence join process. The result of the SPADE algorithm is a set of frequent sequences that are then used to form rules. This technique is used to find association patterns between item combinations. Based on sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, three sequential association patterns were obtained based on the sequence of Patient_ID, Check_Date, and diagnosis data in the Bolo PHC.
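The vertical-database idea behind SPADE can be illustrated with a minimal sketch (hypothetical patient records, not the Bolo PHC data): each diagnosis maps to an id-list of (patient, visit) pairs, and two-item sequences are found by a temporal join of id-lists rather than by repeated scans of the whole database.

```python
from collections import defaultdict

# Minimal sketch of the vertical-database idea behind SPADE (illustrative
# records, not the Bolo PHC data): each item maps to an id-list of
# (sequence_id, event_id) pairs, and longer patterns are found by temporal
# joins of the id-lists instead of repeated scans of the whole database.
records = [  # (patient_id, visit_number, diagnosis)
    (1, 1, "flu"), (1, 2, "cough"), (1, 3, "cough"),
    (2, 1, "flu"), (2, 2, "cough"),
    (3, 1, "fever"), (3, 2, "flu"), (3, 3, "cough"),
]

# vertical id-lists: item -> {patient_id: [visit_numbers]}
idlists = defaultdict(lambda: defaultdict(list))
for sid, eid, item in records:
    idlists[item][sid].append(eid)

min_support = 2  # a pattern must occur for at least 2 patients

# frequent 1-sequences: items whose id-list covers enough patients
freq1 = {item for item, occ in idlists.items() if len(occ) >= min_support}

# frequent 2-sequences <a -> b>: temporal join, b must occur after a
freq2 = {}
for a in freq1:
    for b in freq1:
        support = sum(
            1 for sid in idlists[a]
            if sid in idlists[b] and min(idlists[a][sid]) < max(idlists[b][sid])
        )
        if support >= min_support:
            freq2[(a, b)] = support

print(sorted(freq1))  # ['cough', 'flu']
print(freq2)          # {('flu', 'cough'): 3}
```

In the full algorithm the join is applied recursively within equivalence classes of patterns sharing a prefix, which is what keeps the search space manageable.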

Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm

Procedia PDF Downloads 386
24561 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach

Authors: Alexander S. Andreev, Olga A. Peregudova

Abstract:

In this paper, we propose a discrete tracking control for nonholonomic mobile robots with two degrees of freedom. The electro-mechanical model of a mobile robot moving on a horizontal surface without slipping is considered, with two rear wheels controlled by two independent DC electric motors and one front roll wheel. We present a backstepping design based on the Euler approximate discrete-time model of the continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (grant 15-01-08482).
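The discretization step can be sketched for the unicycle kinematics alone (the paper's full electro-mechanical model also includes the DC motor dynamics): the Euler approximate discrete-time model advances the state with sampling period T.

```python
import math

# Sketch of an Euler approximate discrete-time model (unicycle kinematics
# only; the full electro-mechanical model in the paper also includes the
# DC motor dynamics).
def euler_step(state, v, w, T):
    """One Euler step of x' = v*cos(th), y' = v*sin(th), th' = w."""
    x, y, th = state
    return (x + T * v * math.cos(th),
            y + T * v * math.sin(th),
            th + T * w)

# simulate a constant-input arc with sampling period T = 0.01 s for 1 s
state = (0.0, 0.0, 0.0)
for _ in range(100):
    state = euler_step(state, v=1.0, w=0.5, T=0.01)

print(round(state[2], 3))  # 0.5 rad of heading after 1 s at w = 0.5 rad/s
```

The backstepping controller is then designed directly on this discrete-time model rather than on the continuous-time plant.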

Keywords: actuator dynamics, backstepping, discrete-time controller, Lyapunov function, wheeled mobile robot

Procedia PDF Downloads 395
24560 Estimation of Reservoir Fracture Network Properties Using an Artificial Intelligence Technique

Authors: Reda Abdel Azim, Tariq Shehab

Abstract:

The main objective of this study is to develop a subsurface fracture map of naturally fractured reservoirs by overcoming the limitations associated with different data sources in characterising fracture properties. Some of these limitations are overcome by employing a nested neuro-stochastic technique to establish the inter-relationships between different data sources, such as conventional well logs, borehole images (FMI), core descriptions, and seismic attributes, and then characterising fracture properties in terms of fracture density and fractal dimension for each data source. Fracture density is an important property of a fracture network system, as it measures the cumulative area of all the fractures in a unit volume of the system, while fractal dimension is used to characterise self-similar objects such as fractures. At the wellbore locations, fracture density and fractal dimension can only be estimated for the limited sections where FMI data are available. Therefore, an artificial intelligence technique is applied to approximate these quantities at locations along the wellbore where hard data are not available. Artificial intelligence techniques have proven their effectiveness in this domain of application.
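The fractal dimension mentioned above is commonly estimated by box counting; a minimal sketch (on an illustrative 2D trace, not reservoir data) counts occupied grid boxes N(s) at several box sizes s and fits the slope of log N(s) against log(1/s).

```python
import numpy as np

# Sketch of a box-counting estimate of fractal dimension (illustrative 2D
# trace, not reservoir data): count occupied grid boxes N(s) at several box
# sizes s and fit the slope D in log N(s) ~ D * log(1/s).
def box_counting_dimension(points, sizes):
    counts = []
    for s in sizes:
        boxes = {tuple(np.floor(p / s).astype(int)) for p in points}
        counts.append(len(boxes))
    D, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return D

# a straight fracture trace should come out close to dimension 1
t = np.linspace(0.0, 1.0, 2000)
trace = np.column_stack([t, 0.5 * t])
D = box_counting_dimension(trace, sizes=[0.2, 0.1, 0.05, 0.025])
print(round(D, 2))
```

A rough, space-filling fracture network would yield a dimension closer to 2 in a 2D map view.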

Keywords: naturally fractured reservoirs, artificial intelligence, fracture intensity, fractal dimension

Procedia PDF Downloads 237
24559 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia

Authors: Tim Nedyalkov

Abstract:

A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services, and collecting, managing, and retaining large amounts of data in cloud environments makes information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) (NDB) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.

Keywords: cloud compliance, cloud security, data governance, privacy protection

Procedia PDF Downloads 103
24558 Simulations to Predict Solar Energy Potential by ERA5 Application at North Africa

Authors: U. Ali Rahoma, Nabil Esawy, Fawzia Ibrahim Moursy, A. H. Hassan, Samy A. Khalil, Ashraf S. Khamees

Abstract:

The design of any solar energy conversion system requires knowledge of solar radiation data obtained over a long period. Satellite data have been widely used to estimate solar energy where no ground observations of solar radiation are available, yet there are limitations on the temporal coverage of satellite data. Reanalysis is a "retrospective analysis" of atmospheric parameters generated by assimilating observation data from various sources, including ground observations, satellites, ships, and aircraft, with the output of NWP (Numerical Weather Prediction) models, to develop an exhaustive record of weather and climate parameters. The performance of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface-measured data using statistical analysis. The distribution of global solar radiation (GSR) was estimated over six selected locations in North Africa during the ten-year period from 2011 to 2020. The root mean square error (RMSE), mean bias error (MBE), and mean absolute error (MAE) of the reanalysis solar radiation data range from 0.079 to 0.222, 0.0145 to 0.198, and 0.055 to 0.178, respectively. A seasonal statistical analysis was performed to study the seasonal variation in the performance of the dataset, revealing a significant variation of the errors across seasons. The performance of the dataset also changes with the temporal resolution of the data used for comparison: the monthly mean values show better performance, but the accuracy of the data is compromised. The ERA-5 solar radiation data are used for a preliminary solar resource assessment and power estimation. The correlation coefficient (R²) varies from 0.93 to 0.99 for the different selected sites in North Africa in the present research.
The goal of this research is to give a good representation of global solar radiation to support solar energy applications in all fields; this can be done by using gridded data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and producing a new model that gives good results.
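The validation statistics quoted above (RMSE, MBE, MAE, and the squared correlation coefficient) can be computed as in this minimal sketch, using hypothetical normalized values rather than the study's measurements:

```python
import numpy as np

# Sketch of the validation statistics quoted above, computed on hypothetical
# normalized values (not the study's measurements): ground observations vs.
# the corresponding ERA-5 estimates.
obs = np.array([0.61, 0.72, 0.55, 0.80, 0.68, 0.74])
era5 = np.array([0.64, 0.70, 0.59, 0.77, 0.71, 0.73])

diff = era5 - obs
rmse = np.sqrt(np.mean(diff ** 2))       # root mean square error
mbe = np.mean(diff)                      # mean bias error (+ = overestimate)
mae = np.mean(np.abs(diff))              # mean absolute error
r2 = np.corrcoef(obs, era5)[0, 1] ** 2   # squared correlation coefficient

print(f"RMSE={rmse:.3f} MBE={mbe:.3f} MAE={mae:.3f} R2={r2:.2f}")
```

A near-zero MBE with a non-zero RMSE, as here, indicates scatter without a systematic bias.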

Keywords: solar energy, solar radiation, ERA-5, potential energy

Procedia PDF Downloads 195
24557 Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data

Authors: Fan Gao, Lior Pachter

Abstract:

The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We used the scATAK tool to explore the chromatin regulatory landscape of a healthy adult human brain and unveil cell-type-specific features, and we show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.

Keywords: single-cell, ATAC-seq, bioinformatics, open chromatin landscape, chromatin interactome

Procedia PDF Downloads 144
24556 Potential of Ozonation and Phytoremediation to Reduce Hydrocarbon Levels Remaining after the Pilot Scale Microbial Based Bioremediation (Land-Farming) of a Heavily Polluted Soil

Authors: Hakima Althalb

Abstract:

Petroleum contamination of sandy soils is a severe environmental problem in Libya, but relatively little work has been carried out to optimize the bioremediation of such heavily contaminated soil, particularly at a pilot scale. The purpose of this research was to determine the potential for the microbial-based bioremediation of hydrocarbon-contaminated soil obtained from an oil refinery in Libya, and to assess the potential of both ozonation and phytoremediation (both applied after the initial bioremediation) to reduce residual hydrocarbon levels. Plots containing 500 kg of soil (in triplicate; contaminated soil diluted with clean soil, 50% by volume), designated as Land Treatment Units (LTUs), were set up with five different nutrient levels and mixtures (urea + NPK (nitrogen, phosphorus, potassium) mixtures) to obtain a C:N:P ratio of 100:10:1, and monitored for 90 days. Hydrocarbon levels, microbial numbers, and toxicity (EC50, using luminescent microbial-based tests) were assessed. Hydrocarbon levels in non-diluted and diluted soil ranged from 20,733 to 22,366 mg/kg and from 16,000 to 17,000 mg/kg, respectively. Although all the land treatment units revealed a significant hydrocarbon reduction over time, the highest reduction in hydrocarbon levels obtained was around 60%; for example, 63% hydrocarbon removal was observed using a mixture of urea and NPK with a C:N:P ratio of 100:10:1. Soil toxicity (as assessed using luminescence-based toxicity assays) decreased in line with the observed reduction in total petroleum hydrocarbons. However, as relatively high residual TPH (total petroleum hydrocarbon) levels (ranging from 6,033 to 14,166 mg/kg) were still present after the initial bioremediation, two post-treatments (phytoremediation and ozonation) were attempted to remove the remaining hydrocarbons. Five locally grown (agriculturally important) plant species were tested.
The germination of all plants examined was strongly inhibited (80-100%), and seedlings failed to grow well in the contaminated soil, indicating that the previously bioremediated soils were still toxic to the plants. Subsequent ozonation followed by another round of bioremediation was more successful than phytoremediation, but even the most promising treatment in this study (ozonation for 6 hours at 25 ppm followed by bioremediation) still removed only approximately 31% of the residual hydrocarbons. Overall, this work showed that the bioremediation of such highly contaminated soils is difficult and that a combination of treatments would be required to achieve successful remediation. Even after initial dilution and bioremediation, the soils remained toxic to plant growth and were therefore not suitable for phytoremediation.

Keywords: bioremediation, petroleum hydrocarbons, ozone, phytoremediation

Procedia PDF Downloads 160
24555 Methylation Analysis of PHF20L1 and DACT2 Gene Promoters in Women with Breast Cancer

Authors: Marta E. Hernandez-Caballero, Veronica Borgonio-Cuadra, Antonio Miranda-Duarte, Xochitl Rojas-Toledo, Normand Garcia-Hernandez, Maura Cardenas-Garcia, Teresa Abad-Camacho

Abstract:

Breast cancer (BC) is the most common tumor in women worldwide. DNA methylation is an epigenetic modification critical at CpG sites, and aberrant methylation of CpG islands in promoters is a hallmark of cancer; gene expression can thus be regulated by alterations in DNA methylation. In cell lines, the DACT2 gene reduces the growth and migration of tumor cells through its participation in the suppression of TGFb/SMAD2/3 signaling. PHF20L1 is involved in histone acetylation and therefore regulates transcription. Our aim was to analyze the methylation status of the DACT2 and PHF20L1 promoter regions in tumoral and healthy mammary tissue from women with BC at different stages of progression. The study included 77 patients from Centro Medico Nacional La Raza in Mexico City. After identifying a CpG island in the DACT2 and PHF20L1 promoters, DNA methylation status was analyzed through sodium bisulfite conversion with subsequent amplification using methylation-specific PCR. Results revealed no differences in the methylation status of PHF20L1 between cancer stages (II and III) or in comparison to healthy tissue; the promoter was demethylated. DACT2 promoter methylation was not significantly different between tumoral stages (II, P = 0.37; III, P = 0.17) or compared with healthy tissue. Previous data reported DACT2 to be methylated in nasopharyngeal carcinoma, but in this study promoter methylation was not observed. The PHF20L1 protein contains N-terminal Tudor and C-terminal plant homeodomain domains; it has been suggested that it can stabilize DNMT1, thereby regulating DNA methylation, and it has therefore been associated with a poor prognosis in BC. We found no evidence of methylation of the PHF20L1 promoter in patients or controls, so its association with BC may have no direct relation with promoter methylation. More studies including other methylation sites in these genes in BC are necessary.

Keywords: bisulfite conversion, breast cancer, DACT2, DNA methylation, PHF20L1, tumoral status

Procedia PDF Downloads 284
24554 Green Synthesis, Characterization and Application of Zinc Oxide and Silver Oxide Nanoparticles

Authors: Nassima Khanfri, Ali Boucenna

Abstract:

As metallic nanoparticles are increasingly used in many economic sectors, there is interest in the biological and environmental safety of their production. The main methods of synthesizing nanoparticles are chemical and physical approaches that are often expensive and potentially harmful to the environment. The present study is devoted to the synthesis of silver and zinc oxide nanoparticles from silver nitrate and zinc acetate using basil plant extracts. The products obtained are characterized by various analytical techniques, such as UV/Vis, XRD, SEM-EDX, FTIR, and Raman spectroscopy. These analyses confirm the crystalline nature of the AgNPs and ZnONPs. These crystalline powders have effective antioxidant and antibacterial activities and could be used in several biological applications.

Keywords: green synthesis, bio-reduction, metal nanoparticles, plant extracts

Procedia PDF Downloads 184
24553 Meta Mask Correction for Nuclei Segmentation in Histopathological Image

Authors: Jiangbo Shi, Zeyu Gao, Chen Li

Abstract:

Nuclei segmentation is a fundamental task in digital pathology analysis and can be automated by deep learning-based methods. However, the development of such an automated method requires a large amount of data with precisely annotated masks, which are hard to obtain. Training with weakly labeled data is a popular solution for reducing the annotation workload. In this paper, we propose a novel meta-learning-based nuclei segmentation method which follows the label correction paradigm to leverage data with noisy masks. Specifically, we design a fully convolutional meta-model that corrects noisy masks by using a small amount of clean meta-data. The corrected masks are then used to supervise the training of the segmentation model. Meanwhile, a bi-level optimization method is adopted to alternately update the parameters of the main segmentation model and the meta-model. Extensive experimental results on two nuclei segmentation datasets show that our method achieves state-of-the-art results. In particular, in some noise scenarios it even exceeds the performance of training on supervised data.
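The alternating bi-level update can be illustrated on a toy scalar problem (a hypothetical stand-in for the paper's segmentation and meta networks, not its actual method): a meta-parameter that rescales systematically corrupted labels is updated by differentiating the meta-loss on clean data through one step of the main model's update.

```python
import numpy as np

# Toy sketch of alternating bi-level optimization (hypothetical stand-in for
# the paper's segmentation and meta networks): labels carry an unknown 0.5x
# corruption; meta-parameter `a` learns to correct them using a small clean
# meta-set and is updated through one step of the main update for `w`.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_clean = 3.0 * x                      # true relationship
y_noisy = 0.5 * y_clean                # systematically corrupted labels
x_meta, y_meta = x[:20], y_clean[:20]  # small clean meta-data

w, a = 0.0, 1.0                        # main weight, meta correction factor
lr_w, lr_a = 0.05, 0.05
for _ in range(2000):
    # inner step: gradient of training loss (w*x vs corrected labels a*y_noisy)
    g_w = 2.0 * np.mean((w * x - a * y_noisy) * x)
    w_new = w - lr_w * g_w
    # outer step: meta-loss of the updated model on clean data, differentiated
    # through the inner update (one-step approximation of the bi-level problem)
    dL_dw = 2.0 * np.mean((w_new * x_meta - y_meta) * x_meta)
    dw_da = lr_w * 2.0 * np.mean(x * y_noisy)
    a -= lr_a * dL_dw * dw_da
    w = w_new                          # commit the main-model update

print(round(a, 2), round(w, 2))  # a ≈ 2.0 undoes the 0.5x corruption; w ≈ 3.0
```

The one-step lookahead plays the role of the inner optimization; the paper's method applies the same alternation to network parameters rather than scalars.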

Keywords: deep learning, histopathological image, meta-learning, nuclei segmentation, weak annotations

Procedia PDF Downloads 126
24552 Laboratory Evaluation of Gilsonite Modified Bituminous Mixes

Authors: R. Vishnu, K. S. Reddy, Amrendra Kumar

Abstract:

The current guideline for the construction of flexible pavements in India, IRC 37: 2012, recommends the use of viscosity grade VG40 bitumen in both the wearing and binder bituminous layers. However, most bitumen production plants in India are unable to produce the air-blown VG40 grade bitumen, as this requires modification of the plant's air-blowing technique, which the manufacturers often find uneconomical. In this context, stiffer-grade bitumen can be produced if the bitumen is modified. Gilsonite, a naturally occurring asphalt, has been found to increase the stiffness of binders. The present study evaluates the physical and rheological characteristics of gilsonite-modified binders and the performance characteristics of mixes in which these binders are used.

Keywords: bitumen, gilsonite, stiffness, laboratory evaluation

Procedia PDF Downloads 451