Search results for: synthetic dataset
830 Chemopreventive Efficacy of CdCl2(C14H21N3O2) in Rat Colon Carcinogenesis Model Using Aberrant Crypt Foci (ACF) as Endpoint Marker
Authors: Maryam Hajrezaie, Mahmood Ameen Abdulla, Nazia AbdulMajid, Maryam Zahedifard
Abstract:
Colon cancer is one of the most prevalent cancers in the world. Cancer chemoprevention is defined as the use of natural or synthetic compounds capable of inducing biological mechanisms necessary to preserve genomic fidelity. New Schiff base compounds are reported to exhibit a wide spectrum of biological activities of therapeutic importance. To evaluate the inhibitory properties of the CdCl2(C14H21N3O2) complex on colonic aberrant crypt foci, five groups of 7-week-old male rats were used. The control group was fed 10% Tween 20 once a day, the cancer control group was intra-peritoneally injected with 15 mg/kg azoxymethane, the drug control group was injected with 15 mg/kg azoxymethane and 5-fluorouracil, and the experimental groups were fed 2.5 and 5 mg/kg of the CdCl2(C14H21N3O2) compound once a day. Administration of the compound was found to be effectively chemoprotective. The compound suppressed total colonic ACF formation by 72% and 74%, respectively, when compared with the control group. The results also showed a significant increase in glutathione peroxidase, superoxide dismutase and catalase activities and a decrease in malondialdehyde level. Immunohistochemical staining demonstrated down-regulation of PCNA protein. According to the Western blot comparison analysis, COX-2 and Bcl-2 are up-regulated whilst Bax is down-regulated. According to these data, this compound shows promising chemoprotective activity in a model of AOM-induced ACF.
Keywords: chemopreventive, Schiff base compound, aberrant crypt foci (ACF), immunohistochemical staining
Procedia PDF Downloads 400
829 Grey Wolf Optimization Technique for Predictive Analysis of Products in E-Commerce: An Adaptive Approach
Authors: Shital Suresh Borse, Vijayalaxmi Kadroli
Abstract:
E-commerce industries nowadays implement the latest AI and ML techniques to improve their performance and prediction accuracy, which helps them gain a huge profit from the online market. Ant Colony Optimization, Genetic Algorithms, Particle Swarm Optimization, Neural Networks and GWO help many e-commerce industries upgrade their predictive performance. These algorithms provide optimum results in various applications, such as stock price prediction, prediction of drug-target interactions and user ratings of similar products on e-commerce sites. In this study, customer reviews play an important role in prediction analysis. People show much interest in buying services and products suggested by other customers, which ultimately increases net profit. In this work, a convolutional neural network (CNN) is proposed to optimize the prediction accuracy of an e-commerce website. The CNN is used to optimize the hyperparameters of the GWO algorithm using an appropriate coding scheme. Model results are verified by comparing them to PSO results, whose hyperparameters have likewise been optimized by the CNN, on Amazon's customer review dataset. The experimental outcome shows that the proposed system using the GWO algorithm achieves superior performance in terms of accuracy, precision, recall, etc. in prediction analysis compared to existing systems.
Keywords: prediction analysis, e-commerce, machine learning, grey wolf optimization, particle swarm optimization, CNN
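As a rough illustration of the grey wolf optimizer referred to in this abstract, the sketch below implements only the standard GWO position-update loop on a toy objective; the CNN-based hyperparameter coding scheme described by the authors is not reproduced.

```python
# Minimal, illustrative sketch of the grey wolf optimization (GWO) update rule.
# The objective function and dimensionality are hypothetical stand-ins.
import numpy as np

def gwo_minimize(objective, dim, n_wolves=20, n_iter=100, bounds=(-1.0, 1.0)):
    lo, hi = bounds
    wolves = np.random.uniform(lo, hi, (n_wolves, dim))
    for t in range(n_iter):
        fitness = np.array([objective(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]
        a = 2 - 2 * t / n_iter                      # linearly decreases 2 -> 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = np.random.rand(dim), np.random.rand(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])  # distance to the leader
                new_pos += leader - A * D
            wolves[i] = np.clip(new_pos / 3.0, lo, hi)
    best = min(wolves, key=objective)
    return best, objective(best)

best, score = gwo_minimize(lambda x: np.sum(x ** 2), dim=5)
print(best, score)
```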
Procedia PDF Downloads 113
828 AutoML: Comprehensive Review and Application to Engineering Datasets
Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili
Abstract:
The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
Keywords: automated machine learning, uncertainty, engineering dataset, regression
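The following minimal sketch illustrates the core AutoML idea of automated model and hyperparameter selection using plain scikit-learn on a synthetic dataset; the AutoML platforms surveyed in such work additionally automate feature engineering and neural architecture search.

```python
# Sketch of automated model/hyperparameter selection in the spirit of AutoML.
# The dataset is synthetic and the search space is deliberately tiny.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVR

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

search_space = [
    (Ridge(), {"alpha": [0.1, 1.0, 10.0]}),
    (SVR(), {"C": [1.0, 10.0], "kernel": ["rbf", "linear"]}),
    (RandomForestRegressor(random_state=0), {"n_estimators": [100, 300]}),
]

best_model, best_score = None, -float("inf")
for estimator, grid in search_space:
    search = GridSearchCV(estimator, grid, cv=5).fit(X_tr, y_tr)
    if search.best_score_ > best_score:
        best_model, best_score = search.best_estimator_, search.best_score_

print(type(best_model).__name__, best_model.score(X_te, y_te))
```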
Procedia PDF Downloads 61
827 Rheological and Computational Analysis of Crude Oil Transportation
Authors: Praveen Kumar, Satish Kumar, Jashanpreet Singh
Abstract:
Transportation of unrefined crude oil from the production unit to a refinery or large storage area by pipeline is difficult due to the different properties of crude in various areas. Thus, the design of a crude oil pipeline is a very complex and time-consuming process when considering all the various parameters. Three very important parameters play a significant role in transportation and processing pipeline design: the viscosity profile, the temperature profile and the velocity profile of waxy crude oil through the crude oil pipeline. Knowledge of rheological computational techniques is required for better understanding the flow behavior and predicting the flow profile in a crude oil pipeline. From these profile parameters, the material and the emulsion best suited for crude oil transportation can be predicted. The rheological computational fluid dynamic technique is a fast method for designing the flow profile in a crude oil pipeline with the help of computational fluid dynamics and rheological modeling. With this technique, the effect of fluid properties, including shear rate range with temperature variation, degree of viscosity, elastic modulus and viscous modulus, was evaluated under different conditions in a transport pipeline. In this paper, two crude oil samples were used, as well as a prepared emulsion with natural and synthetic additives at concentrations ranging from 1,000 ppm to 3,000 ppm. The rheological properties were then evaluated at a temperature range of 25 to 60 °C to determine which additive was best suited for transportation of crude oil. Commercial computational fluid dynamics (CFD) software has been used to generate the flow, velocity and viscosity profiles of the emulsions for flow behavior analysis in a crude oil transportation pipeline. This rheological CFD design can be further applied in developing pipeline designs in the future.
Keywords: surfactant, natural, crude oil, rheology, CFD, viscosity
Procedia PDF Downloads 454
826 Multi-Atlas Segmentation Based on Dynamic Energy Model: Application to Brain MR Images
Authors: Jie Huo, Jonathan Wu
Abstract:
Segmentation of anatomical structures in medical images is essential for scientific inquiry into the complex relationships between biological structure and clinical diagnosis, treatment and assessment. As a method of incorporating prior knowledge and the anatomical structure similarity between a target image and atlases, multi-atlas segmentation has been successfully applied in segmenting a variety of medical images, including brain, cardiac, and abdominal images. The basic idea of multi-atlas segmentation is to transfer the labels in atlases to the coordinates of the target image by matching the target patch to the atlas patch in the neighborhood. However, this technique is limited by the pairwise registration between the target image and atlases. In this paper, a novel multi-atlas segmentation approach is proposed by introducing a dynamic energy model. First, the target is mapped to each atlas image by minimizing the dynamic energy function; then the segmentation of the target image is generated by weighted fusion based on the energy. The method is tested on the MICCAI 2012 Multi-Atlas Labeling Challenge dataset, which includes 20 target images and 15 atlas images. The paper also analyzes the influence of different parameters of the dynamic energy model on the segmentation accuracy and measures the Dice coefficient obtained by using different feature terms with the energy model. The highest mean Dice coefficient obtained with the proposed method is 0.861, which is competitive compared with recently published methods.
Keywords: brain MRI segmentation, dynamic energy model, multi-atlas segmentation, energy minimization
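A small illustrative sketch of the two generic ingredients named above, the Dice coefficient and energy-weighted label fusion, using synthetic label maps rather than the MICCAI 2012 data; the paper's actual dynamic energy function is not reproduced.

```python
# Dice coefficient and a simple energy-weighted label fusion on synthetic masks.
import numpy as np

def dice(seg_a, seg_b):
    """Dice coefficient between two binary segmentations."""
    inter = np.logical_and(seg_a, seg_b).sum()
    return 2.0 * inter / (seg_a.sum() + seg_b.sum())

def weighted_fusion(atlas_labels, energies):
    """Fuse binary atlas labels with weights that decrease with the energy."""
    weights = np.exp(-np.asarray(energies, dtype=float))
    weights /= weights.sum()
    fused = np.tensordot(weights, np.asarray(atlas_labels, dtype=float), axes=1)
    return fused > 0.5

atlases = [np.random.rand(64, 64) > 0.5 for _ in range(15)]   # placeholder label maps
energies = np.random.rand(15)                                 # placeholder energies
fused = weighted_fusion(atlases, energies)
print(dice(fused, atlases[0]))
```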
Procedia PDF Downloads 336
825 GeneNet: Temporal Graph Data Visualization for Gene Nomenclature and Relationships
Authors: Jake Gonzalez, Tommy Dang
Abstract:
This paper proposes a temporal graph approach to visualize and analyze the evolution of gene relationships and nomenclature over time. An interactive web-based tool implements this temporal graph, enabling researchers to traverse a timeline and observe coupled dynamics in network topology and naming conventions. Analysis of a real human genomic dataset reveals the emergence of densely interconnected functional modules over time, representing groups of genes involved in key biological processes. For example, the antimicrobial peptide DEFA1A3 shows increased connections to related alpha-defensins involved in infection response. Tracking degree and betweenness centrality shifts over timeline iterations also quantitatively highlights the reprioritization of certain genes’ topological importance as knowledge advances. Examination of the CNR1 gene encoding the cannabinoid receptor CB1 demonstrates changing synonymous relationships and consolidating naming patterns over time, reflecting the discovery of its unique functional role. The integrated framework interconnecting these topological and nomenclature dynamics provides richer contextual insights compared to isolated analysis methods. Overall, this temporal graph approach enables a more holistic study of knowledge evolution to elucidate complex biology.
Keywords: temporal graph, gene relationships, nomenclature evolution, interactive visualization, biological insights
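A brief sketch of how degree and betweenness centrality can be tracked across timeline snapshots with networkx; the edge lists are invented placeholders, not the genomic dataset analyzed in the paper.

```python
# Track centrality of a gene of interest across temporal graph snapshots.
import networkx as nx

snapshots = {
    2018: [("DEFA1A3", "DEFA1"), ("DEFA1A3", "DEFA3")],
    2020: [("DEFA1A3", "DEFA1"), ("DEFA1A3", "DEFA3"), ("DEFA1A3", "DEFA4"),
           ("CNR1", "CNR2")],
}

for year, edges in snapshots.items():
    g = nx.Graph(edges)
    deg = nx.degree_centrality(g)
    btw = nx.betweenness_centrality(g)
    print(year, deg.get("DEFA1A3"), btw.get("DEFA1A3"))
```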
Procedia PDF Downloads 61
824 Study on the Relationship between the Urban Geography and Urban Agglomeration to the Effects of Carbon Emissions
Authors: Peng-Shao Chen, Yen-Jong Chen
Abstract:
In recent years, global warming, dramatic changes in energy prices and the exhaustion of natural resources have illustrated that energy-related topics cannot be ignored. Although the relationship between cities and CO₂ emissions has been extensively studied in recent years, little attention has been paid to differences in the geographical location of the city. However, the geographical climate has a great impact on lifestyle from city to city, such as the type of buildings, the major industry of the city, etc. Therefore, this paper empirically investigates the effects of various urban factors on CO₂ emissions with consideration of the different geographic and climatic zones in which cities are located. Using a regression model and a dataset of urban agglomerations in East Asian cities with over one million population, covering the three years 2005, 2010, and 2015, the findings suggest that the impact of urban factors on CO₂ emissions varies with the latitude of the cities. Surprisingly, all kinds of urban factors, including the urban population, the share of GDP in the service industry, per capita income, and others, have different levels of impact on cities located in the tropical climate zone and the temperate climate zone. The results of the study analyze the impact of different urban factors on CO₂ emissions in urban areas across different geographical climate zones. These findings will be helpful for the formulation of relevant policies by urban planners and policy makers in different regions.
Keywords: carbon emissions, urban agglomeration, urban factor, urban geography
Procedia PDF Downloads 267
823 Burnout Recognition for Call Center Agents by Using Skin Color Detection with Hand Poses
Authors: El Sayed A. Sharara, A. Tsuji, K. Terada
Abstract:
Call centers have been expanding, and their influence on activity in various markets is increasing. A call center’s work is known as one of the most demanding and stressful jobs. In this paper, we propose a fatigue detection system to detect burnout of call center agents in the case of neck pain and upper back pain. Our proposed system is based on a computer vision technique that combines skin color detection with the Viola-Jones object detector. To recognize the gesture of hand poses caused by stress signs, the YCbCr color space is used to detect the skin color region, including the face and hand poses around the areas related to neck ache and upper back pain. A cascade of classifiers by Viola-Jones is used for face recognition to extract the face from the skin color region. The detection of hand poses then enables the evaluation of neck pain and upper back pain by combining skin color detection with the face recognition method. The system performance is evaluated using two groups of datasets created in the laboratory to simulate a call center environment. Our call center agent burnout detection system has been implemented using a web camera and has been processed in MATLAB. From the experimental results, our system achieved 96.3% for upper back pain detection and 94.2% for neck pain detection.
Keywords: call center agents, fatigue, skin color detection, face recognition
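A minimal sketch of the two building blocks described above, YCbCr skin-color thresholding and Viola-Jones face detection, using OpenCV; the Cr/Cb bounds are typical literature values, and the file name and the stress-sign decision rule are assumptions rather than the authors' tuned system (which was implemented in MATLAB).

```python
# YCbCr skin mask + Haar-cascade face detection as a rough burnout-sign pipeline.
import cv2

frame = cv2.imread("agent_frame.jpg")            # hypothetical input frame
ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))  # common Cr/Cb skin range

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Skin-colored pixels outside the detected face approximate hand poses near the
# neck/upper back; their area could then be thresholded as a stress indicator.
for (x, y, w, h) in faces:
    skin_mask[y:y + h, x:x + w] = 0
print(len(faces), int(cv2.countNonZero(skin_mask)))
```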
Procedia PDF Downloads 293
822 Quality Parameters of Offset Printing Wastewater
Authors: Kiurski S. Jelena, Kecić S. Vesna, Aksentijević M. Snežana
Abstract:
Samples of tap water and wastewater were collected in three offset printing facilities in Novi Sad, Serbia. Ten physicochemical parameters were analyzed within all collected samples: pH, conductivity, m-alkalinity, p-alkalinity, acidity, carbonate concentration, hydrogen carbonate concentration, active oxygen content, chloride concentration and total alkali content. All measurements were conducted using standard analytical and instrumental methods. Comparing the obtained results for tap water and wastewater, a clear quality difference was noticeable, since all physicochemical parameters were significantly higher within the wastewater samples. The study also involves the application of simple linear regression analysis on the obtained dataset. Using the software package ORIGIN 5, the pH value was correlated with the other physicochemical parameters. Based on the obtained values of the Pearson coefficient, a strong correlation between chloride concentration and pH (r = -0.943), as well as between acidity and pH (r = -0.855), was determined. In addition, a statistically significant difference was obtained only between acidity and chloride concentration with pH values, since the values of parameter F (247.634 and 182.536) were higher than Fcritical (5.59). In this way, the results of the statistical analysis highlighted the most influential parameters of water contamination in offset printing, in the form of acidity and chloride concentration. The results showed that the variable dependence could be represented by the general regression model y = a0 + a1x + k, which further resulted in matching graphic regressions.
Keywords: pollution, printing industry, simple linear regression analysis, wastewater
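A short sketch of the simple linear regression step, correlating pH with one other physicochemical parameter using scipy; the measurements below are placeholders, not the study's data.

```python
# Pearson correlation and simple linear regression of pH vs. chloride concentration.
import numpy as np
from scipy import stats

ph       = np.array([6.9, 7.1, 7.4, 7.8, 8.0, 8.3])
chloride = np.array([310., 280., 240., 200., 170., 140.])   # mg/L, invented values

r, p_value = stats.pearsonr(ph, chloride)
fit = stats.linregress(ph, chloride)        # chloride = slope * pH + intercept
print(f"r = {r:.3f}, p = {p_value:.4f}")
print(f"chloride ~ {fit.slope:.1f} * pH + {fit.intercept:.1f}")
```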
Procedia PDF Downloads 235
821 A Bibliometric Analysis on Filter Bubble
Authors: Misbah Fatma, Anam Saiyeda
Abstract:
This analysis charts the introduction and expansion of research into the filter bubble phenomenon over the last 10 years using a large dataset of academic publications. This bibliometric study demonstrates how interdisciplinary filter bubble research is. The identification of key authors and organizations leading filter bubble study sheds light on collaborative networks and knowledge transfer. Relevant papers are organized based on themes including algorithmic bias, polarisation, social media, and ethical implications through a systematic examination of the literature. In order to shed light on how these themes have changed over time, the study traces their historical development. The study also looks at how research is distributed globally, showing geographic patterns and discrepancies in scholarly output. The results of this bibliometric analysis allow us to fully comprehend the development and reach of filter bubble research. By exposing dominant themes, interdisciplinary collaborations, and geographic patterns, this study offers insights into the ongoing discussion surrounding information personalization and its implications for societal discourse, democratic participation, and the potential risks to an informed citizenry. In order to solve the problems caused by filter bubbles and to advance a more diverse and inclusive information environment, this analysis is essential for scholars and researchers.
Keywords: bibliometric analysis, social media, social networking, algorithmic personalization, self-selection, content moderation policies and limited access to information, recommender system and polarization
Procedia PDF Downloads 118
820 Melanoma and Non-Melanoma, Skin Lesion Classification, Using a Deep Learning Model
Authors: Shaira L. Kee, Michael Aaron G. Sy, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar AlDahoul
Abstract:
Skin diseases are considered the fourth most common disease, with melanoma and non-melanoma skin cancer being the most common types of cancer in Caucasians. The alarming increase in skin cancer cases shows an urgent need for further research to improve diagnostic methods, as early diagnosis can significantly improve the 5-year survival rate. Machine learning algorithms for image pattern analysis in diagnosing skin lesions can dramatically increase the accuracy rate of detection and decrease possible human errors. Several studies have shown that the diagnostic performance of computer algorithms outperforms that of dermatologists. However, existing methods still need improvements to reduce diagnostic errors and generate efficient and accurate results. Our paper proposes an ensemble method to classify dermoscopic images into benign and malignant skin lesions. The experiments were conducted using the International Skin Imaging Collaboration (ISIC) image samples. The dataset contains 3,297 dermoscopic images with benign and malignant categories. The results show improvement in performance with an accuracy of 88% and an F1 score of 87%, outperforming other existing models such as support vector machine (SVM), residual network (ResNet50), EfficientNetB0, EfficientNetB4, and VGG16.
Keywords: deep learning, VGG16, EfficientNet, CNN, ensemble, dermoscopic images, melanoma
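A compact sketch of the ensemble idea, averaging softmax outputs of several backbones and scoring with accuracy and F1; the probabilities below are invented examples, not ISIC predictions, and the exact ensemble composition is an assumption.

```python
# Average per-model class probabilities, then score the fused predictions.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def ensemble_predict(probabilities):
    """probabilities: list of (n_samples, n_classes) softmax arrays."""
    return np.mean(probabilities, axis=0).argmax(axis=1)

# Hypothetical per-model outputs for 4 dermoscopic images (benign=0, malignant=1).
p_model_a = np.array([[0.8, 0.2], [0.3, 0.7], [0.6, 0.4], [0.1, 0.9]])
p_model_b = np.array([[0.7, 0.3], [0.2, 0.8], [0.4, 0.6], [0.2, 0.8]])
y_true = np.array([0, 1, 0, 1])

y_pred = ensemble_predict([p_model_a, p_model_b])
print(accuracy_score(y_true, y_pred), f1_score(y_true, y_pred))
```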
Procedia PDF Downloads 81
819 When Sex Matters: A Comparative Generalized Structural Equation Model (GSEM) for the Determinants of Stunting Amongst Under-fives in Uganda
Authors: Vallence Ngabo M., Leonard Atuhaire, Peter Clever Rutayisire
Abstract:
The main aim of this study was to establish the differences in both the determinants of stunting and the causal mechanisms through which the identified determinants influence stunting amongst male and female under-fives in Uganda. Literature shows that male children below the age of five years are at a higher risk of being stunted than their female counterparts. Specifically, studies in Uganda indicate that being a male child is positively associated with stunting, while being a female is negatively associated with stunting. Data for 904 male and 829 female under-fives were extracted from the UDHS-2016 survey dataset. Key variables for this study were identified and used in generating relevant models and paths. Structural equation modeling techniques were used in their generalized form (GSEM). The generalized nature necessitated specifying both the family and link functions for each response variable in the system of the model. The sex of the child (b4) was used as a grouping factor, and the height-for-age (HAZ) scores were used to construct the stunting status of under-fives. The estimated models and paths clearly indicated that the set of underlying factors that influence male and female under-fives was different, and the paths through which they influence stunting were different. However, some of the determinants that influenced stunting amongst male under-fives also influenced stunting amongst female under-fives. To reduce the stunting problem to the desirable state, it is important to consider the multifaceted and complex nature of the risk factors that influence stunting amongst under-fives but, more importantly, to consider the different sex-specific factors and the causal mechanisms or paths through which they influence stunting.
Keywords: stunting, under-fives, sex of the child, GSEM, causal mechanism
Procedia PDF Downloads 140
818 Fluoranthene Removal in Wastewater Using Biological and Physico-Chemical Methods
Authors: Angelica Salmeron Alcocer, Deifilia Ahuatzi Chacon, Felipe Rodriguez Casasola
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are produced naturally (forest fires, volcanic eruptions) and by human activity (burning fossil fuels). Concern about PAHs is due to their toxic, mutagenic and carcinogenic effects, so they pose a potential risk to human health and ecology. They are considered the most toxic components of oil and are highly hydrophobic, making them easily deposited in soil, air and water. One method of removing PAHs from contaminated soil uses surfactants such as Tween 80, which has been reported as less toxic and also increases the solubility of PAHs compared to other surfactants. Fluoranthene is a PAH with molecular formula C16H10; its name derives from the fluorescence it presents under UV light. In this paper, a study of the removal of fluoranthene solubilized with Tween 80 in synthetic wastewater, using a microbial community (isolated from soil of coffee plantations in the state of Veracruz, Mexico) and the Fenton oxidation method, was performed. The microbial community was able to use both Tween 80 and fluoranthene as carbon sources for growth. When the biological treatment in batch culture was applied, 100% of the fluoranthene was mineralized; this only occurred at an initial concentration of 100 ppm. By increasing the initial concentration of fluoranthene, the removal efficiencies decay and the degradation time increases due to the accumulation of byproducts that are more toxic or less biodegradable. However, when Fenton oxidation was applied prior to the biological treatment, it was observed that the removal of fluoranthene improved, because it was consumed approximately 2.4 times faster.
Keywords: fluoranthene, polycyclic aromatic hydrocarbons, biological treatment, Fenton oxidation
Procedia PDF Downloads 239
817 The Relationship between Operating Condition and Sludge Wasting of an Aerobic Suspension-Sequencing Batch Reactor (ASSBR) Treating Phenolic Wastewater
Authors: Ali Alattabi, Clare Harris, Rafid Alkhaddar, Ali Alzeyadi
Abstract:
Petroleum refinery wastewater (PRW) can be considered one of the most significant sources of aquatic environmental pollution. It consists of oil and grease along with many other toxic organic pollutants. In recent years, a new technique was implemented using different types of membranes and sequencing batch reactors (SBRs) to treat PRW. The SBR is a fill-and-draw type sludge system which operates in time instead of space. Many researchers have optimised SBR operating conditions to obtain maximum removal of undesired wastewater pollutants. The SBR has gained importance mainly because of its essential flexibility in cycle time; it can handle shock loads, requires less area for operation and is easy to operate. However, bulking sludge, or discharging floating or settled sludge during the draw or decant phase with some SBR configurations, is still one of the problems of the SBR system. The main aim of this study is to develop an innovative design for the SBR, optimising the process variables to result in a more robust and efficient process. Several experimental tests will be developed to determine the removal percentages of chemical oxygen demand (COD), phenol and nitrogen compounds from synthetic PRW. Furthermore, the dissolved oxygen (DO), pH and oxidation-reduction potential (ORP) of the SBR system will be monitored online to ensure a good environment for the microorganisms to biodegrade the organic matter effectively.
Keywords: petroleum refinery wastewater, sequencing batch reactor, hydraulic retention time, phenol, COD, mixed liquor suspended solids (MLSS)
Procedia PDF Downloads 260
816 Modeling Core Flooding Experiments for CO₂ Geological Storage Applications
Authors: Avinoam Rabinovich
Abstract:
CO₂ geological storage is a proven technology for reducing anthropogenic carbon emissions, which is paramount for achieving the ambitious net zero emissions goal. Core flooding experiments are an important step in any CO₂ storage project, allowing us to gain information on the flow of CO₂ and brine in the porous rock extracted from the reservoir. This information is important for understanding basic mechanisms related to CO₂ geological storage as well as for reservoir modeling, which is an integral part of a field project. In this work, a different method for constructing accurate models of CO₂-brine core flooding will be presented. Results for synthetic cases and real experiments will be shown and compared with numerical models to exhibit their predictive capabilities. Furthermore, the various mechanisms which impact the CO₂ distribution and trapping in the rock samples will be discussed, and examples from models and experiments will be provided. The new method entails solving an inverse problem to obtain a three-dimensional permeability distribution which, along with the relative permeability and capillary pressure functions, constitutes a model of the flow experiments. The model is more accurate when data from a number of experiments are combined to solve the inverse problem. This model can then be used to test various other injection flow rates and fluid fractions which have not been tested in experiments. The models can also be used to bridge the gap between small-scale capillary heterogeneity effects (sub-core and core scale) and large-scale (reservoir scale) effects, known as the upscaling problem.
Keywords: CO₂ geological storage, residual trapping, capillary heterogeneity, core flooding, CO₂-brine flow
Procedia PDF Downloads 70
815 Culturable Diversity of Halophilic Bacteria in Chott Tinsilt, Algeria
Authors: Nesrine Lenchi, Salima Kebbouche-Gana, Laddada Belaid, Mohamed Lamine Khelfaoui, Mohamed Lamine Gana
Abstract:
Saline lakes are extreme hypersaline environments that are considered five to ten times saltier than seawater (150–300 g L-1 salt concentration). Hypersaline regions differ from each other in terms of salt concentration, chemical composition and geographical location, which determine the nature of the inhabitant microorganisms. In order to explore the diversity of moderately and extremely halophilic Bacteria in Chott Tinsilt (East of Algeria), an isolation program was performed. First, water samples were collected from the saltern during the pre-salt harvesting phase. Salinity, pH and temperature of the sampling site were determined in situ. Chemical analysis of the water sample indicated that Na+ and Cl- were the most abundant ions. Isolates were obtained by plating out the samples in complex and synthetic media. In this study, seven halophilic bacterial cultures were isolated. Isolates were studied for Gram’s reaction, cell morphology and pigmentation. Enzymatic assays (oxidase, catalase, nitrate reductase and urease) and optimization of growth conditions were done. The results indicated that the salinity optima varied from 50 to 250 g L-1, whereas the optimum temperature ranged from 25°C to 35°C. Molecular identification of the isolates was performed by sequencing the 16S rRNA gene. The results showed that these cultured isolates included members belonging to the Halomonas, Staphylococcus, Salinivibrio, Idiomarina, Halobacillus, Thalassobacillus and Planococcus genera and what may represent a new bacterial genus.
Keywords: bacteria, Chott, halophilic, 16S rRNA
Procedia PDF Downloads 281
814 RNA-Seq Based Transcriptomic Analysis of Wheat Cultivars for Unveiling of Genomic Variations and Isolation of Drought Tolerant Genes for Genome Editing
Authors: Ghulam Muhammad Ali
Abstract:
Unveiling of genes involved in drought tolerance and root architecture using transcriptomic analyses has remained fragmented, limiting further improvement of wheat through genome editing. The purpose of this research endeavor was to unveil the variations in different genes implicated in drought tolerance and root architecture in wheat through RNA-seq data analysis. In this study, 8-day-old seedlings of 6 wheat cultivars, namely Batis, Blue Silver, Local White, UZ888, Chakwal 50 and synthetic wheat S22, were subjected to transcriptomic analysis for root and shoot genes. A total of 12 RNA samples were sequenced by Illumina. Using updated wheat transcripts from Ensembl and IWGC references with 54,175 gene models, we found that 49,621 out of 54,175 (91.5%) genes are expressed at an RPKM of 0.1 or more (in at least 1 sample). The number of genes expressed was higher in Local White than in Batis. Differentially expressed genes (DEG) were higher in Chakwal 50. Expression-based clustering indicated conserved function of DRO1 and RPK1 between Arabidopsis and wheat. A dendrogram showed that Local White is sister to Chakwal 50, while Batis is closely related to Blue Silver. This study highlights transcriptomic sequence variations in different cultivars that showed mutations in genes associated with drought that may directly contribute to drought tolerance. The DRO1 and RPK1 genes were isolated for genome editing. These genes are being edited in wheat through CRISPR-Cas9 for yield enhancement.
Keywords: transcriptomic, wheat, genome editing, drought, CRISPR-Cas9, yield enhancement
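A small sketch of the RPKM normalization behind the expression threshold quoted above (RPKM of 0.1 or more); the read counts and gene lengths are invented examples.

```python
# RPKM = reads * 1e9 / (gene length in bp * total mapped reads).
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads Per Kilobase of transcript per Million mapped reads."""
    return read_count * 1e9 / (gene_length_bp * total_mapped_reads)

total_reads = 25_000_000
genes = {"DRO1": (480, 2_100), "RPK1": (1_250, 3_400)}   # (reads, length in bp), invented

for gene, (count, length) in genes.items():
    value = rpkm(count, length, total_reads)
    print(gene, round(value, 2), "expressed" if value >= 0.1 else "not expressed")
```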
Procedia PDF Downloads 147
813 Bioremediation of Paper Mill Effluent by Microbial Consortium Comprising Bacterial and Fungal Strain and Optimizing the Effect of Carbon Source
Authors: Priya Tomar, Pallavi Mittal
Abstract:
Bioremediation has been recognized as an environmentally friendly and less expensive method which involves natural processes resulting in the efficient conversion of hazardous compounds into innocuous products. Pulp and paper mill effluent is one of the most polluting effluents amongst those obtained from polluting industries. The colouring bodies present in the wastewater from pulp and paper mills are organic in nature and are comprised of wood extractives, tannin, resins, synthetic dyes, lignin, and its degradation products formed by the action of chlorine on lignin, which impart an offensive colour to the water. These mills use different chemical processes for paper manufacturing, due to which lignified chemicals are released into the environment. Therefore, the chemical oxygen demand (COD) of the emanating stream is quite high. To address this problem, we present some new techniques that were developed to improve the treatment of paper mill effluents. In the present study, we utilized consortia of fungal and bacterial strains, with the treatments named C1, C2, and C3, for the decolourization of paper mill effluent. During the study, the role of a carbon source, i.e. glucose, was studied for decolourization. From the results, it was observed that a maximum colour reduction of 66.9%, a COD reduction of 51.8%, a TSS reduction of 0.34%, a TDS reduction of 0.29% and a pH change of 4.2 was achieved by the consortium of Aspergillus niger with Pseudomonas aeruginosa. The data indicated that the consortium of Aspergillus niger with Pseudomonas aeruginosa gives better results with glucose.
Keywords: bioremediation, decolourization, black liquor, mycoremediation
Procedia PDF Downloads 410
812 Unattended Crowdsensing Method to Monitor the Quality Condition of Dirt Roads
Authors: Matias Micheletto, Rodrigo Santos, Sergio F. Ochoa
Abstract:
In developing countries, most roads in rural areas are dirt roads. They require frequent maintenance since they are affected by erosive events, such as rain or wind, and by the transit of heavy-weight trucks and machinery. Early detection of damage to the road condition is a key aspect, since it allows reducing the maintenance time and cost, and also the limitations for other vehicles to travel through. Most proposals that help address this problem require the explicit participation of drivers, a permanent internet connection, or important instrumentation in vehicles or roads. These constraints limit the suitability of these proposals when applied in developing regions, as in Latin America. This paper proposes an alternative method, based on unattended crowdsensing, to determine the quality of dirt roads in rural areas. This method involves the use of a mobile application that complements the road condition surveys carried out by organizations in charge of road network maintenance, giving them early warnings about road areas that could require maintenance. Drivers can also take advantage of the early warnings while they move along these roads. The method was evaluated using information from a public dataset. Although preliminary, the results indicate the proposal is potentially suitable to provide awareness about dirt road conditions to drivers, transportation authorities and road maintenance companies.
Keywords: dirt roads automatic quality assessment, collaborative system, unattended crowdsensing method, roads quality awareness provision
Procedia PDF Downloads 199
811 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity
Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Saifur Rahman Sabuj
Abstract:
This paper examines relationships between solar activity and earthquakes; it applies machine learning techniques: k-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
Keywords: k-nearest neighbour, support vector regression, random forest regression, long short-term memory network, earthquakes, solar activity, sunspot number, solar wind, solar flares
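An illustrative sketch comparing three of the regressors named above on a synthetic stand-in for the solar-activity features; the LSTM network and the real SILSO/NOAA/GOES/USGS data are not reproduced here.

```python
# Compare k-NN, SVR and random forest regression on synthetic feature data.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

# The four features could stand for daily sunspot number, solar wind velocity,
# proton density and proton temperature; the target for earthquake frequency.
X, y = make_regression(n_samples=1000, n_features=4, noise=5.0, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

models = {
    "k-NN": KNeighborsRegressor(n_neighbors=7),
    "SVR": SVR(C=10.0),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, mean_absolute_error(y_te, model.predict(X_te)))
```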
Procedia PDF Downloads 73
810 An Attempt at the Multi-Criterion Classification of Small Towns
Authors: Jerzy Banski
Abstract:
The basic aim of this study is to discuss and assess different classifications of and research approaches to small towns that take their social and economic functions into account, as well as relations with surrounding areas. The subject literature typically includes three types of approaches to the classification of small towns: 1) the structural, 2) the location-related, and 3) the mixed. The structural approach allows for the grouping of towns from the point of view of the social, cultural and economic functions they discharge. The location-related approach draws on the idea of there being a continuum between the center and the periphery. A mixed classification, making simultaneous use of the different research approaches, brings the most information to bear in regard to categories of the urban locality. Bearing in mind these approaches to classification, it is possible to propose a synthetic method for classifying small towns that takes account of economic structure, location and the relationship between the towns and their surroundings. In the case of economic structure, the small centers may be divided into two basic groups: those featuring a multi-branch structure and those that are economically specialized. A second element of the classification reflects the locations of urban centers. Two basic types can be identified: the small town within the range of impact of a large agglomeration, or else the town outside such areas, which is to say located peripherally. The third component of the classification arises out of small towns’ relations with their surroundings. In consequence, it is possible to indicate 8 types of small town: from local centers enjoying good accessibility and a multi-branch economic structure to peripheral supra-local centers characterised by a specialized economic structure.
Keywords: small towns, classification, functional structure, localization
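A tiny sketch showing how the three classification criteria combine into the eight small-town types indicated above; the labels are paraphrased from the text, not the study's formal typology.

```python
# Enumerate the 2 x 2 x 2 = 8 combinations of the three classification criteria.
from itertools import product

structure = ["multi-branch economy", "specialized economy"]
location  = ["within a large agglomeration's range", "peripheral"]
relation  = ["local centre", "supra-local centre"]

for i, (s, l, r) in enumerate(product(structure, location, relation), start=1):
    print(f"Type {i}: {r}, {l}, {s}")
```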
Procedia PDF Downloads 182
809 Multimodal Deep Learning for Human Activity Recognition
Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja
Abstract:
In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people’s daily lives as it has the ability to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR, and these approaches are classified into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This research paper highlights the importance of multimodal data fusion, from skeleton data obtained from videos and data generated by embedded sensors, using deep neural networks for achieving HAR. We propose a deep multimodal fusion network based on a two-stream architecture. The two streams use a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process skeleton data and data generated by embedded sensors, and fusion at the feature level is considered. The proposed model was evaluated on the public OPPORTUNITY++ dataset and produced an accuracy of 96.77%.
Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness
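A minimal sketch of a two-stream CNN-BiLSTM with feature-level fusion in the spirit of the architecture described above; the input shapes, layer sizes and number of activity classes are illustrative assumptions, not the values tuned on OPPORTUNITY++.

```python
# Two input streams (skeleton sequences and embedded-sensor sequences), each
# passed through Conv1D + BiLSTM, then concatenated at the feature level.
from tensorflow.keras import Input, Model, layers

def cnn_bilstm_stream(inputs):
    x = layers.Conv1D(64, kernel_size=3, activation="relu")(inputs)
    x = layers.MaxPooling1D(2)(x)
    return layers.Bidirectional(layers.LSTM(64))(x)

skeleton_in = Input(shape=(100, 75), name="skeleton")   # e.g. 25 joints x 3 coordinates
sensor_in   = Input(shape=(100, 9), name="imu")         # e.g. accel/gyro/mag axes

fused = layers.concatenate([cnn_bilstm_stream(skeleton_in),
                            cnn_bilstm_stream(sensor_in)])   # feature-level fusion
hidden = layers.Dense(128, activation="relu")(fused)
out = layers.Dense(17, activation="softmax")(hidden)         # assumed class count

model = Model([skeleton_in, sensor_in], out)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```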
Procedia PDF Downloads 101
808 Targeting NLRP3 Inflammasome Activation: A New Mechanism Underlying the Protective Effects of Nafamostat Against Acute Pancreatitis
Authors: Jiandong Ren, Lijun Zhao, Peng Chen
Abstract:
Nafamostat (NA), a synthetic broad-spectrum serine protease inhibitor, has been routinely employed for the treatment of acute pancreatitis (AP) and other inflammation-associated diseases in some East Asian countries. Although the potent inhibitory activity against inflammation-related proteases such as thrombin, trypsin, kallikrein, plasmin, coagulation factors and complement factors is generally considered to be responsible for the anti-inflammatory effects of NA, the precise target and molecular mechanism underlying the anti-inflammatory activity in the treatment of AP remain largely unknown. As an intracellular inflammatory signaling platform, the NOD-like receptor protein 3 (NLRP3) inflammasome has recently been identified to be involved in the development of AP. In the present study, we have revealed that NA alleviated pancreatic injury in a caerulein-induced AP model by inhibiting NLRP3 inflammasome activation in the pancreas. Mechanistically, NA interacted with HDAC6, a cytoplasmic deacetylase implicated in the NLRP3 inflammasome pathway, and efficiently abrogated the function of HDAC6. This property enabled NA to influence HDAC6-dependent NF-κB transcriptional activity and thus block NF-κB-driven transcriptional priming of the NLRP3 inflammasome. Moreover, NA exerted the potential to interfere with HDAC6-mediated intracellular transport of NLRP3, thereby leading to the failure of NLRP3 inflammasome activation. Our current work has provided valuable insight into the molecular mechanism underlying the immunomodulatory effect of NA in the treatment of AP, highlighting its promising application in the prevention of NLRP3 inflammasome-associated inflammatory pathological damage.
Keywords: acute pancreatitis, HDAC6, nafamostat, NLRP3 inflammasome
Procedia PDF Downloads 70
807 Glaucoma Detection in Retinal Tomography Using the Vision Transformer
Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan
Abstract:
Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the disease can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data is generated by augmenting ocular pictures. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this situation, as it allows the self-attention mechanism to utilise structural modeling. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning
Procedia PDF Downloads 191
806 Pozzolanic Properties of Synthetic Zeolites as Materials Used for the Production of Building Materials
Authors: Joanna Styczen, Wojciech Franus
Abstract:
Currently, cement production reaches 3-6 Gt per year. The production of one ton of cement is associated with the emission of 0.5 to 1 ton of carbon dioxide into the atmosphere, which means that this process is responsible for 5% of global CO2 emissions. Simply improving the cement manufacturing process is not enough. An effective solution is the use of pozzolanic materials, which can partly replace clinker and thus reduce energy consumption and emission of pollutants, and give mortars the desired characteristics by shaping their microstructure. Pozzolanic additives modify the phase composition of cement, reducing the amount of portlandite and changing the CaO/SiO2 ratio in the C-S-H phase. Zeolites are a pozzolanic additive that is not commonly used. Three types of zeolites were synthesized in this work: Na-A, sodalite and ZSM-5 (these zeolites come from three different structural groups). The zeolites were obtained by hydrothermal synthesis of fly ash in an aqueous NaOH solution. Then, the pozzolanicity of the obtained materials was assessed. The pozzolanic activity of the synthesized zeolites was tested by chemical methods in accordance with the ASTM C 379-65 standard. The method consisted in determining the percentage content of active ingredients (soluble silicon oxide and aluminum) in alkaline solutions, i.e. those that are potentially reactive towards calcium hydroxide. The highest amount of active silica was found in zeolite ZSM-5 - 88.15%. The amount of active Al2O3 was small - 1%. The smallest pozzolanic activity was found in the Na-A zeolite (active SiO2 - 4.4%, and active Al2O3 - 2.52%). The tests carried out using XRD, SEM, XRF and textural analyses showed that the obtained zeolites are characterized by high porosity, which makes them a valuable addition to mortars.
Keywords: pozzolanic properties, hydration, zeolite, alite
Procedia PDF Downloads 78
805 Development of Orthogonally Protected 2,1':4,6-Di-O-Diisopropylidene Sucrose as the Versatile Intermediate for Diverse Synthesis of Phenylpropanoid Sucrose Esters
Authors: Li Lin Ong, Duc Thinh Khong, Zaher M. A. Judeh
Abstract:
Phenylpropanoid sucrose esters (PSEs) are natural compounds found in various medicinal plants which exhibit important biological activities such as antiproliferative and α- and β-glucosidase inhibitory activities. Despite their potential as new therapeutics, total synthesis of PSEs has been very limited, as their inherent structures contain one or more (substituted) cinnamoyl groups randomly allocated on the sucrose core via ester linkage. Since direct acylation of unprotected sucrose would be complex and tedious due to the presence of eight free hydroxyl groups, partially protected 2,1’:4,6-di-O-diisopropylidene sucrose was used as the starting material instead. However, similar reactivity between the remaining four hydroxyl groups still poses a challenge in the total synthesis of PSEs, as the lack of selectivity can restrict customisation where acylation at a specific OH is desired. To overcome this problem, a 4-step orthogonal protection scheme was developed. In this scheme, the remaining four hydroxyl groups on 2,1’:4,6-di-O-diisopropylidene sucrose, 6’-OH, 3’-OH, 4’-OH, and 3-OH, were protected with different protecting groups with an overall yield of > 40%. This orthogonally protected intermediate would provide convenient and divergent access to a wider range of natural and synthetic PSEs, as (substituted) cinnamoyl groups can be selectively introduced at desired positions. Using this scheme, three different series of monosubstituted PSEs were successfully synthesized, where (substituted) cinnamoyl groups were introduced selectively at the O-3, O-3’, and O-4’ positions, respectively. The expanded library of PSEs would aid the structure-activity relationship study of PSEs for identifying key components responsible for their biological activities.
Keywords: orthogonal protection, phenylpropanoid sucrose esters, selectivity, sucrose
Procedia PDF Downloads 158
804 In-silico Antimicrobial Activity of Bioactive Compounds of Ricinus communis against DNA Gyrase of Staphylococcus aureus as Molecular Target
Authors: S. Rajeswari
Abstract:
Medicinal plant extracts and their bioactive compounds have been used for antimicrobial activities and have significant remedial properties. In recent years, a wide range of investigations have been carried out throughout the world to confirm the antimicrobial properties of different medicinally important plants. A number of plants showed efficient antimicrobial activities, which were comparable to those of synthetic standard drugs or antimicrobial agents. The large family Euphorbiaceae contains about 300 genera and 7,500 species, and one among them is Ricinus communis, or castor plant, which has high traditional and medicinal value for a disease-free healthy life. Traditionally the plant is used as a laxative, purgative, fertilizer and fungicide, etc., whereas the plant possesses beneficial effects such as antioxidant, antihistamine, antinociceptive, antiasthmatic, antiulcer, immunomodulatory, antidiabetic, hepatoprotective, anti-inflammatory, antimicrobial, and many other medicinal properties. This activity of the plant is due to important phytochemical constituents like flavonoids, saponins, glycosides, alkaloids and steroids. The present study includes the phytochemical properties of Ricinus communis and the prediction of the antimicrobial activity of Ricinus communis using DNA gyrase of Staphylococcus aureus as the molecular target. Docking results of various chemical compounds of Ricinus communis against DNA gyrase of Staphylococcus aureus by Maestro 9.8 of Schrodinger show that the phytochemicals are effective against the target protein DNA gyrase. Our studies suggest that phytochemicals from Ricinus communis such as INDICAN (G.Score 4.98) and SUPLOPIN-2 (G.Score 5.74) can be used as lead molecules against Staphylococcus infections.
Keywords: euphorbiaceae, antimicrobial activity, Ricinus communis, Staphylococcus aureus
Procedia PDF Downloads 479
803 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
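An illustrative sketch of a fast-and-frugal routing heuristic for deciding whether a dataset is processed at the edge or in the cloud; the attributes and cue order below are assumptions for illustration, not the paper's validated classification framework.

```python
# Fast-and-frugal heuristic: inspect one cue at a time; the first discriminating
# cue decides where the dataset is processed.
from dataclasses import dataclass

@dataclass
class Dataset:
    size_mb: float
    latency_critical: bool   # e.g. needed for real-time machine protection
    confidential: bool       # e.g. process data that must not leave the plant

def route(ds: Dataset) -> str:
    if ds.latency_critical:
        return "edge"
    if ds.confidential:
        return "edge"
    if ds.size_mb > 500:
        return "edge"        # too costly to transmit raw; pre-process locally
    return "cloud"

print(route(Dataset(size_mb=12, latency_critical=True, confidential=False)))
print(route(Dataset(size_mb=40, latency_critical=False, confidential=False)))
```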
Procedia PDF Downloads 180
802 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model
Authors: Youngjae Jin, Daeshik Kim
Abstract:
This paper describes cycle-accurate simulation results of weight values learned by an auto-encoder behavior model in terms of pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such a huge amount of data, the learning methods’ computational complexity and time have limited advanced research. These limitations came from the fact that these algorithms were computed using only single-core CPUs. For this reason, parallel-based hardware, FPGAs, was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing hardware. With the auto-encoder behavior model pre-route simulation, we obtained the cycle-accurate results of the parameters of each hidden layer by using MODELSIM. The cycle-accurate results are a very important factor in designing parallel-based digital hardware. Finally, this paper shows an appropriate operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning
Procedia PDF Downloads 446
801 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features
Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis
Abstract:
Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly due to the fact that it is diagnosed via polysomnography, which is a time- and resource-intensive procedure. Screening the disease’s symptoms at home could be used as an alternative approach in order to alert individuals that potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool and herein presents the approach and selection of specific sound features that discriminate snoring vs. environmental sounds, as well as the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) is built upon the snore detection tool and employed in whole-night sleep sound recordings, resulting in a large dataset of snoring sound excerpts that are made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight into an individual’s sleep quality or as an independent component of OSAHS screening applications in future developments.
Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks
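A rough sketch of the kind of sound-feature pipeline a snore detector can build on: MFCC features from labelled excerpts fed to a small neural network classifier. The file names, feature choice and classifier are assumptions, not the study's selected features or model.

```python
# MFCC features per excerpt + a small MLP for snore vs. environmental sound.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def excerpt_features(path, sr=16000):
    audio, _ = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                 # one 13-dim vector per excerpt

# Hypothetical labelled excerpts: 1 = snore, 0 = environmental sound.
paths = ["snore_01.wav", "snore_02.wav", "tv_noise.wav", "traffic.wav"]
labels = np.array([1, 1, 0, 0])

X = np.vstack([excerpt_features(p) for p in paths])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, labels)
print(clf.predict(excerpt_features("new_excerpt.wav").reshape(1, -1)))
```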
Procedia PDF Downloads 207