Search results for: yeast processing wastewater
1484 Proposition of an Intelligent System Based on the Augmented Reality for Warehouse Logistics
Authors: Safa Gharbi, Hayfa Zgaya, Nesrine Zoghlami, Slim Hammadi, Cyril De Barbarin, Laurent Vinatier, Christiane Coupier
Abstract:
Increasing productivity and quality of service, improving working comfort and ensuring the efficiency of all processes are important challenges for every warehouse. Order picking is recognized as the most important and most costly activity of all warehouse processes. This paper presents a new approach using Augmented Reality (AR) in the field of logistics. It aims to create a Head-Up Display (HUD) interface with a Warehouse Management System (WMS), using AR glasses. Integrating AR technology allows the optimization of order picking by reducing picking time, increasing efficiency and speeding up delivery. The picker is able to access immediately all the information needed for his tasks. All the information is displayed when needed in the field of vision (FOV) of the operator, without any action required from him. This research is part of the industrial project RASL (Réalité Augmentée au Service de la Logistique), which gathers two major partners: LAGIS (Laboratory of Automatics, Computer Engineering and Signal Processing in Lille, France) and Genrix Group, a European leader in warehouse logistics, which provided its software and its logistics expertise for the implementation.
Keywords: Augmented Reality (AR), logistics and optimization, Warehouse Management System (WMS), Head-Up Display (HUD)
Procedia PDF Downloads 481
1483 Effect of Environmental Conditions on E. coli O157:H7 ATCC 43888 and L. monocytogenes ATCC 7644 Cell Surface Hydrophobicity, Motility and Cell Attachment on Food-Contact Surfaces
Authors: Stanley Dula, Oluwatosini A. Ijabadeniyi
Abstract:
Biofilm formation is a major source of material and foodstuff contamination, contributing to the occurrence of pathogenic and spoilage microbes in food processing and resulting in food spoilage, disease transmission and significant food hygiene and safety issues. This study elucidates biofilm formation by E. coli O157:H7 and L. monocytogenes ATCC 7644 grown under food-related environmental stress conditions of varying pH (5.0, 7.0 and 8.5) and temperature (15, 25 and 37 ℃). Both strains showed confluent biofilm formation at 25 ℃ and 37 ℃ and at pH 8.5 after 5 days. E. coli showed curli fimbriae production at various temperatures, while L. monocytogenes did not show pronounced expression. Swarming, swimming and twitching plate assays were used to determine strain motilities. Characterization of cell hydrophobicity was done using the microbial adhesion to hydrocarbons (MATH) assay with n-hexadecane. Both strains showed hydrophilic characteristics, as they fell within a < 20% interval. FT-IR revealed COOH at 1622 cm-1 and a strong absorption band at 3650-3200 cm-1, indicating the presence of both -OH and -NH groups. Both strains were hydrophilic and could form biofilm at different combinations of temperature and pH. The EPS produced by both species proved to be an acidic hetero-polysaccharide.
Keywords: biofilm, pathogens, hydrophobicity, motility
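For reference, the MATH assay mentioned above is usually scored as the fraction of cells partitioning into the hydrocarbon phase; a commonly used form of the index (the exact absorbance readings used by the authors are an assumption here) is

H(\%) = \frac{A_{0} - A}{A_{0}} \times 100

where A_0 and A are the absorbances of the aqueous cell suspension before and after mixing with n-hexadecane. Under the usual classification, strains with H below 20% are regarded as hydrophilic, which matches the < 20% interval reported above.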
Procedia PDF Downloads 235
1482 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field
Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar
Abstract:
A new force field is designed for propagation of the parametric contour into deep narrow cortical folds in the application of knowledge-based reconstruction of the cerebral cortex from MR images of the brain. The design of this force field is highly inspired by the Generalized Gradient Vector Flow (GGVF) model and differs markedly in how image information is manipulated to determine the direction of propagation of the contour. While GGVF uses an edge map as its main driving force, the newly designed force field uses the map of distances between zero-valued pixels and their nearest non-zero-valued pixel as its main driving force. Hence, it is called the Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is forceful propagation of the contour, beyond spurious convergence due to the partial volume effect (PVE), into narrow sulcal folds. Being a function of the corresponding non-zero pixel value, the force field has an inherent ability to determine the spuriousness of an edge automatically. It is effectively applied, along with some morphological processing, in the application of cortical reconstruction to breach the hindrance of PVE in narrow sulci where conventional GGVF fails.
Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain
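A minimal sketch of the zero-to-non-zero distance map that drives the field described above, assuming a 2D image array in which background pixels are exactly zero; the function name and the use of SciPy's Euclidean distance transform are illustrative choices, not the authors' implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def znzd_map(image: np.ndarray):
    """For every zero-valued pixel, distance to the nearest non-zero pixel."""
    zero_mask = (image == 0)
    # distance_transform_edt gives, for each True pixel, the distance to the
    # nearest False pixel, i.e. here the nearest non-zero image pixel.
    dist, indices = distance_transform_edt(zero_mask, return_indices=True)
    # Value of that nearest non-zero pixel, which the ZNZD field can use to
    # judge whether the underlying edge is spurious.
    nearest_value = image[tuple(indices)]
    return dist, nearest_value
```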
Procedia PDF Downloads 395
1481 Harnessing the Power of Artificial Intelligence: Advancements and Ethical Considerations in Psychological and Behavioral Sciences
Authors: Nayer Mofidtabatabaei
Abstract:
Advancements in artificial intelligence (AI) have transformed various fields, including psychology and the behavioral sciences. This paper explores the diverse ways in which AI is applied to enhance research, diagnosis, therapy, and the understanding of human behavior and mental health. We discuss the potential benefits and challenges associated with AI in these fields, emphasizing the ethical considerations and the need for collaboration between AI researchers and psychological and behavioral science experts. Artificial intelligence has gained prominence in recent years, revolutionizing multiple industries, including healthcare, finance, and entertainment. One area where AI holds significant promise is the field of psychology and behavioral sciences. AI applications in this domain range from improving the accuracy of diagnosis and treatment to understanding complex human behavior patterns. This paper aims to provide an overview of the various AI applications in psychological and behavioral sciences, highlighting their potential impact, challenges, and ethical considerations.
Mental health diagnosis: AI-driven tools, such as natural language processing and sentiment analysis, can analyze large datasets of text and speech to detect signs of mental health issues. For example, chatbots and virtual therapists can provide initial assessments and support to individuals suffering from anxiety or depression.
Autism Spectrum Disorder (ASD) diagnosis: AI algorithms can assist in early ASD diagnosis by analyzing video and audio recordings of children's behavior. These tools help identify subtle behavioral markers, enabling earlier intervention and treatment.
Personalized therapy: AI-based therapy platforms use personalized algorithms to adapt therapeutic interventions based on an individual's progress and needs. These platforms can provide continuous support and resources for patients, making therapy more accessible and effective.
Virtual reality therapy: Virtual reality (VR) combined with AI can create immersive therapeutic environments for treating phobias, PTSD, and social anxiety. AI algorithms can adapt VR scenarios in real time to suit the patient's progress and comfort level.
Data analysis: AI aids researchers in processing vast amounts of data, including survey responses, brain imaging, and genetic information.
Privacy concerns: Collecting and analyzing personal data for AI applications in psychology and behavioral sciences raises significant privacy concerns. Researchers must ensure the ethical use and protection of sensitive information.
Bias and fairness: AI algorithms can inherit biases present in training data, potentially leading to biased assessments or recommendations. Efforts to mitigate bias and ensure fairness in AI applications are crucial.
Transparency and accountability: AI-driven decisions in psychology and behavioral sciences should be transparent and subject to accountability. Patients and practitioners should understand how AI algorithms operate and make decisions.
AI applications in psychological and behavioral sciences have the potential to transform the field by enhancing diagnosis, therapy, and research. However, these advancements come with ethical challenges that require careful consideration. Collaboration between AI researchers and psychological and behavioral science experts is essential to harness AI's full potential while upholding ethical standards and privacy protections. The future of AI in psychology and behavioral sciences holds great promise, but it must be navigated with caution and responsibility.
Keywords: artificial intelligence, psychological sciences, behavioral sciences, diagnosis and therapy, ethical considerations
Procedia PDF Downloads 69
1480 Particle Size Analysis of Itagunmodi Southwestern Nigeria Alluvial Gold Ore Sample by Gaudin Schumann Method
Authors: Olaniyi Awe, Adelana R. Adetunji, Abraham Adeleke
Abstract:
Mining of alluvial gold ore by artisanal miners has been going on for decades at Itagunmodi, Southwestern Nigeria. In order to optimize the traditional panning gravity separation method commonly used in the area, a mineral particle size analysis study is critical. This study analyzed alluvial gold ore samples collected at five identified locations in the area with a view to determining the ore particle size distributions. A measured 500 g of as-received alluvial gold ore sample was introduced into the uppermost sieve of an electrical sieve shaker consisting of sieves arranged in order of decreasing nominal apertures of 5600 μm, 3350 μm, 2800 μm, 355 μm, 250 μm, 125 μm and 90 μm, and operated for 20 minutes. The amount of material retained on each sieve was measured and tabulated for analysis. A screen analysis graph using the Gaudin Schumann method was drawn for each of the screen tests on the alluvial samples. The study showed that the percentages of the fine particle size fraction (-125+90 μm) were 45.00%, 36.00%, 39.60%, 43.00% and 36.80% for the selected samples. These primary ore characterization results provide reference data for alluvial gold ore processing method selection, process performance measurement and optimization.
Keywords: alluvial gold ore, sieve shaker, particle size, Gaudin Schumann
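For context, the Gaudin Schumann (Gaudin-Schuhmann) screen analysis referenced above fits the cumulative undersize distribution to a power law,

P(x) = 100\left(\frac{x}{k}\right)^{m}

where P(x) is the cumulative percentage passing size x, k is the size modulus (the apparent maximum particle size) and m is the distribution modulus; plotting \log P(x) against \log x should therefore give a straight line of slope m. The symbols k and m follow the usual textbook notation and are not taken from the paper.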
Procedia PDF Downloads 60
1479 Mining in Nigeria and Development Effort of Metallurgical Technologies at National Metallurgical Development Center Jos, Plateau State-Nigeria
Authors: Linus O. Asuquo
Abstract:
Mining in Nigeria and the effort to develop metallurgical technologies at the National Metallurgical Development Centre, Jos, are addressed in this paper. The paper looks at the history of mining in Nigeria, the impact of mining on social and industrial development, and the contribution of the mining sector to Nigeria’s Gross Domestic Product (GDP). The paper clearly states that Nigeria’s mining sector contributes only 0.5% to the nation’s GDP, unlike Botswana, where the mining sector contributes 38% of GDP. The Nigeria Bureau of Statistics has it on record that Nigeria has about 44 solid minerals awaiting exploitation. Clearly highlighted by this paper is the abundant potential that exists in the mining sector for investment. The paper gives an exposition of the extensive efforts made at the National Metallurgical Development Centre (NMDC) to develop metallurgical technologies in various areas of the metals sector, such as mineral processing, foundry development, nonferrous metals extraction, materials testing, lime calcination, ANO (trade name for a powder wire-drawing lubricant), refractories and many others. The paper goes on to conclude that there is a need to develop the mining sector in Nigeria and to give sustainable support to the efforts currently made at NMDC to develop metallurgical technologies capable of transforming the metals sector in Nigeria, which will lead to industrialization. Finally, the paper makes some recommendations that cut across the topic in view of the best expected outcomes.
Keywords: mining, minerals, technologies, value addition
Procedia PDF Downloads 101
1478 Estimation of Lungs Physiological Motion for Patient Undergoing External Lung Irradiation
Authors: Yousif Mohamed Y. Abdallah
Abstract:
This is an experimental study dealing with the detection, measurement and analysis of periodic physiological organ motion during external beam radiotherapy, in order to improve the accuracy of radiation field placement and to reduce the exposure of healthy tissue during radiation treatments. The importance of this study is to detect the maximum path of the mobile structures during radiotherapy delivery, to define the planning target volume (PTV) and the irradiated volume during both the inspiration and expiration periods, and to verify the target volume. It also highlights the importance of applying image-guided radiotherapy (IGRT) methods in the field of radiotherapy. The results showed that the body contour displacement was 3.17 ± 0.23 mm, the left lung displacement reading was 2.56 ± 0.99 mm and that of the right lung was 2.42 ± 0.77 mm, which allows the radiation oncologist to take suitable countermeasures in case of significant errors. In addition, the use of the image registration technique for automatic position control predicted the potential motion. The motion ranged between 2.13 mm and 12.2 mm (low and high). In conclusion, individualized assessment of tumor mobility can improve the accuracy of target area definition in patients undergoing stereotactic RT for stage I, II and III lung cancer (NSCLC). Definition of the target volume based on a single CT scan with a margin of 10 mm is clearly inappropriate.
Keywords: respiratory motion, external beam radiotherapy, image processing, lung
Procedia PDF Downloads 533
1477 Visualization of Corrosion at Plate-Like Structures Based on Ultrasonic Wave Propagation Images
Authors: Aoqi Zhang, Changgil Lee, Seunghee Park
Abstract:
A non-contact nondestructive technique using a laser-induced ultrasonic wave generation method was applied to visualize corrosion damage in aluminum alloy plate structures. The ultrasonic waves were generated by a Nd:YAG pulse laser, and a galvanometer-based laser scanner was used to scan a specific area of the target structure. At the same time, wave responses were measured at a piezoelectric sensor attached to the target structure. The visualization of structural damage was achieved by calculating logarithmic values of the root mean square (RMS). The damage-sensitive feature was defined as the scattering characteristics of the waves that encounter the corrosion damage. The corrosion damage was formed artificially with hydrochloric acid. To observe the effect of the location where the corrosion was formed, both sides of the plate were scanned over the same scanning area. The effects of the depth and the size of the corrosion were also considered. The results indicated that the damage was successfully visualized in almost all cases, whether it was formed on the front or the back side. However, the damage could not be clearly detected when the depth of the corrosion was shallow. In future work, a signal processing algorithm needs to be developed to visualize the damage more clearly by improving the signal-to-noise ratio.
Keywords: non-destructive testing, corrosion, pulsed laser scanning, ultrasonic waves, plate structure
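A minimal sketch of the logarithmic RMS imaging step described above, assuming the scanned responses are stored as a 3D array of time signals per scan point; the array layout and the 20·log10 decibel convention are assumptions rather than the authors' exact processing chain:

```python
import numpy as np

def log_rms_image(responses: np.ndarray) -> np.ndarray:
    """responses: (n_scan_y, n_scan_x, n_time_samples) ultrasonic signals."""
    rms = np.sqrt(np.mean(responses ** 2, axis=-1))  # RMS energy at each scan point
    return 20.0 * np.log10(rms + 1e-12)              # log scale; small offset avoids log(0)
```

Regions where scattering from corrosion changes the wave energy then show up as contrast in the resulting map.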
Procedia PDF Downloads 299
1476 Non-Targeted Adversarial Object Detection Attack: Fast Gradient Sign Method
Authors: Bandar Alahmadi, Manohar Mareboyana, Lethia Jackson
Abstract:
Today, there are many applications that use computer vision models, such as face recognition, image classification, and object detection. The accuracy of these models is very important for the performance of these applications. One challenge facing computer vision models is the adversarial example attack. In computer vision, an adversarial example is an image that is intentionally designed to cause the machine learning model to misclassify it. One very well-known method used to attack Convolutional Neural Networks (CNNs) is the Fast Gradient Sign Method (FGSM). The goal of this method is to find the perturbation that can fool the CNN using the gradient of the cost function of the CNN. In this paper, we introduce a novel model that attacks the Region-based Convolutional Neural Network (R-CNN) using FGSM. We first extract the regions that are detected by the R-CNN, and then we resize these regions to the size of regular images. Then, we find the best perturbation of the regions that can fool the CNN using FGSM. Next, we add the resulting perturbation to the attacked region to get a new region image that looks similar to the original image to human eyes. Finally, we place the regions back into the original image and test the R-CNN with the attacked images. Our model could drop the accuracy of the R-CNN when tested on the Pascal VOC 2012 dataset.
Keywords: adversarial examples, attack, computer vision, image processing
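A minimal sketch of the FGSM step applied to a resized region crop, assuming a PyTorch image classifier that returns class logits; the function name, the ε value and the use of cross-entropy as the cost are illustrative assumptions rather than the authors' exact setup:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, region, label, epsilon=0.03):
    """region: (N, C, H, W) tensor in [0, 1]; label: (N,) ground-truth classes."""
    region = region.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(region), label)  # cost of the current prediction
    loss.backward()                               # gradient of the cost w.r.t. the input pixels
    adv = region + epsilon * region.grad.sign()   # one step in the sign of the gradient
    return adv.clamp(0.0, 1.0).detach()           # keep pixel values in a valid range
```

The perturbed crop is then pasted back into the original image before it is fed to the detector.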
Procedia PDF Downloads 191
1475 Reduction Behavior of Some Low-Grade Iron Ores for Application in Blast Furnace
Authors: Heba Al-Kelesh
Abstract:
Day after day, high-grade iron ores are being consumed. The strong global demand for iron and steel has necessitated the utilization of various low-grade iron ores that are not suitable for direct exploitation in the iron industry. The low-grade ores cannot be dressed using traditional mineral processing methods because of their complicated mineral compositions. The present work is aimed at investigating the reducibility of some Egyptian iron ores and concentrates under conditions emulating different blast furnace zones. Representative specimens are collected from the El-Gedida-Baharia oasis, Eastern South Aswan, and the Eastern Desert, Wadi Kareem (EDC). Some mineralogical and morphological characterizations are carried out. The reactivity order of the green samples is Baharia > Aswan > EDC. The presence of magnetite decreased the reactivity of EDC. The reducibility of the Aswan sample is lower than that of Baharia due to the presence of agglomerated metallic grains surrounded by semi-melted phases. Specimens are annealed at 1000 °C for 3 hours. After firing, the reducibility of Aswan becomes the lowest due to the formation of fayalite and calcium phosphate phases. The relative behavior of green and fired samples reduced under different conditions is studied. For the thermal and top zones, the reactivity of the fired samples is greater than that of the green ones, which was confirmed by morphological examinations.
Keywords: reducibility, low grade, iron industry, blast furnace
Procedia PDF Downloads 125
1474 Cesium 137 Leaching from Soils of Territories, Polluted by Radionuclides
Authors: S. V. Vasilenkov, O. N. Demina
Abstract:
The Chernobyl NPP accident is the biggest in the history of nuclear energy. The Bryansk region of Russia was exposed to the most intensive radioactive pollution. For that reason, we carried out research to find methods for the rehabilitation of soils in territories polluted by radionuclides by means of Cesium-137 leaching by watering. For the experiments, we took soil from the upper, more polluted 10 cm layer of different soil types. Cesium-137 leaching was performed by different methods in washing columns. Washout of cesium was performed in periodic cycles of 4-6 days. In experiments with light argillaceous soil with a starting specific radioactivity of 4158 Bq/kg, an effective reduction to 1512 Bq/kg was achieved over 17 cycles. Besides, the results of the research showed that in the first 6-10 cycles the washing rate decreases, but after the application of intensifiers (ultrasonic water treatment, aeration, application of KCl fertilizer, lime, freezing) Cesium-137 leaching increases. Experimental investigations of the washout of Cesium (Cs) 137 from the soil were carried out under field and laboratory conditions during its freezing and thawing. The experiments showed that the washout of Cesium-137 from the soil is considerably higher after freezing than from non-frozen soil, and this conforms to the washout of cesium achieved under the influence of the intensifiers. This fact allows us to recommend a cheap and technically easy-to-construct arrangement for regulating the snow-melt runoff for the rehabilitation of the radioactively contaminated impoundment.
Keywords: pollution, radiation, Cesium 137 leaching, agriculture
Procedia PDF Downloads 291
1473 JaCoText: A Pretrained Model for Java Code-Text Generation
Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri
Abstract:
Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged: automatic programming language code generation. This task consists of translating natural language instructions into source code. Despite the fact that well-known pretrained models for language generation have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network. It aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study some findings from the state of the art and use them to (1) initialize our model from powerful pretrained models, (2) explore additional pretraining on our Java dataset, (3) carry out experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during the fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks
Procedia PDF Downloads 283
1472 Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications
Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha
Abstract:
Workflow concepts are essential for the development of remote sensing applications. They can help users manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in Cloud-like environments. The approach strives to support scientists during the modeling, deployment and monitoring of their workflows. This work takes advantage of Petri nets and, more specifically, the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists during the specification of their workflows. It allows both modeling and enactment of image processing workflows from the remote sensing domain. Our case study is related to the implementation of vegetation indices. We have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the integration possibilities of Cloud technology as a supplementary layer for the deployment of the current implementation. For this purpose, we discuss migration patterns for data and applications and propose an architecture.
Keywords: cloud computing, scientific workflows, petri nets, RENEWGRASS
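The NDVI computation at the core of the workflow mentioned above is a simple per-pixel band ratio; a minimal NumPy sketch is given below, assuming the near-infrared and red bands are already loaded as arrays (the parameter names and the zero-fill for empty pixels are illustrative choices):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero over no-data or water pixels.
    return np.where(denom != 0, (nir - red) / denom, 0.0)
```

Values close to +1 indicate dense green vegetation, values near zero bare soil, and negative values water or clouds.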
Procedia PDF Downloads 446
1471 Test Rig Development for Up-to-Date Experimental Study of Multi-Stage Flash Distillation Process
Authors: Marek Vondra, Petr Bobák
Abstract:
Vacuum evaporation is a reliable and well-proven technology with a wide application range, which is frequently used in the food, chemical and pharmaceutical industries. Recently, numerous remarkable studies have been carried out to investigate the utilization of this technology in the area of wastewater treatment. One of the most successful applications of the vacuum evaporation principle is connected with seawater desalination. Since the 1950s, multi-stage flash distillation (MSF) has been the leading technology in this field, and it is still irreplaceable in many respects, despite a rapid increase in cheaper reverse-osmosis-based installations in recent decades. MSF plants are conveniently operated in countries with fluctuating seawater quality and at locations where a sufficient amount of waste heat is available. Nowadays, most MSF research is connected with the utilization of alternative heat sources and with hybridization, i.e. the merging of different types of desalination technologies. Some studies are concerned with the basic principles of the static flash phenomenon, but only a few scientists have lately focused on the fundamentals of continuous multi-stage evaporation. Limited measurement possibilities at operating plants and insufficiently equipped experimental facilities may be the reasons. The aim of the presented study was to design, construct and test an up-to-date test rig with an advanced measurement system which provides real-time monitoring of all the important operational parameters under various conditions. The whole system consists of a conventionally designed MSF unit with 8 evaporation chambers, a versatile heating circuit for different kinds of feed water (e.g. seawater, wastewater), a sophisticated system for acquisition and real-time visualization of all the related quantities (temperature, pressure, flow rate, weight, conductivity, pH, water level, power input), access to a wide spectrum of operational media (salt, fresh and softened water, steam, natural gas, compressed air, electrical energy) and integrated transparent features which enable direct visual monitoring of selected physical mechanisms (water evaporation in the chambers, water level right before the brine and distillate pumps). Thanks to the adjustable process parameters, it is possible to operate the test unit under the desired operational conditions, which allows researchers to carry out statistical design and analysis of experiments. Valuable results obtained in this manner could be further employed in simulations and process modeling. The first experimental tests confirm the correctness of the presented approach and promise interesting outputs in the future. The presented experimental apparatus enables flexible and efficient research of the whole MSF process.
Keywords: design of experiment, multi-stage flash distillation, test rig, vacuum evaporation
Procedia PDF Downloads 386
1470 Effect of Oil Shale Alkylresorcinols on Physico-Chemical and Thermal Properties of Polycondensation Resins
Authors: Ana Jurkeviciute, Larisa Grigorieva, Ksenia Moskvina
Abstract:
Oil shale alkylresorcinols are formed as a by-product of oil shale processing. They are a unique raw material for the chemical industry. Obtaining polycondensation resins is one of the worthwhile directions for the use of oil shale alkylresorcinols. These resins are widely applied in many branches of industry, such as wood-working, metallurgy, tires, rubber products, construction, etc. The possibility of obtaining resins using the overall alkylresorcinols will make it possible to cheapen finished products based on them and to widen the range of resins offered on the market. In the course of the investigation, the synthesis of polycondensation resins based on alkylresorcinols was conducted by several methods. In the formulations, a part of the resorcinol was replaced by fractions of oil shale alkylresorcinols containing different amounts of 5-methylresorcinol (40-80 mass %). Some resins were modified with an aromatic alkene at the synthesis stage. The thermal stability and degradation behavior of the resins were investigated by the thermogravimetric analysis (TGA) method, both in an inert nitrogen environment and in an oxidative air environment. TGA integral curves were obtained and processed in dynamic mode for the temperature interval from 25 to 830 °C. The heating rate was 5 °C/min and the gas flow rate 50 ml/min. The carbonization capacity of the resins was evaluated from the carbon residue. The physico-chemical parameters of the resins were determined. The contents of resorcinol and 5-methylresorcinol that had not reacted during synthesis were determined by the gas chromatography method.
Keywords: resorcinol, oil shale alkylresorcinols, aromatic alkene, polycondensation resins, modified resins
Procedia PDF Downloads 193
1469 The International Monetary Fund’s Treatment Towards Argentina and Brazil During Financial Negotiations for Their First Adjustment Programs, 1958-64
Authors: Fernanda Conforto de Oliveira
Abstract:
The International Monetary Fund (IMF) has a central role in global financial governance as the world’s leading crisis lender. Its practice of conditional lending, that is, conditioning loans on the implementation of economic policy adjustments, is the primary lever by which the institution interacts with and influences the policy choices of member countries, and it has been a key topic of interest to scholars and public opinion. However, empirical evidence about the economic and (geo)political determinants of IMF lending behavior remains inconclusive, and no model that explains IMF policies has been identified. This research moves beyond panel analysis to focus on the financial negotiations for the first IMF programs in Argentina and Brazil in the early post-war period. It seeks to understand why the negotiations reached distinct outcomes: Argentinean officials cooperated and complied with IMF policies, whereas their Brazilian counterparts hesitated. Using qualitative and automated text analysis, this paper examines the hypothesis that differential IMF treatment could help to explain these distinct outcomes. The paper contributes to historical studies on IMF-Latin America relations and to the broader literature in international political economy about IMF policies.
Keywords: international monetary fund, international history, financial history, Latin American economic history, natural language processing, sentiment analysis
Procedia PDF Downloads 62
1468 Low Power CMOS Amplifier Design for Wearable Electrocardiogram Sensor
Authors: Ow Tze Weng, Suhaila Isaak, Yusmeeraz Yusof
Abstract:
The worldwide trend in healthcare screening devices is increasingly in favor of portability and wearability, especially for the most common electrocardiogram (ECG) monitoring systems, because wearable screening devices do not restrict the patient’s freedom and daily activities. While the demand for low-power and low-cost biomedical systems on chip (SoC) is increasing exponentially, front-end ECG sensors still suffer from flicker noise in low-frequency cardiac signal acquisition, 50 Hz power-line electromagnetic interference, and large unstable input offsets when the electrode-skin interface is not attached properly. In this paper, a high-performance CMOS amplifier for ECG sensors suitable for low-power wearable cardiac screening is proposed. The amplifier adopts the highly stable folded cascode topology, which is then implemented in an RC feedback circuit for low-frequency DC offset cancellation. Using 0.13 µm CMOS technology from Silterra, the simulation results show that this front-end circuit can achieve a very low input-referred noise of 1 pV/√Hz and a high common-mode rejection ratio (CMRR) of 174.05 dB. It also gives a voltage gain of 75.45 dB with a good power supply rejection ratio (PSRR) of 92.12 dB. The total power consumption is only 3 µW, so the circuit is suitable for integration with further signal processing and classification back ends in a low-power biomedical SoC.
Keywords: CMOS, ECG, amplifier, low power
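As a reminder of how the decibel figures quoted above are defined (standard textbook definitions, not specific to this paper), the voltage gain and the common-mode rejection ratio of such an amplifier are usually expressed as

A_{v}(\mathrm{dB}) = 20\log_{10}\frac{V_{out}}{V_{in}}, \qquad \mathrm{CMRR}(\mathrm{dB}) = 20\log_{10}\frac{A_{dm}}{A_{cm}}

where A_{dm} is the differential-mode gain and A_{cm} the common-mode gain; a CMRR of about 174 dB therefore corresponds to a differential gain roughly 5\times10^{8} times larger than the common-mode gain.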
Procedia PDF Downloads 246
1467 Effect of Lignocellulose-Degrading Bacteria Isolated from Termite Gut on the Nutritive Value of Wheat Straw as Ruminant Feed
Authors: Ayoub Azizi-Shotorkhoft, Tahereh Mohammadabadi, Hosein Motamedi, Morteza Chaji, Hasan Fazaeli
Abstract:
This study was conducted to investigate, in vitro, the nutritive value of wheat straw processed with termite gut symbiotic bacteria with lignocellulose-degrading potential, including Bacillus licheniformis, Ochrobactrum intermedium and Microbacterium paludicola. These bacteria were isolated by culturing termite gut contents in different culture media containing different lignin and lignocellulosic materials that had been prepared from water-extracted sawdust and wheat straw. The results showed that incubating wheat straw with all three isolated bacteria increased (P<0.05) acid-precipitable polymeric lignin (APPL) compared to the control, and the highest amount of APPL was observed following treatment with B. licheniformis. The highest and lowest (P<0.05) in vitro gas production and ruminal organic matter digestibility were obtained when wheat straw was treated with B. licheniformis and the control, respectively. However, other fermentation parameters, such as b (i.e., gas production from the insoluble fermentable fractions at 144 h), c (i.e., rate of gas production during incubation), ruminal dry matter digestibility, metabolizable energy, partitioning factor, pH and ammonia nitrogen concentration, were similar between the experimental treatments (P>0.05). It is concluded that processing wheat straw with the isolated bacteria improved its nutritive value as a ruminant feed.
Keywords: termite gut bacteria, wheat straw, nutritive value, ruminant
Procedia PDF Downloads 332
1466 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data
Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.
Abstract:
The research investigated the arrangement of some geological features underlying Ilesa, employing aeromagnetic data. The obtained data were subjected to various data filtering and processing techniques, namely the Total Horizontal Derivative (THD), depth continuation and analytical signal amplitude, using Geosoft Oasis Montaj 6.4.2 software. The Reduced to the Equator Total Magnetic Intensity (TRE-TMI) results reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwestern half of the area. Intermediate magnetic susceptibility, ranging from 6.0 to 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit varying lengths, ranging from 2.5 to 16.0 km. Analysis of the rose diagram and the analytical signal amplitude indicates structural styles mainly of E-W and NE-SW orientation, particularly evident in the western, SW and NE regions, with an amplitude of 0.0318 nT/m. The faults identified in the area demonstrate NNW-SSE, NNE-SSW and WNW-ESE orientations and are situated at depths ranging from 500 to 750 m. Considering the divergent magnetic susceptibility, the structural style and orientation of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
Keywords: lineament, aeromagnetic, anomaly, fault, magnetic
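For reference, the two enhancement filters named above are conventionally defined on the total magnetic intensity anomaly T as

\mathrm{THD}(x,y) = \sqrt{\left(\frac{\partial T}{\partial x}\right)^{2} + \left(\frac{\partial T}{\partial y}\right)^{2}}, \qquad |A(x,y)| = \sqrt{\left(\frac{\partial T}{\partial x}\right)^{2} + \left(\frac{\partial T}{\partial y}\right)^{2} + \left(\frac{\partial T}{\partial z}\right)^{2}}

so that maxima of the total horizontal derivative trace the edges of magnetic sources, while the analytical signal amplitude peaks over the sources themselves largely independently of magnetization direction. These are the standard textbook forms; the paper's exact parameter choices in Oasis Montaj are not reproduced here.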
Procedia PDF Downloads 73
1465 Statistical Comparison of Machine and Manual Translation: A Corpus-Based Study of Gone with the Wind
Authors: Yanmeng Liu
Abstract:
This article analyzes and compares the linguistic differences between machine translation and manual translation through a case study of the book Gone with the Wind. As an important carrier of human feeling and thinking, literary translation poses a huge difficulty for machine translation, and it is expected to expose translation features distinct from those of manual translation. In order to display the linguistic features objectively, tentative uses of computerized and statistical evidence in the systematic investigation of large-scale translation corpora by means of quantitative methods have been deployed. This study compiles a bilingual corpus with four versions of Chinese translations of the book Gone with the Wind, namely, Piao by Chunhai Fan, Piao by Huairen Huang, and translations by Google Translation and Baidu Translation. After processing the corpus with software such as the Stanford Segmenter, the Stanford POS Tagger and AntConc, the study analyzes the linguistic data and answers the following questions: 1. How does machine translation differ from manual translation linguistically? 2. Why do these deviances happen? This paper combines translation studies with the knowledge of corpus linguistics and concretizes divergent linguistic dimensions in translated text analysis in order to present the linguistic deviances in manual and machine translation. Consequently, this study provides a more accurate and more fine-grained understanding of machine translation products, and it also proposes several suggestions for machine translation development in the future.
Keywords: corpus-based analysis, linguistic deviances, machine translation, statistical evidence
Procedia PDF Downloads 141
1464 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves
Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira
Abstract:
Foliage diseases in plants can cause a reduction in both the quality and quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help in monitoring large fields of crops by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize the late blight disease from the analysis of digital images of tomatoes collected directly in the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One neural network is responsible for the identification of healthy regions of the tomato leaf, while the other identifies the injured regions. The outputs of both networks are combined to generate the final classification of each pixel of the image, and the pixel classes are used to repaint the original tomato images using a color representation that highlights the injuries on the plant. The new images have only green, red or black pixels, depending on whether they came from healthy portions of the leaf, injured portions, or the background of the image, respectively. The system presented an accuracy of 97% in the detection and estimation of the level of damage on tomato leaves caused by late blight.
Keywords: artificial neural networks, digital image processing, pattern recognition, phytosanitary
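A minimal sketch of the per-pixel feature extraction and the pair of classifiers described above, assuming scikit-learn multilayer perceptrons; the hidden-layer size, the training-data variable names and the 6-value RGB+HSL feature vector are illustrative assumptions, not the authors' exact architecture:

```python
import colorsys
import numpy as np
from sklearn.neural_network import MLPClassifier

def pixel_features(rgb_image: np.ndarray) -> np.ndarray:
    """Stack RGB and HSL values into a (n_pixels, 6) feature matrix."""
    rgb = rgb_image.reshape(-1, 3).astype(float) / 255.0
    # colorsys returns (hue, lightness, saturation) for each normalized RGB triple.
    hsl = np.array([colorsys.rgb_to_hls(r, g, b) for r, g, b in rgb])
    return np.hstack([rgb, hsl])

# One network flags healthy leaf pixels, the other flags injured pixels;
# their outputs are combined afterwards to repaint each pixel green, red or black.
healthy_net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
injured_net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
# healthy_net.fit(pixel_features(training_image), healthy_pixel_labels)
# injured_net.fit(pixel_features(training_image), injured_pixel_labels)
```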
Procedia PDF Downloads 326
1463 Collective Intelligence-Based Early Warning Management for Agriculture
Authors: Jarbas Lopes Cardoso Jr., Frederic Andres, Alexandre Guitton, Asanee Kawtrakul, Silvio E. Barbin
Abstract:
The important objective of the CyberBrain Mass Agriculture Alarm Acquisition and Analysis (CBMa4) project is to minimize the impacts of diseases and disasters on rice cultivation. For example, early detection of insects will reduce the volume of insecticides applied to the rice fields through the use of the CBMa4 platform. In order to reach this goal, two major factors need to be considered: (1) the social network of smart farmers; and (2) the warning data alarm acquisition and analysis component. This paper outlines the process for collecting warnings and improving the decision-making that results from a warning. It involves two sub-processes: warning collection and understanding enrichment. Human sensors combine basic suitable data processing techniques in order to extract warning-related semantics according to collective intelligence. We identify each warning by a semantic content called a 'warncon', with multimedia metaphors and metadata related to these metaphors. It is important to describe the metric for measuring the relation among warncons. With this knowledge, a collective intelligence-based decision-making approach determines the action(s) to be launched regarding one or a set of warncons.
Keywords: agricultural engineering, warning systems, social network services, context awareness
Procedia PDF Downloads 381
1462 Limestone Briquette Production and Characterization
Authors: André C. Silva, Mariana R. Barros, Elenice M. S. Silva, Douglas. Y. Marinho, Diego F. Lopes, Débora N. Sousa, Raphael S. Tomáz
Abstract:
Modern agriculture requires productivity, efficiency and quality. Therefore, there is a need for agricultural limestone that provides adequate amounts of calcium and magnesium carbonates in order to correct soil acidity. During limestone processing, fine particles (with an average size under 400#) are generated. These particles have no economic value in the agricultural and metallurgical sectors due to their size, and when limestone is used for agricultural purposes they can easily be transported by the wind, generating air pollution. Therefore, briquetting, a mineral processing technique, was used to mitigate this problem, resulting in an agglomerated product suitable for agricultural use. Briquetting uses compressive pressure to agglomerate fine particles. It can be aided by agglutination agents, allowing adjustments in the shape, size and mechanical parameters of the mass. Briquettes can generate extra profits for the mineral industry, presenting a distinct product for agriculture, and can reduce the environmental liabilities of fine-particle storage or disposal. The produced limestone briquettes were subjected to shatter and water-action resistance tests. The results show that after six minutes completely submerged in water, the briquettes were fully dissolved, a highly favorable result considering their use for soil acidity correction.
Keywords: agglomeration, briquetting, limestone, soil acidity correction
Procedia PDF Downloads 389
1461 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan
Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa
Abstract:
Rangelands in semi-arid areas provide a good source of feed for huge numbers of animals and serve environmental, economic and social purposes; therefore, these areas are considered economically very important for the pastoral sector in Sudan. This paper investigates the means of differentiating between different rangeland sites according to soil type, using principal component analysis to assist in monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand dune site and a hard clay site. Principal component analysis (PCA) was used to reduce the number of factors needed to distinguish between the rangeland sites and to produce a new set of data including the most useful spectral information for satellite image processing. It was performed using selected types of data (two vegetation indices, topographic data and vegetation surface reflectance within three bands of MODIS data). Analysis with PCA indicated that a relatively high share of the total variance in the data set corresponds to vegetation and soil. The results showed that the use of principal component analysis with the selected variables revealed a clear difference, reflected in the variance and eigenvalues, and that it can be used for differentiation between the different range sites.
Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types
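A minimal sketch of the PCA step described above, assuming the per-sample variables (two vegetation indices, topography and three MODIS band reflectances) are already assembled into a numeric table; the file name, the standardization step and the choice of three components are assumptions made for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical table: one row per sampled pixel/plot, six columns of variables.
X = np.loadtxt("rangeland_features.csv", delimiter=",")

X_std = StandardScaler().fit_transform(X)   # PCA is sensitive to differing variable scales
pca = PCA(n_components=3).fit(X_std)

print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Loadings (components x variables):")
print(pca.components_)

scores = pca.transform(X_std)  # component scores used to separate the three soil-type sites
```

Inspecting the eigenvalues (explained variance) and loadings is what allows the sites to be separated with far fewer factors than the original variable set.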
Procedia PDF Downloads 184
1460 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece
Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou
Abstract:
The filleting yield and the chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss) and meagre (Argyrosomus regius) were investigated in farmed fish in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet weight, an estimate which is required for the operational management of fish processing companies. Furthermore, in this work, the ratio of feed input required to produce one kilogram of fish fillet (FFCR) is presented for the first time as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). The lowest fillet yield and fillet yield feed conversion ratio (FYFCR) were found in meagre (FY = 42.17%, FFCR = 2.48), while the best fillet yield (FY = 53.8%) and FYFCR (2.10) were exhibited by farmed rainbow trout. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARCHIMEDES III. Investing in knowledge society through the European Social Fund.
Keywords: farmed fish, flesh quality, filleting yield, lipid
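Since the abstract defines FFCR as the feed input needed per kilogram of fillet, it can be related to the conventional feed conversion ratio (FCR, feed per kilogram of live-weight gain) and the filleting yield FY by

\mathrm{FFCR} = \frac{\text{feed input}}{\text{fillet mass}} = \frac{\mathrm{FCR}}{\mathrm{FY}}

so, as a purely illustrative back-calculation (not a figure reported in the paper), the trout values FY = 0.538 and FFCR = 2.10 would correspond to a whole-fish FCR of about 2.10 x 0.538 \approx 1.13 kg of feed per kg of live weight.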
Procedia PDF Downloads 308
1459 Preparation and Cutting Performance of Boron-Doped Diamond Coating on Cemented Carbide Cutting Tools with High Cobalt Content
Authors: Zhaozhi Liu, Feng Xu, Junhua Xu, Xiaolong Tang, Ying Liu, Dunwen Zuo
Abstract:
Chemical vapor deposition (CVD) diamond-coated cutting tools have excellent cutting performance; they are the most suitable tools for processing nonferrous metals and alloys, composites, nonmetallic materials and other difficult-to-machine materials efficiently and accurately. Depositing a CVD diamond coating on cemented carbide with a high cobalt content can improve its toughness and strength; therefore, it is very important to research the preparation technology and cutting properties of CVD diamond-coated cemented carbide cutting tools with high cobalt content. The preparation technology of boron-doped diamond (BDD) coatings has been studied, and coated drills were prepared. BDD coatings were deposited on the drills using the optimized parameters, and the SEM results show that there are no cracks or collapses in the coating. Cutting tests of the prepared drills against silumin and aluminum-based printed circuit boards (PCBs) have been carried out. The results show that the wear of the coated drill is small and the machined surface has better precision. The coating did not come off during the tests, which demonstrates the good adhesion and cutting performance of the drill.
Keywords: cemented carbide with high cobalt content, CVD boron-doped diamond, cutting test, drill
Procedia PDF Downloads 419
1458 Analysis of Hard Turning Process of AISI D3-Thermal Aspects
Authors: B. Varaprasad, C. Srinivasa Rao
Abstract:
In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides the many advantages of the hard turning operation, it has to be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) based simulation of hard turning using the commercial software DEFORM 3D has been compared to experimental results for stresses, temperatures and tool forces in the machining of AISI D3 steel using mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. Exhaustive friction modeling at the tool-work interfaces is carried out. Work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are constant (0.075 mm/rev and 155 m/min), and the analysis is focused on stresses, forces, and temperatures during machining. Close agreement is observed between the CAE simulation and the experimental values.
Keywords: hard turning, computer aided engineering, computational machining, finite element method
Procedia PDF Downloads 453
1457 Wildfires Assessed By Remote Sensed Images And Burned Land Monitoring
Authors: Maria da Conceição Proença
Abstract:
This case study implements the evaluation of burned areas that suffered successive wildfires in mainland Portugal during the summer of 2017, which killed more than 60 people. It is intended to show that this evaluation can be done with free-of-charge remote sensing data on a simple laptop with open-source software, describing the not-so-simple methodology step by step, in order to make it available to county workers in the city halls of the affected areas, where the availability of information is essential for the immediate planning of mitigation measures, such as restoring road access, allocating funds for the recovery of human dwellings and assessing further restoration of the ecological system. Wildfires also devastate forest ecosystems, having a direct impact on vegetation cover and killing or driving away the animal population. Economic interests are also affected, as the burned pinewood becomes useless for the noblest applications, so its value decreases, and resin extraction ends for several years. The tools described in this paper enable the location of the areas where the annihilation of natural habitats took place and establish a baseline for major changes in the recovery of forest ecosystems. Moreover, the result allows the follow-up of the surface fuel loading, enabling the targeting and evaluation of restoration measures on a planned time basis.
Keywords: image processing, remote sensing, wildfires, burned areas evaluation, sentinel-2
Procedia PDF Downloads 210
1456 Advanced Technologies for Detector Readout in Particle Physics
Authors: Y. Venturini, C. Tintori
Abstract:
Given the continuous demand for improved readout performance in particle and dark matter physics, CAEN SpA is pushing forward the development of advanced technologies for detector readout. We present Digitizers 2.0, the result of the success of the previous Digitizer generation, combined with expanded capabilities and a renovation of the user experience that introduces the open FPGA. The first product of the family is the VX2740 (64 ch, 125 MS/s, 16 bit) for advanced waveform recording and Digital Pulse Processing, fitting the special requirements of dark matter and neutrino experiments. In parallel, CAEN is developing the FERS-5200 platform, a Front-End Readout System designed to read out large multi-detector arrays, such as SiPMs, multi-anode PMTs, silicon strip detectors, wire chambers, GEMs, gas tubes, and others. This is a highly scalable distributed platform, based on small Front-End cards synchronized and read out by a concentrator board, allowing extremely large experimental setups to be built. We plan to develop a complete family of cost-effective Front-End cards tailored to specific detectors and applications. The first one available is the A5202, a 64-channel unit for SiPM readout based on the CITIROC ASIC by Weeroc.
Keywords: dark matter, digitizers, front-end electronics, open FPGA, SiPM
Procedia PDF Downloads 125
1455 Study of Natural Patterns on Digital Image Correlation Using Simulation Method
Authors: Gang Li, Ghulam Mubashar Hassan, Arcady Dyskin, Cara MacNish
Abstract:
Digital image correlation (DIC) is a contactless full-field displacement and strain reconstruction technique commonly used in the field of experimental mechanics. Compared with physical measuring devices such as strain gauges, which only provide very restricted coverage and are expensive to deploy widely, the DIC technique provides results with full-field coverage and relatively high accuracy using an inexpensive and simple experimental setup. It is very important to study the effect of natural patterns on the DIC technique, because the preparation of artificial patterns is a time-consuming and hectic process. The objective of this research is to study the effect of using images having natural patterns on the performance of DIC. A systematic simulation method is used to build the simulated deformed images used in DIC. A parameter used in DIC (the subset size) can affect the processing and accuracy of DIC and can even cause DIC to fail. Regarding the image parameters (correlation coefficient), a high similarity between two subsets can lead the DIC process to fail and make the result more inaccurate. Pictures of good and bad quality for DIC methods are presented, and, more importantly, this provides a systematic way to evaluate the quality of pictures with natural patterns before the measurement devices are installed.
Keywords: Digital Image Correlation (DIC), deformation simulation, natural pattern, subset size
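For reference, the subset correlation coefficient mentioned above is often evaluated with the zero-normalized cross-correlation criterion (one common choice in DIC; the paper's exact criterion is not specified here),

C_{ZNCC} = \frac{\sum_{i}\left(f_{i}-\bar{f}\right)\left(g_{i}-\bar{g}\right)}{\sqrt{\sum_{i}\left(f_{i}-\bar{f}\right)^{2}}\sqrt{\sum_{i}\left(g_{i}-\bar{g}\right)^{2}}}

where f_i and g_i are the gray values of the reference and deformed subsets and \bar{f}, \bar{g} their means; values close to 1 indicate a confident match, so a natural pattern whose subsets are too similar to one another produces ambiguous correlation peaks and can cause the matching to fail, which is exactly the failure mode studied above.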
Procedia PDF Downloads 417