Search results for: semisolid metals processing
3968 A Comparative Study on Automatic Feature Classification Methods of Remote Sensing Images
Authors: Lee Jeong Min, Lee Mi Hee, Eo Yang Dam
Abstract:
Geospatial feature extraction is an important issue in remote sensing research. Image classification has traditionally relied on statistical techniques, but in recent years data mining and machine learning techniques for automated image processing have been applied to remote sensing, with the prospect of improved results. In this study, artificial neural network and decision tree techniques are applied to classify high-resolution satellite images, their results are compared with those of maximum likelihood classification (MLC), a statistical technique, and the pros and cons of each technique are analyzed.
Keywords: remote sensing, artificial neural network, decision tree, maximum likelihood classification
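A minimal sketch of the MLC baseline this abstract compares against: a per-band Gaussian maximum likelihood classifier. All class statistics and pixel values below are invented for illustration, not taken from the study's imagery.

```python
import math

# Toy per-band Gaussian maximum likelihood classifier (MLC), the statistical
# baseline compared against machine learning techniques in the abstract.

def train(samples):
    """samples: {class_name: [feature_vectors]} -> per-class (means, variances)."""
    stats = {}
    for cls, vecs in samples.items():
        bands = list(zip(*vecs))
        means = [sum(b) / len(b) for b in bands]
        vars_ = [sum((x - m) ** 2 for x in b) / len(b) + 1e-9
                 for b, m in zip(bands, means)]
        stats[cls] = (means, vars_)
    return stats

def log_likelihood(pixel, means, vars_):
    # Independent-band Gaussian log-likelihood.
    return sum(-0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
               for x, m, v in zip(pixel, means, vars_))

def classify(pixel, stats):
    # Assign the pixel to the class with the highest likelihood.
    return max(stats, key=lambda c: log_likelihood(pixel, *stats[c]))

training = {
    "water":      [[20, 30], [22, 28], [19, 31]],    # invented band values
    "vegetation": [[80, 120], [85, 118], [78, 122]],
}
stats = train(training)
print(classify([21, 29], stats))   # a pixel near the 'water' cluster
```

A real workflow would estimate full covariance matrices per class from training polygons; the independent-band assumption here is only to keep the sketch short.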
Procedia PDF Downloads 345
3967 Quality Characteristics of Treated Wastewater of 'Industrial Area Foggia'
Authors: Grazia Disciglio, Annalisa Tarantino, Emanuele Tarantino
Abstract:
The production system of Foggia province (Apulia, Southern Italy) is characterized by the presence of numerous agro-food industries whose activities include the processing of vegetable products that release large quantities of wastewater. The reuse of these wastewaters in agriculture offers the opportunity to reduce the costs of their disposal and to minimize their environmental impact. In addition, in this area, which suffers from water shortage, the use of agro-industrial wastewater is essential for the very intensive irrigation cropping systems. The present investigation was carried out in 2009 and 2010 to monitor the physico-chemical and microbiological characteristics of the industrial wastewater (IWW) from the secondary treatment plant of the 'Industrial Area of Foggia'. The treatment plant released on average about 567,000 m³ y⁻¹ of IWW, whose distribution was not uniform over the year: the monthly volumes were about 250,000 m³ from November to June and about 90,000 m³ from July to October. The results revealed that the IWW was characterized by low values of Total Suspended Solids (TSS), Biological Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Electrical Conductivity (EC) and Sodium Adsorption Ratio (SAR). An occasional presence of heavy metals and high concentrations of total phosphorus, total nitrogen, ammoniacal nitrogen and microbial organisms (Escherichia coli and Salmonella) were observed. Because these pathogenic microorganisms, and sometimes heavy metals, may raise sanitary and environmental problems for the possible irrigation reuse of this IWW, a tertiary treatment of the wastewater based on filtration and in-line disinfection is recommended. Research on the reuse of treated IWW on crops (olive, artichoke, industrial tomatoes, fennel, lettuce, etc.)
did not show significant differences among the irrigated plots for most of the soil and yield characteristics.
Keywords: agroindustrial wastewater, irrigation, microbiological characteristic, physico-chemical characteristics
Procedia PDF Downloads 314
3966 Prediction, Production, and Comprehension: Exploring the Influence of Salience in Language Processing
Authors: Andy H. Clark
Abstract:
This research looks into the relationship between language comprehension and production, with a specific focus on the role of salience in shaping these processes. Salience, our most immediate perception of what is most probable out of all possible situations and outcomes, strongly affects perception and action in language production and comprehension. This study investigates the impact of geographic and emotional attachment to the target language on differences in the learners' comprehension and production abilities. Using quantitative research methods (Qualtrics, SPSS), the study examines the preferential choices of two groups of Japanese learners of English: those residing in the United States and those in Japan. By comparing and contrasting these two groups, we hope to gain a better understanding of how the salience of linguistic cues influences language processing.
Keywords: intercultural pragmatics, salience, production, comprehension, pragmatics, action, perception, cognition
Procedia PDF Downloads 70
3965 High Pressure Processing of Jackfruit Bulbs: Effect on Color, Nutrient Profile and Enzyme Inactivation
Authors: Jyoti Kumari, Pavuluri Srinivasa Rao
Abstract:
Jackfruit (Artocarpus heterophyllus L.) is an underutilized yet highly nutritious fruit with a unique flavour, known for its therapeutic and culinary properties. The fresh jackfruit bulb has a very short shelf life due to its high moisture and sugar content, leading to microbial spoilage and enzymatic browning that hinder its consumer acceptability and marketability. An attempt has been made to preserve ripe jackfruit bulbs by the application of high pressure (HP) over a range of 200-500 MPa at ambient temperature for dwell times ranging from 5 to 20 min. The physicochemical properties of the jackfruit bulbs, such as pH, TSS, and titratable acidity, were not affected by the pressurization process. The ripening index of the fruit bulbs also decreased following HP treatment. While the ascorbic acid content and antioxidant activity of the jackfruit bulbs were well retained by high pressure processing (HPP), the total phenols and carotenoids showed a slight increase. HPP significantly affected the colour and textural properties of the jackfruit bulbs, and it was highly effective in reducing the browning index in comparison to untreated bulbs. The firmness of the bulbs improved upon pressure treatment with longer dwell times. Polyphenol oxidase was identified as the most prominent oxidative enzyme in the jackfruit bulb. The activities of polyphenol oxidase and peroxidase were significantly reduced, by up to 40%, following treatment at 400 MPa/15 min. HPP of jackfruit bulbs at ambient temperature is thus shown to be highly beneficial in improving shelf stability, retaining the nutrient profile, color, and appearance while ensuring maximum inactivation of the spoilage enzymes.
Keywords: antioxidant capacity, ascorbic acid, carotenoids, color, HPP-high pressure processing, jackfruit bulbs, polyphenol oxidase, peroxidase, total phenolic content
Procedia PDF Downloads 172
3964 The Effect of Speech-Shaped Noise and Speaker's Voice Quality on First-Grade Children's Speech Perception and Listening Comprehension
Authors: I. Schiller, D. Morsomme, A. Remacle
Abstract:
Children's ability to process spoken language develops until the late teenage years. At school, where efficient spoken language processing is key to academic achievement, listening conditions are often unfavorable: high background noise and a poor teacher's voice represent typical sources of interference. It can be assumed that these factors particularly affect primary school children, because their language and literacy skills are still low. While it is generally accepted that background noise and an impaired voice impede spoken language processing, there is an increasing need to analyze their impact within specific linguistic areas. Against this background, the aim of the study was to investigate the effect of speech-shaped noise and an imitated dysphonic voice on first-grade primary school children's speech perception and sentence comprehension. Via headphones, 5- to 6-year-old children, recruited within the French-speaking community of Belgium, listened to and performed a minimal-pair discrimination task and a sentence-picture matching task. Stimuli were randomly presented according to four experimental conditions: (1) normal voice / no noise, (2) normal voice / noise, (3) impaired voice / no noise, and (4) impaired voice / noise. The primary outcome measure was task score. How did performance vary with listening condition? Preliminary results will be presented with respect to speech perception and sentence comprehension and carefully interpreted in the light of past findings. This study helps to support our understanding of children's language processing skills under adverse conditions. The results shall serve as a starting point for probing new measures to optimize children's learning environment.
Keywords: impaired voice, sentence comprehension, speech perception, speech-shaped noise, spoken language processing
Procedia PDF Downloads 191
3963 Methodological Approach for the Prioritization of Different Micro-Contaminants as Potential River Basin Specific Pollutants in the Upper Tisza River Watershed
Authors: Mihail Simion Beldean-Galea, Virginia Coman, Florina Copaciu, Mihaela Vlassa, Radu Mihaiescu, Adina Croitoru, Viorel Arghius, Modest Gertsiuk, Mikola Gertsiuk
Abstract:
Taking into consideration the huge number of chemicals released into environmental compartments, a proper environmental risk assessment is difficult to perform due to gaps in legislation and improper toxicological assessment of chemical compounds. In Romania, as in many other European countries, the chemical status of a water body is characterized on the basis of the Water Framework Directive (WFD) and the substances listed in its Annex X. This Annex includes 45 substances from different classes of organic compounds and heavy metals for which AA-EQS and MAC-EQS values have been established. For other compounds, which are not included in Annex X, different methodologies to prioritize chemicals for risk assessment and monitoring have been proposed. These methodologies take into account Predicted No-Effect Concentrations (PNECs) of different classes of chemical compounds, available from existing risk assessments or from read-across models for acute toxicity to standard test organisms such as Daphnia magna and Selenastrum capricornutum. Our work presents the monitoring results for 30 priority substances, including polyaromatic hydrocarbons, pesticides, halogenated compounds, plasticizers and heavy metals, and for 34 other substances from different classes of pesticides and pharmaceuticals which are not included on the list of priority substances, obtained in the Upper Tisza River Watershed in Romania and Ukraine. The monitoring data were used to establish the list of the most relevant pollutants in the studied area and to identify the potential river basin specific pollutants. For this purpose, two indicators, the Frequency of Exceedance and the Extent of Exceedance of the Predicted No-Effect Concentration (PNEC), were evaluated.
These two indicators are based on the maximum environmental concentrations (MECs) of the priority substances, while for the other pollutants statistically based averages of the measured concentrations are compared to the lowest PNEC thresholds. From the obtained results it can be concluded that polyaromatic hydrocarbons such as Fluoranthene, Benzo[a]pyrene, Benzo[b]fluoranthene, Benzo[k]fluoranthene, Benzo(g,h,i)perylene and Indeno(1,2,3-cd)pyrene, and heavy metals such as Cadmium, Lead and Nickel, can be considered river basin specific pollutants, their concentrations exceeding the Annual Average EQS. Other compounds such as estrone, estriol, 17β-estradiol, naproxen and some antibiotics (Penicillin G, Tetracycline and Ceftazidime) should be taken into account for long-term monitoring, as in some cases their concentrations exceeded the PNEC. Acknowledgements: This work is performed in the frame of the NATO SfP Programme, Project no. 984440.
Keywords: prioritization, river basin specific pollutants, Tisza River, water framework directive
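The two indicators can be sketched as follows; the concentrations and PNEC threshold are invented example values, and the exact statistical definitions used in the study may differ.

```python
# Sketch of the two prioritization indicators described above:
#  - frequency of exceedance: fraction of sites whose maximum measured
#    concentration (MEC) exceeds the lowest PNEC threshold;
#  - extent of exceedance: how far the highest MEC exceeds that PNEC.
# All numbers are invented for illustration, not monitoring data.

def frequency_of_exceedance(site_max_concs, pnec):
    exceeding = sum(1 for c in site_max_concs if c > pnec)
    return exceeding / len(site_max_concs)

def extent_of_exceedance(site_max_concs, pnec):
    return max(site_max_concs) / pnec

mec = [0.8, 1.5, 3.2, 0.4]   # max concentration per site, ug/L (invented)
pnec = 1.0                   # lowest PNEC threshold, ug/L (invented)
print(frequency_of_exceedance(mec, pnec))  # 0.5
print(extent_of_exceedance(mec, pnec))     # 3.2
```

Substances scoring high on both indicators would be candidates for the river basin specific pollutant list.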
Procedia PDF Downloads 303
3962 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules
Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid
Abstract:
Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (deoxyribonucleic acid) are used as active components. DNA computing offers parallel processing capability and a large storage capacity, which distinguish it from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the most time-consuming and lengthy tasks. In this paper, cost-effective DNA multipliers are designed using algorithms of molecular DNA operations and compared with conventional designs. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.
Keywords: biological systems, DNA multiplier, large storage, parallel processing
Procedia PDF Downloads 212
3961 Benchmarking Bert-Based Low-Resource Language: Case Uzbek NLP Models
Authors: Jamshid Qodirov, Sirojiddin Komolov, Ravilov Mirahmad, Olimjon Mirzayev
Abstract:
Nowadays, natural language processing tools play a crucial role in our daily lives, including various text processing techniques. Highly advanced models exist for high-resource languages such as English and Russian, but in some languages, such as Uzbek, NLP models have only been developed recently, and only a few exist. Moreover, there is no work showing how the Uzbek NLP models behave in different situations and when to use them. This work tries to close this gap and compares the Uzbek NLP models existing as of the time this article was written. The authors compare the NLP models in two different scenarios, sentiment analysis and sentence similarity, which are implementations of the two most common problems in the industry: classification and similarity. Another outcome of this work is two datasets for classification and sentence similarity in the Uzbek language, which we generated ourselves and which can be useful in both industry and academia.
Keywords: NLP, benchmark, BERT, vectorization
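The sentence-similarity half of the benchmark reduces to comparing sentence vectors. In the sketch below, a simple bag-of-words vector stands in for the BERT sentence embeddings the benchmarked models would produce, and the Uzbek sentences are invented examples.

```python
import math
from collections import Counter

# Sketch of a sentence-similarity comparison: cosine similarity between
# sentence vectors. Bag-of-words counts stand in for BERT embeddings here.

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

def embed(sentence):
    # Stand-in embedding: word-count vector (a real benchmark would call
    # the model under test to get a dense sentence embedding).
    return Counter(sentence.lower().split())

a = embed("men kitob o'qiyman")      # invented example: "I read a book"
b = embed("men kitob o'qidim")       # invented example: "I read a book (past)"
c = embed("ob-havo bugun sovuq")     # invented example: "the weather is cold today"
print(cosine(a, b) > cosine(a, c))   # True: shared words make a and b closer
```

Swapping `embed` for a model's encoder is all that changes when benchmarking real BERT-based models on a labeled similarity dataset.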
Procedia PDF Downloads 52
3960 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps
Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo
Abstract:
With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphics processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser, running on the CPU, issues GL commands to the GPU for rendering the images to be displayed by the currently running Web app, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes the other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more demanded resource. For example, when the GPU work queue is saturated by outstanding commands, lowering the performance level of the CPU does not affect the user experience, because it is already deteriorated by the retarded execution of GPU commands. Consequently, it is desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach incurs two technical challenges: identification of the bottleneck resource and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which one is the bottleneck, if any. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of the user experience and is purely executed by the CPU.
The proposed scheme uses the weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measured frame-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%, and consequently the proposed scheme decreases the performance level of the CPU by one step. On the contrary, when its utilization is less than 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the GPU. Even for the processing unit that is not on the critical path, an excessive performance drop can adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level, by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, while worsening FPS by only 1.07% on average.
Keywords: interactive applications, power management, QoS, Web apps, WebGL
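The decision logic described above can be sketched as follows, with the paper's 90% and 60% thresholds; the smoothing weight and the exact step actions are our illustrative assumptions, not values from the paper.

```python
# Sketch of the bottleneck-aware decision logic: an exponentially weighted
# average of the Window Manager's CPU utilization selects which processing
# unit to throttle. Thresholds (90% / 60%) follow the abstract; the smoothing
# weight is an assumption.

ALPHA = 0.3  # exponential smoothing weight (assumed)

def smooth(prev_avg, sample, alpha=ALPHA):
    # Weighted average of Window Manager utilization samples, damping spikes.
    return alpha * sample + (1 - alpha) * prev_avg

def decide(wm_util_avg):
    if wm_util_avg > 0.90:
        return "lower CPU one step"   # CPU-bound: screen drawing saturates CPU
    if wm_util_avg < 0.60:
        return "lower GPU gradually"  # GPU is the likely bottleneck
    return "hold"

samples = [0.95, 0.97, 0.96, 0.98]   # invented utilization readings
avg = samples[0]
for s in samples[1:]:
    avg = smooth(avg, s)
print(decide(avg))   # sustained ~95% utilization -> lower CPU one step
```

In the real scheme the "lower GPU gradually" branch would repeat across sampling periods until utilization re-enters the middle band.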
Procedia PDF Downloads 190
3959 Analysis of Long-Term Response of Seawater to Change in CO₂, Heavy Metals and Nutrients Concentrations
Authors: Igor Povar, Catherine Goyet
Abstract:
Seawater is subject to multiple external stressors (ES), including rising atmospheric CO₂ and ocean acidification, global warming, atmospheric deposition of pollutants and eutrophication, which deeply alter its chemistry, often on a global scale and, in some cases, to a degree significantly exceeding that in the historical and recent geological record. In ocean systems, the micro- and macronutrients, heavy metals, and phosphorus- and nitrogen-containing components exist in different forms depending on the concentrations of various other species, organic matter, the types of minerals, the pH, etc. The major limitation to assessing more strictly the ES to oceans, such as pollutants (atmospheric greenhouse gases, heavy metals, and nutrients such as nitrates and phosphates), is the lack of a theoretical approach that could predict the ocean's resistance to multiple external stressors. In order to assess the abovementioned ES, this research has applied and developed the buffer theory approach and theoretical expressions of formal chemical thermodynamics for ocean systems as heterogeneous aqueous systems. Thermodynamic expressions of complex chemical equilibria, involving acid-base, complex formation and mineral equilibria, have been deduced. This thermodynamic approach utilizes thermodynamic relationships coupled with original mass balance constraints in which the solid phases are explicitly expressed. The ocean's sensitivity to different external stressors and changes in driving factors is considered in terms of derived buffering capacities, or buffer factors, for heterogeneous systems. Our investigations have proved that heterogeneous aqueous systems, as oceans and seas are, manifest buffer properties towards all their components, not only towards pH as has been known so far: for example, with respect to carbon dioxide, carbonates, phosphates, Ca²⁺, Mg²⁺, heavy metal ions, etc.
The derived expressions make it possible to attribute changes in chemical ocean composition to different pollutants. These expressions are also useful for improving current atmosphere-ocean-marine biogeochemistry models. The major research questions to which the research responds are: (i) What kind of contamination is most harmful for the future ocean? (ii) What are the chemical heterogeneous processes of heavy metal release from sediments and minerals, and what is their impact on the ocean's buffer action? (iii) What will be the long-term response of the coastal ocean to the oceanic uptake of anthropogenic pollutants? (iv) How will the ocean's resistance change in terms of future complex chemical processes and buffer capacities, and how will it respond to external (anthropogenic) perturbations? The ocean buffer capacities towards its main components are recommended as parameters to be included in determining the most important factors that define the response of the ocean environment to increasing technogenic loads. The deduced thermodynamic expressions are valid for any combination of chemical composition, with any of the species contributing to the total concentration taken as the independent state variable.
Keywords: atmospheric greenhouse gas, chemical thermodynamics, external stressors, pollutants, seawater
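The buffer factors invoked above can be stated, for a generic component, in analogy with the classical pH buffer capacity; the notation below is our sketch, not necessarily the authors' formalism.

```latex
% Generalized buffer capacity of a heterogeneous aqueous system with respect
% to component i (sketch; notation assumed, not taken from the paper):
%   C_i -- total (analytical) concentration of component i
%   a_i -- activity of the free species of i
\beta_i = \frac{\partial C_i}{\partial\,(\mathrm{p}a_i)},
\qquad \mathrm{p}a_i = -\log_{10} a_i
```

The classical pH buffer capacity is recovered as the special case i = H⁺; a large β_i means the system resists changes in the free concentration of component i, which is the sense in which the abstract speaks of buffering towards carbonates, phosphates, Ca²⁺, Mg²⁺ and heavy metal ions.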
Procedia PDF Downloads 143
3958 Improvement of Piezoresistive Pressure Sensor Accuracy by Means of Current Loop Circuit Using Optimal Digital Signal Processing
Authors: Peter A. L’vov, Roman S. Konovalov, Alexey A. L’vov
Abstract:
The paper presents an advanced digital modification of the conventional current loop circuit for piezoresistive pressure transducers. Optimal DSP algorithms, which process the current loop responses by the maximum likelihood method, are applied to diminish measurement errors. The loop circuit has some additional advantages, such as the possibility to operate with any type of resistance or reactance sensor, and a considerable increase in the accuracy and quality of measurements compared with AC bridges. The results obtained are dedicated to replacing high-accuracy and expensive measuring bridges with current loop circuits.
Keywords: current loop, maximum likelihood method, optimal digital signal processing, precise pressure measurement
Procedia PDF Downloads 526
3957 Influence of Thermal Treatments on Ovomucoid as Allergenic Protein
Authors: Nasser A. Al-Shabib
Abstract:
Food allergens are most commonly in a non-native form when exposed to the immune system. Most food proteins undergo various treatments (e.g., thermal or proteolytic processing) during food manufacturing. Such treatments have the potential to alter the chemical structure of food allergens so as to convert them to more denatured or unfolded forms, and the resulting conformational changes may affect the allergenicity of the treated allergens. However, most allergenic proteins possess high resistance to thermal modification and digestive enzymes. In the present study, ovomucoid (a major allergenic protein of egg white) was heated at different temperatures, in different aqueous solutions including phosphate-buffered saline (pH 7.4), and on different surfaces, for various times. The results indicated that different antibody-based methods had different sensitivities in detecting the heated ovomucoid. When using one particular immunoassay, the immunoreactivity of ovomucoid increased rapidly after heating in water, whereas immunoreactivity declined after heating in alkaline buffer (pH 10). Ovomucoid appeared more immunoreactive when dissolved in PBS (pH 7.4) and heated on a stainless steel surface. To the best of our knowledge, this is the first time that antibody-based methods have been applied for the detection of ovomucoid adsorbed onto different surfaces under various conditions. The results obtained suggest that the use of antibodies to detect ovomucoid after food processing may be problematic: false assurance will be given by the use of inappropriate, non-validated immunoassays such as those available commercially as 'Swab' tests. A greater understanding of antibody-protein interaction after processing of a protein is required.
Keywords: ovomucoid, thermal treatment, solutions, surfaces
Procedia PDF Downloads 446
3956 Scheduling in Cloud Networks Using Chakoos Algorithm
Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi
Abstract:
Nowadays, cloud processing is one of the important issues in information technology. Since scheduling of a task graph is an NP-hard problem, approaches based on nondeterministic methods such as evolutionary computation, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm is proposed for scheduling task graphs to obtain an appropriate schedule with minimum completion time. In this algorithm, the new approach is based on shortening the length of the critical path and reducing the cost of communication. The results obtained from the implementation of the presented method show that this algorithm performs the same as other algorithms on graphs without communication costs, and performs quicker and better than algorithms such as DSC and MCP on graphs involving communication costs.
Keywords: cloud computing, scheduling, task graph, chakoos algorithm
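The critical-path notion the algorithm shortens can be sketched as the longest compute-plus-communication path through the task DAG; the small graph below is an invented example, not the paper's benchmark.

```python
# Sketch of critical-path length in a task graph: node weights are compute
# times, edge weights are communication costs between tasks placed on
# different processors. The graph is an invented example.

def critical_path(tasks, edges):
    """tasks: {id: compute_time}; edges: {(u, v): comm_cost} -> longest path."""
    succ = {t: [] for t in tasks}
    indeg = {t: 0 for t in tasks}
    for (u, v), w in edges.items():
        succ[u].append((v, w))
        indeg[v] += 1
    # Topological order (Kahn's algorithm).
    order, ready = [], [t for t in tasks if indeg[t] == 0]
    while ready:
        u = ready.pop()
        order.append(u)
        for v, _ in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    # Longest finish time over all paths.
    finish = {t: tasks[t] for t in tasks}
    for u in order:
        for v, w in succ[u]:
            finish[v] = max(finish[v], finish[u] + w + tasks[v])
    return max(finish.values())

tasks = {"A": 2, "B": 3, "C": 2, "D": 1}
edges = {("A", "B"): 1, ("A", "C"): 4, ("B", "D"): 1, ("C", "D"): 0}
print(critical_path(tasks, edges))   # A -> C -> D dominates: 2 + 4 + 2 + 0 + 1 = 9
```

Schedulers like DSC and the proposed algorithm try to shorten exactly this quantity, for example by zeroing communication edges through co-placement of tasks on the same processor.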
Procedia PDF Downloads 62
3955 Twitter Sentiment Analysis during the Lockdown in New Zealand
Authors: Smah Almotiri
Abstract:
One of the most common fields of natural language processing (NLP) is sentiment analysis. The feeling conveyed in a text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analysis studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. The processing of such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to look at how sentiment differed in a single geographic region during the lockdown at two different times. 1,162 tweets related to the COVID-19 pandemic lockdown were analyzed, using the keyword hashtags #lockdown and #COVID-19; the first sample of tweets was from March 23, 2020, until April 23, 2020, and the second sample, from the following year, was from March 1, 2021, until April 4, 2021. Natural language processing (NLP), a form of artificial intelligence, was used in this research to calculate the sentiment value of all of the tweets using the AFINN lexicon sentiment analysis method. The findings revealed that in the samples of this study, which are specific to the geographical area of New Zealand, the sentiment was positive at both times during the region's lockdown. This research suggests applying machine learning sentiment methods such as CrystalFeel and extending the sample by using more tweets over a longer period of time.
Keywords: sentiment analysis, Twitter analysis, lockdown, Covid-19, AFINN, NodeJS
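AFINN-style scoring is simply a sum of per-word valences. The sketch below uses a tiny hand-made lexicon as a stand-in for the real AFINN list, so the scores are illustrative only.

```python
# Minimal illustration of AFINN-style lexicon scoring: each word carries an
# integer valence in [-5, 5] and a tweet's score is the sum over its words.
# The lexicon below is a tiny invented stand-in for the real AFINN list.

LEXICON = {"good": 3, "great": 3, "happy": 3, "safe": 1,
           "bad": -3, "sad": -2, "fear": -2, "crisis": -3}

def afinn_score(text):
    # Strip common punctuation/hash marks, lowercase, then sum valences;
    # unknown words contribute 0.
    return sum(LEXICON.get(w.strip("#.,!?").lower(), 0) for w in text.split())

print(afinn_score("Feeling safe and happy during #lockdown"))   # 1 + 3 = 4
print(afinn_score("This crisis is bad, so sad"))                # -3 - 3 - 2 = -8
```

A corpus-level result like "sentiment was positive during the lockdown" then corresponds to the mean of these per-tweet scores being above zero.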
Procedia PDF Downloads 189
3954 Leveraging Large Language Models to Build a Cutting-Edge French Word Sense Disambiguation Corpus
Authors: Mouheb Mehdoui, Amel Fraisse, Mounir Zrigui
Abstract:
With the increasing amount of data circulating over the Web, there is a growing need to develop and deploy tools aimed at unraveling the semantic nuances within texts or sentences. The challenge of extracting precise meanings arises from the complexity of natural language, since words usually have multiple interpretations depending on the context. The challenge of precisely interpreting words within a given context is what the task of Word Sense Disambiguation (WSD) addresses. It is a long-standing domain within Natural Language Processing, aimed at determining the meaning a word carries in a particular context, hence increasing the correctness of applications that process language. Numerous linguistic resources are accessible online, including WordNet, thesauri, and dictionaries, enabling the exploration of diverse contextual meanings. However, several limitations persist. These include the scarcity of resources for certain languages, the limited number of examples within corpora, and the challenge of accurately detecting the topic or context covered by a text, which significantly impacts word sense disambiguation. This paper discusses the different approaches to WSD and reviews the corpora available for this task. We contrast these approaches, highlighting their limitations, which will allow us to build a French corpus targeted at WSD.
Keywords: semantic enrichment, disambiguation, context fusion, natural language processing, multilingual applications
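One classic knowledge-based approach of the kind such surveys review is the simplified Lesk algorithm, which picks the sense whose dictionary gloss overlaps most with the context; the mini sense inventory below is invented for illustration.

```python
# Sketch of the simplified Lesk algorithm for WSD: choose the sense whose
# gloss shares the most words with the target word's context. The two-sense
# inventory is a toy example, not a real lexical resource.

SENSES = {
    "bank": {
        "bank#finance": "financial institution that accepts deposits money",
        "bank#river": "sloping land beside a body of water river",
    }
}

def lesk(word, context):
    ctx = set(context.lower().split())
    best, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(ctx & set(gloss.split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(lesk("bank", "he sat on the bank of the river watching the water"))
```

Modern supervised and LLM-based approaches replace the gloss-overlap heuristic with learned context representations, which is precisely why sense-annotated corpora like the French one proposed here are needed.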
Procedia PDF Downloads 3
3953 Ammonia Cracking: Catalysts and Process Configurations for Enhanced Performance
Authors: Frea Van Steenweghen, Lander Hollevoet, Johan A. Martens
Abstract:
Compared to other hydrogen (H₂) carriers, ammonia (NH₃) is one of the most promising, as it contains 17.6 wt% hydrogen and is easily liquefied at ≈ 9-10 bar pressure at ambient temperature. More importantly, NH₃ is a carbon-free hydrogen carrier with no CO₂ emission at final decomposition. Ammonia has a well-defined regulatory framework and a good track record regarding safety. Furthermore, industry already has an existing transport infrastructure consisting of pipelines, tank trucks and shipping technology, as ammonia has been manufactured and distributed around the world for over a century. While NH₃ synthesis and transportation technologies are at hand, the missing link in the hydrogen delivery scheme from ammonia is an energy-lean and efficient technology for cracking ammonia into H₂ and N₂. The most explored option for ammonia decomposition is thermocatalytic cracking, which is the most energy-lean and robust approach compared to other technologies such as plasma and electrolysis. Because the thermocatalytic cracking process faces thermodynamic limitations, the decomposition reaction is favoured only at high temperatures (> 300°C) and low pressures (≈ 1 bar). At 350°C, the thermodynamic equilibrium at 1 bar pressure limits the conversion to 99%; gaining additional conversion, up to e.g. 99.9%, necessitates heating to ca. 530°C. However, actually reaching thermodynamic equilibrium is infeasible, as a sufficient driving force is needed, requiring even higher temperatures. Limiting the conversion below the equilibrium composition is the more economical option. Thermocatalytic ammonia cracking is documented in the scientific literature. Among the investigated metal catalysts (Ru, Co, Ni, Fe, …), ruthenium is known to be the most active for ammonia decomposition, with an onset of cracking activity around 350°C. To establish > 99% conversion, temperatures close to 600°C are required.
Such high temperatures are likely to reduce not only the round-trip efficiency but also the catalyst lifetime, because of sintering of the supported metal phase. In this research, the first focus was on catalyst bed design, avoiding diffusion limitations. Experiments in our packed bed tubular reactor set-up showed that extragranular diffusion limitations occur at low concentrations of NH₃ when reaching high conversion, a phenomenon often overlooked in experimental work. A second focus was thermocatalyst development for ammonia cracking, avoiding the use of noble metals. To this aim, candidate metals and mixtures were deposited on a range of supports. Sintering resistance at high temperatures and the basicity of the support were found to be crucial catalyst properties. The catalytic activity was promoted by adding alkali and alkaline earth metals. A third focus was studying the optimum process configuration by process simulations. A trade-off between conversion and favourable operational conditions (i.e. low pressure and high temperature) may lead to different process configurations, each with its own pros and cons. For example, high-pressure cracking would eliminate the need for post-compression but is detrimental to the thermodynamic equilibrium, leading to an optimum in cracking pressure in terms of energy cost.
Keywords: ammonia cracking, catalyst research, kinetics, process simulation, thermodynamic equilibrium
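The thermodynamic trade-off discussed above can be sketched numerically; the enthalpy and entropy values below are rough literature-style figures assumed by us (not the paper's data), treated as temperature-independent with ideal-gas behaviour.

```python
import math

# Rough sketch of equilibrium conversion for NH3 -> 1/2 N2 + 3/2 H2.
# Assumed per mole NH3 (approximate, T-independent): dH ~ +45.9 kJ/mol,
# dS ~ +99.4 J/(mol K). With conversion x at total pressure P (bar), an
# ideal-gas mole balance gives Kp = (3**1.5 / 4) * x**2 / (1 - x**2) * P,
# solved for x here by bisection.

R = 8.314               # J/(mol K)
DH, DS = 45.9e3, 99.4   # assumed values, not from the paper

def kp(T):
    return math.exp(-(DH - T * DS) / (R * T))

def conversion(T, P):
    k = kp(T)
    lo, hi = 0.0, 1.0 - 1e-12
    for _ in range(200):
        x = (lo + hi) / 2
        f = (3 ** 1.5 / 4) * x * x / (1 - x * x) * P - k
        lo, hi = (x, hi) if f < 0 else (lo, x)
    return x

# Equilibrium conversion rises with temperature and falls with pressure,
# reproducing the trade-off behind the optimum cracking pressure.
print(round(conversion(623, 1), 3))   # 350 C, 1 bar: high, near-complete
print(round(conversion(873, 1), 3))   # 600 C, 1 bar: higher still
print(round(conversion(623, 5), 3))   # 350 C, 5 bar: noticeably lower
```

With these rough data the model lands close to, but not exactly on, the 99%-at-350°C figure quoted above; precise values require temperature-dependent thermochemical data.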
Procedia PDF Downloads 65
3952 Performance of Hybrid Image Fusion: Implementation of Dual-Tree Complex Wavelet Transform Technique
Authors: Manoj Gupta, Nirmendra Singh Bhadauria
Abstract:
Most applications in image processing require high spatial and high spectral resolution in a single image; for example, satellite imaging systems, traffic monitoring systems, and long-range sensor fusion systems all use image processing. However, most of the available equipment is not capable of providing this type of data. The sensor in a surveillance system can only cover the view of a small area for a particular focus, yet the demanding applications of such systems require a view with high coverage of the field. Image fusion provides the possibility of combining different sources of information. In this paper, we decompose the images using the DT-CWT, fuse them using average and hybrid (maxima and average) pixel-level techniques, and then compare the quality of the fused images using PSNR.
Keywords: image fusion, DWT, DT-CWT, PSNR, average image fusion, hybrid image fusion
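The average-fusion rule and the PSNR comparison can be sketched on toy data; the tiny "images" below are invented, and a real evaluation would fuse the DT-CWT subbands of full-size images rather than raw pixels.

```python
import math

# Sketch of the average pixel-level fusion rule and the PSNR metric used to
# compare fused images. Tiny grayscale "images" (lists of pixel rows) stand
# in for real test images.

def fuse_average(img_a, img_b):
    return [[(a + b) / 2 for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def psnr(reference, test, peak=255.0):
    n = sum(len(r) for r in reference)
    mse = sum((x - y) ** 2
              for rr, rt in zip(reference, test)
              for x, y in zip(rr, rt)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

a = [[100, 110], [120, 130]]    # e.g. the spatially sharp source (invented)
b = [[104, 114], [124, 134]]    # e.g. the spectrally rich source (invented)
fused = fuse_average(a, b)      # [[102.0, 112.0], [122.0, 132.0]]
print(psnr(a, fused))           # every pixel off by 2 -> ~42.1 dB
```

The hybrid rule in the paper applies maxima selection to some subbands and averaging to others; the PSNR comparison works identically whichever rule produced the fused image.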
Procedia PDF Downloads 6023951 Adsorption of Lead (II) and Copper (II) Ions onto Marula Nuts Activated Carbon
Authors: Lucky Malise, Hilary Rutto, Tumisang Seodigeng
Abstract:
Heavy metal contamination of wastewater is a serious issue affecting many industrialized countries because of the health and environmental impacts of these metals on human life and the ecosystem. Adsorption using activated carbon is among the most promising methods for removing heavy metals from wastewater, but commercial activated carbon is expensive, which creates a need for alternative activated carbons derived from cheap precursors such as agricultural wastes or byproducts of other processes. In this study, activated bio-carbon derived from the carbonaceous material obtained from the pyrolysis of Marula nut shells was chemically activated and used as an adsorbent for the removal of lead (II) and copper (II) ions from aqueous solution. The surface morphology and chemistry of the adsorbent before and after chemical activation by zinc chloride impregnation were studied using SEM and FTIR analysis, respectively; the results indicate that chemical activation with zinc chloride improves the surface morphology of the adsorbent and enhances the intensity of the surface oxygen complexes. The effects of process parameters such as adsorbent dosage, solution pH, initial metal concentration, contact time, and temperature on the adsorption of lead (II) and copper (II) ions onto Marula nut activated carbon were investigated, and the optimum operating conditions were determined. The experimental data were fitted to both the Langmuir and Freundlich isotherm models, and for both metal ions the Freundlich model gave the better fit. The adsorption kinetics were also evaluated, and the experimental data fit the pseudo-first-order kinetic model better than the pseudo-second-order kinetic model.
The adsorption thermodynamics were also studied; the results indicate that the adsorption of lead and copper ions is spontaneous, exothermic, and feasible, and involves a dissociative mechanism in the temperature range of 25-45 °C.Keywords: adsorption, isotherms, kinetics, marula nut shells activated carbon, thermodynamics
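As an illustration of the isotherm fitting described above, the sketch below fits the linearized Freundlich model to synthetic equilibrium data; the data and the parameter values (Kf = 2.0, 1/n = 0.5) are assumptions for the example, not values measured in the study:

```python
import numpy as np

# Hypothetical equilibrium data (Ce: mg/L, qe: mg/g), generated for
# illustration from a Freundlich isotherm qe = Kf * Ce**(1/n).
Ce = np.array([1.0, 4.0, 9.0, 16.0, 25.0])
qe = 2.0 * Ce**0.5

# Linearized Freundlich: ln(qe) = ln(Kf) + (1/n) * ln(Ce),
# so a straight-line fit in log-log space recovers both parameters.
slope, intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
Kf, inv_n = np.exp(intercept), slope
```

The Langmuir model is handled analogously through its own linearization (e.g. Ce/qe versus Ce), and the model with the higher correlation coefficient is taken as the better fit.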
Procedia PDF Downloads 2683950 An Experimental Study on the Variability of Nonnative and Native Inference of Word Meanings in Timed and Untimed Conditions
Authors: Swathi M. Vanniarajan
Abstract:
Reading research suggests that online contextual vocabulary comprehension while reading is an interactive and integrative process. Success in it depends on a variety of factors, including the amount and nature of available linguistic and nonlinguistic cues, the reader's analytical and integrative skills, schema memory (content familiarity), and processing speed, characterized along the continuum from controlled to automatic processing. The experiment reported here was conducted with 30 native speakers as one group and 30 nonnative speakers as another (all graduate students). It hypothesized that, on 24 tasks requiring subjects to comprehend an unfamiliar word in real time without backtracking, the nonnative subjects would be less able than the native subjects to construct the meanings of the unknown words by integrating the multiple but sufficient contextual cues provided in the text, owing to differences in the nature of their respective reading processes. The results indicated significant inter-group as well as intra-group differences in the quality of the definitions given. However, when given additional time, the nonnative speakers significantly improved the quality of their definitions while the native speakers in general did not, suggesting that, all things being equal, time is a significant factor for success in nonnative vocabulary and reading comprehension and that accuracy precedes automaticity in the development of nonnative reading processes as well.Keywords: reading, second language processing, vocabulary comprehension
Procedia PDF Downloads 1653949 Perceiving Text-Worlds as a Cognitive Mechanism to Understand Surah Al-Kahf
Authors: Awatef Boubakri, Khaled Jebahi
Abstract:
Using Text World Theory (TWT), we attempted to understand how mental representations (text worlds) and perceptions can be construed by readers of Quranic texts. To this end, Surah Al-Kahf was purposefully selected, given that as each of its stories is narrated, different levels of discourse intervene, which can leave readers confused about which discourse they are processing. This surah was studied using specifically designed text-world diagrams. The findings suggest that TWT can help solve problems of ambiguity at the level of discourse in Quranic texts and help construct a thinking reader whose cognitive constructs (text worlds / mental representations) are built through reflecting on the various and often changing components of the discourse world, text world, and sub-worlds.Keywords: Al-Kahf, Surah, cognitive, processing, discourse
Procedia PDF Downloads 863948 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements
Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria
Abstract:
The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design office engineers make intensive use of plate modelling, for which automatic post-processing is available; plate models of massive areas, however, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative to automatic post-processing of plates, optimal modelling and a significant improvement in the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly dependent on the judgment of the calculation engineer. The tool developed here will support engineers in their choice of structure. The method implemented consists of defining a ground structure built from the principal stresses resulting from an elastic analysis of the structure, and then optimizing this structure according to the fully stressed design method. The first results yield a coherent initial network of connecting struts and ties, consistent with the cases reported in the literature. The tool will then be extended to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, including cyclic and dynamic loads. In addition, to satisfy constructability constraints, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.Keywords: strut and tie, optimization, reinforcement, massive structure
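The sizing step of the fully stressed design method can be illustrated on a toy, statically determinate ground structure; the geometry, load, and allowable stress below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Toy ground structure: a node at the origin loaded downward, tied to two
# anchors at (-1, 1) and (1, 1).  All values are illustrative assumptions.
anchors = np.array([[-1.0, 1.0], [1.0, 1.0]])
load = np.array([0.0, -100e3])          # 100 kN downward (N)
sigma_allow = 250e6                     # allowable stress (Pa)

# Unit vectors from the loaded node toward each anchor.
units = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)

# Node equilibrium: sum(T_i * u_i) + load = 0  ->  solve for member forces.
T = np.linalg.solve(units.T, -load)

# Fully stressed sizing: each member area set so |sigma| = sigma_allow.
areas = np.abs(T) / sigma_allow
```

In a statically determinate structure the member forces are fixed by equilibrium, so fully stressed sizing is a single step; on an indeterminate ground structure the tool would iterate, with low-force members shrinking toward zero area and dropping out of the lattice.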
Procedia PDF Downloads 1393947 Automated End-to-End Pipeline Processing Solution for Autonomous Driving
Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi
Abstract:
Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research into making robust, reliable, and intelligent programs that can perceive and understand their environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the quality and volume of data directly affect the performance of the system. Researchers have to design a preprocessing pipeline for each dataset, with its particular sensor orientations and alignments, before the dataset can be fed to the model. This paper proposes a solution that unifies data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new datasets or in-house generated datasets. The solution also automates the complete deep learning pipeline from preprocessing to post-processing for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing
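The unification step based on intrinsic and extrinsic parameters can be sketched as a standard rigid transform followed by a pinhole projection. The calibration values below are assumed for illustration, and the function name is hypothetical, not part of the proposed solution's API:

```python
import numpy as np

def lidar_to_pixels(points, R, t, K):
    """Project lidar points into a camera image using extrinsics (R, t)
    and intrinsics K -- the kind of unification step described above.
    points: (N, 3) array in the lidar frame."""
    cam = points @ R.T + t                 # lidar frame -> camera frame
    cam = cam[cam[:, 2] > 0]               # keep points in front of the camera
    uv = cam @ K.T                         # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]          # perspective divide -> pixel coords

# Illustrative calibration (assumed values, not from any real sensor rig).
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0,   0.0,  1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
```

With every dataset's sensors expressed this way, the same downstream pipeline can consume point clouds from any source.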
Procedia PDF Downloads 1203946 A Pull-Out Fiber/Matrix Interface Characterization of Vegetal Fibers Reinforced Thermoplastic Polymer Composites, the Influence of the Processing Temperature
Authors: Duy Cuong Nguyen, Ali Makke, Guillaume Montay
Abstract:
This work presents an improved single-fiber pull-out test for fiber/matrix interface characterization. The test was used to study the interfacial shear strength (IFSS) of hemp fibers reinforcing polypropylene (PP). To this end, the fiber diameter was carefully measured using a tomography-inspired method, so that the fiber section contour can be approximated by either a circle or a polygon. The results show that the IFSS is overestimated if the circular approximation is used. The influence of the molding temperature on the IFSS was also studied: a molding temperature of 183 °C leads to the best interface properties, while above or below this temperature the interface strength is reduced.Keywords: composite, hemp, interface, pull-out, processing, polypropylene, temperature
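The effect of the contour approximation on the apparent IFSS can be sketched as follows; the contour shape, debond force, and embedded length are hypothetical values used only for illustration:

```python
import numpy as np

def ifss(F_max, perimeter, L_embed):
    """Apparent interfacial shear strength: debond force over bonded area."""
    return F_max / (perimeter * L_embed)

# Hypothetical non-circular fiber cross-section (ellipse-like contour,
# semi-axes 30 and 15 micrometers) -- illustrative, not measured data.
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
contour = np.column_stack([30e-6 * np.cos(theta), 15e-6 * np.sin(theta)])

# Polygon perimeter and (shoelace) area from the sampled contour.
seg = np.diff(np.vstack([contour, contour[:1]]), axis=0)
perim_poly = np.sum(np.hypot(seg[:, 0], seg[:, 1]))
x, y = contour[:, 0], contour[:, 1]
area = 0.5 * abs(np.sum(x * np.roll(y, -1) - y * np.roll(x, -1)))

# An equal-area circle has the smallest possible perimeter (isoperimetric
# inequality), so the circular approximation shrinks the bonded area.
perim_circle = 2.0 * np.sqrt(np.pi * area)

F, L = 0.1, 200e-6                       # assumed debond force (N), length (m)
ifss_circle = ifss(F, perim_circle, L)
ifss_poly = ifss(F, perim_poly, L)
```

Because the equal-area circle always has the smaller perimeter, the circular approximation underestimates the bonded area and therefore overestimates the IFSS, consistent with the result reported above.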
Procedia PDF Downloads 3903945 Quantification of Peptides (linusorbs) in Gluten-free Flaxseed Fortified Bakery Products
Authors: Youn Young Shim, Ji Hye Kim, Jae Youl Cho, Martin JT Reaney
Abstract:
Flaxseed (Linum usitatissimum L.) is gaining popularity in the food industry as a superfood due to its health-promoting properties. Linusorbs (LOs, a.k.a. cyclolinopeptides) are bioactive compounds present in flaxseed with potential health effects. This study focused on the effects of processing and storage on the stability of flaxseed-derived LOs added to various bakery products. Flaxseed meal-fortified gluten-free (GF) bakery bread was prepared, and the changes in LOs during the bread-making process (meal, fortified flour, dough, and bread) and storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were analyzed by high-performance liquid chromatography with diode array detection. The total oxidative LOs and LO1OB2 remained largely stable in flaxseed meal at storage temperatures of 22−23 °C, −18 °C, and 4 °C for up to four weeks. Processing steps during GF-bread production resulted in the oxidation of LOs. Interestingly, no LOs were detected in the dough sample; however, LOs appeared when the dough was stored at −18 °C for one week, suggesting that freezing disrupted the sticky structure of the dough and resulted in the release of LOs. The final product, flaxseed meal-fortified bread, could be stored for up to four weeks at −18 °C and 4 °C, and for one week at 22−23 °C. All these results suggest that LOs may change during processing and storage and that flaxseed flour-fortified bread should be stored at low temperatures to preserve the effective LO components.Keywords: linum usitatissimum L., flaxseed, linusorb, stability, gluten-free, peptides, cyclolinopeptide
Procedia PDF Downloads 1783944 Use of Satellite Imaging to Understand Earth’s Surface Features: A Roadmap
Authors: Sabri Serkan Gulluoglu
Abstract:
With Geographic Information Systems (GIS), information about natural and artificial resources on the Earth can be obtained from satellite images acquired by remote sensing techniques. However, identifying unknown resources, mapping their distribution, and evaluating them efficiently may not be possible with the original image alone. For these reasons, process steps such as transformation, pre-processing, image enhancement, and classification are needed to provide the most accurate assessment, both numerically and visually. Many studies presenting the phases of obtaining and processing satellite images were examined in the literature review. The research showed that a common, well-defined sequence of process steps for this subject can help the necessary future studies progress rapidly.Keywords: remote sensing, satellite imaging, GIS, computer science, information
Procedia PDF Downloads 3173943 Trabecular Texture Analysis Using Fractal Metrics for Bone Fragility Assessment
Authors: Khaled Harrar, Rachid Jennane
Abstract:
The purpose of this study is the discrimination of 28 postmenopausal women with osteoporotic femoral fractures from an age-matched control group of 28 women using texture analysis based on fractals. Two pre-processing approaches were applied to the radiographic images and compared to motivate the choice of pre-processing method. Furthermore, the values of the fractal dimension were compared to those of the fractal signature in terms of the classification of the two populations. In a second analysis, the BMD measured at the proximal femur was compared to the fractal analysis; the latter, a non-invasive technique, allowed better discrimination. The results confirm that fractal analysis of texture on calcaneus radiographs is able to discriminate osteoporotic patients with femoral fracture from controls. This discrimination was more efficient than that obtained by BMD alone, and it persisted when comparing subgroups with overlapping BMD values.Keywords: osteoporosis, fractal dimension, fractal signature, bone mineral density
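A minimal box-counting estimate of the fractal dimension, of the kind underlying such texture analysis, can be sketched as follows; a synthetic straight line (dimension 1) is used as a sanity check, and this is not the authors' exact estimator:

```python
import numpy as np

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image by box counting:
    the slope of log N(s) versus log(1/s), where N(s) is the number of
    s-by-s boxes containing at least one foreground pixel."""
    n = img.shape[0]
    counts = []
    for s in sizes:
        blocks = img[:n - n % s, :n - n % s].reshape(n // s, s, n // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check on a synthetic pattern: a straight line has dimension ~1.
img = np.zeros((64, 64), dtype=bool)
img[np.arange(64), np.arange(64)] = True
```

The fractal signature extends this idea by tracking how the estimate varies with scale rather than reducing it to a single slope.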
Procedia PDF Downloads 4233942 The Application of the Taguchi Method to Optimize Pellet Quality in Broiler Feeds
Authors: Reza Vakili
Abstract:
The aim of this experiment was to optimize pellet quality in broiler feed with respect to moisture content, production rate, grain particle size, and steam conditioning temperature, using the Taguchi method with a 4³ fractional factorial arrangement. Different levels of production rate, steam conditioning temperature, particle size, and moisture content were tested. During the production process, sampling was done, and the pellet durability index (PDI) and hardness were then evaluated in broiler grower and finisher feed. There was a significant effect of the processing parameters on PDI and hardness. Based on the results of this experiment, the Taguchi method can be used to find the combination of factors that gives optimal pellet quality.Keywords: broiler, feed physical quality, hardness, processing parameters, PDI
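Taguchi analysis typically ranks factor combinations by a signal-to-noise ratio. For a larger-the-better response such as PDI, a minimal sketch (with hypothetical replicate values, not the study's measurements) is:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-the-better signal-to-noise ratio (dB), suitable for
    responses like PDI where higher values are preferred."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical replicate PDI measurements (%) for one factor combination.
pdi_runs = [90.0, 92.0, 95.0]
```

The factor combination maximizing the mean S/N ratio across replicates is taken as the optimum; hardness could be treated analogously with a nominal-the-best ratio.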
Procedia PDF Downloads 1843941 Nickel Removal from Industrial Wastewater by Eucalyptus Leaves and Poplar Ashes
Authors: Negin Bayat, Nahid HasanZadeh
Abstract:
Effluents from industries such as metalworking, battery manufacturing, and mining contain heavy metals and are problematic for both humans and the environment. These heavy metals include cadmium, copper, zinc, nickel, chromium, and lead, as well as cyanide. Different physicochemical and biological methods are used to remove heavy metals, such as sedimentation, coagulation, flotation, chemical precipitation, filtration, membrane processes (reverse osmosis and nanofiltration), ion exchange, biological treatment, adsorption with activated carbon, etc. These methods are generally either expensive or ineffective. In recent years, considerable attention has been given to the removal of heavy metal ions from solution by adsorption onto discarded and low-cost materials. In this applied study, nickel removal by an adsorption process using powdered eucalyptus leaves and poplar ash was investigated. The effect of various parameters on metal removal, such as pH, amount of adsorbent, contact time, and stirring speed, was studied using a discontinuous (batch) method. The research was conducted in aqueous solutions at laboratory scale, and optimum adsorption conditions were obtained; the study was then repeated on real wastewater samples. In addition, the nickel concentration in the wastewater was measured before and after the adsorption process. In all experiments, the remaining nickel was measured using an atomic absorption spectrometry device at 382 nm wavelength after an appropriate time and filtration. The results showed that increasing both the adsorbent dose and the pH increases the metal removal rate. Nickel removal increased during the first 60 minutes; the adsorption rate then remained constant and reached equilibrium. The desired removal rate was observed with 40 mg of adsorbent in 100 ml of solution at pH = 9.5.
According to the obtained results, the best removal in this study, equal to 99.76%, was observed at a 40 mg dose of the combined eucalyptus leaf and poplar ash adsorbent. Thus, this combined method can be used as an inexpensive and effective adsorbent for the removal of nickel from aqueous solutions.Keywords: adsorption, wastewater, nickel, poplar ash, eucalyptus leaf, treatment
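The reported removal percentage follows from the standard efficiency formula. The concentrations below are hypothetical values chosen only to reproduce the 99.76% figure, since the measured concentrations are not given in the abstract:

```python
def removal_efficiency(c0, ce):
    """Percent removal from initial (c0) and residual (ce) concentrations,
    e.g. in mg/L from atomic absorption spectrometry readings."""
    return (c0 - ce) / c0 * 100.0

# Hypothetical concentrations (mg/L) -- illustrative only.
c0, ce = 100.0, 0.24
```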
Procedia PDF Downloads 183940 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)- Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded system-on-a-chip (SoC) devices have included coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to a heterogeneous architecture coupling these three processors. The CPU and GPU offload some computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator, because development approaches supporting the collaboration of heterogeneous processors face challenges. Therefore, a systematic approach is needed that takes advantage of write-once-run-anywhere portability and the high execution performance of modules mapped to the various architectures, and that facilitates the exploration of the design space. In this paper, a servant-execution-flow model is proposed as an abstraction of the cooperation of the heterogeneous processors, supporting task partition, communication, and synchronization. At its first run, the intermediate language, represented by a data flow diagram, can generate the executable code of the target processor or can be converted into high-level programming languages. Instantiation parameters efficiently control the relationship between the modules and the computational units, including the mapping of two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching is analyzed with implementations on various combinations of these processors.
The experimental results show that the heterogeneous computing system achieves performance and energy efficiency similar to the pure-FPGA implementation while using less than 35% of its resources.Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
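Contrast stretching, named above as one of the case-study algorithms, can be sketched on the CPU side as a simple linear min-max stretch; this is an illustrative baseline, not the authors' implementation on any of the three processors:

```python
import numpy as np

def contrast_stretch(img, out_max=255.0):
    """Linear min-max contrast stretch: rescale pixel intensities so the
    darkest pixel maps to 0 and the brightest to out_max."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.zeros_like(img)          # flat image: nothing to stretch
    return (img - lo) / (hi - lo) * out_max
```

A kernel this regular and data-parallel is exactly the kind of module that the proposed model could map interchangeably to the CPU, GPU, or FPGA.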
Procedia PDF Downloads 1163939 Modification of Polymer Composite Based on Electromagnetic Radiation
Authors: Ananta R. Adhikari
Abstract:
In today's era, polymer composite utilization has increased significantly across many fronts of material science. Despite the development of many highly sophisticated technologies for modifying polymer composites, there remains a need for a technology that is straightforward, energy-efficient, easily controllable, cost-effective, time-saving, and environmentally friendly. Microwave technology has emerged as a major technique in material synthesis and modification due to its unique characteristics, such as rapid, selective, uniform heating and, in particular, direct heating based on molecular interaction. This study examines the utilization of microwave energy as an alternative technique for material processing, focusing on ongoing research conducted in our laboratory on its applications in the medical field.Keywords: polymer composites, material processing, microstructure, microwave radiation
Procedia PDF Downloads 43