Search results for: hot processing windows
2296 Collect Meaningful Information about Stock Markets from the Web
Authors: Saleem Abuleil, Khalid S. Alsamara
Abstract:
Events represent a significant source of information on the web; they deliver information about occurrences around the world in all kinds of subjects and areas. These events can be collected and organized to provide valuable and useful information for decision makers, researchers, as well as any person seeking knowledge. In this paper, we discuss ongoing research targeting the stock markets domain to observe and record changes (events) when they happen, collect them, understand the meaning of each one of them, and organize the information along with its meaning in a well-structured format. By using the Semantic Role Labeling (SRL) technique, we identified four factors for each event: the verb of action and three roles associated with it, namely entity name, attribute, and attribute value. We have generated a set of rules and techniques to support our approach to analyzing and understanding the meaning of the events taking place in stock markets.
Keywords: natural language processing, Arabic language, event extraction and understanding, semantic role labeling, stock market
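To illustrate the event structure this abstract describes (verb of action plus entity, attribute, and value), here is a minimal sketch using a hypothetical regular-expression rule over an English example sentence; the paper itself works on Arabic text with a full SRL pipeline, so the pattern, entities, and example below are assumptions for illustration only.

```python
import re
from dataclasses import dataclass

@dataclass
class MarketEvent:
    verb: str        # verb of action
    entity: str      # entity name (e.g., a listed company or index)
    attribute: str   # the attribute that changed
    value: str       # the attribute value

# Hypothetical pattern for sentences like "Acme Corp share price rose to 12.4 dinars".
PATTERN = re.compile(
    r"(?P<entity>[A-Z][\w ]+?) (?P<attribute>share price|trading volume) "
    r"(?P<verb>rose|fell|closed) (?:to|at) (?P<value>[\d.]+ \w+)"
)

def extract_event(sentence: str):
    m = PATTERN.search(sentence)
    if m is None:
        return None
    return MarketEvent(m["verb"], m["entity"].strip(), m["attribute"], m["value"])

print(extract_event("Acme Corp share price rose to 12.4 dinars"))
```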
Procedia PDF Downloads 393
2295 Study on Fabrication of Surface Functional Micro and Nanostructures by Femtosecond Laser
Authors: Shengzhu Cao, Hui Zhou, Gan Wu, Lanxi Wanhg, Kaifeng Zhang, Rui Wang, Hu Wang
Abstract:
Functional micro and nanostructures can endow a material surface with unique properties such as super-absorptance, hydrophobicity and drag reduction. Recently, femtosecond laser ablation has been demonstrated to be a promising technology for the fabrication of surface functional micro and nanostructures. In this paper, using the femtosecond laser ablation processing technique, we fabricated functional micro and nanostructures on Ti and Al alloy surfaces. Test results showed that the processed surfaces have 82%~96% absorptance over a broad wavelength range from ultraviolet to infrared. The surface functional properties, which are determined by the micro and nanostructures, can be modulated by varying the laser parameters. These functional surfaces may find applications in such areas as photonics, plasmonics, spaceborne devices, thermal radiation sources, solar energy absorbers and biomedicine.
Keywords: surface functional, micro and nanostructures, femtosecond laser, ablation
Procedia PDF Downloads 368
2294 Solid Polymer Electrolyte Membranes Based on Siloxane Matrix
Authors: Natia Jalagonia, Tinatin Kuchukhidze
Abstract:
Polymer electrolytes (PE) play an important part in electrochemical devices such as batteries and fuel cells. To achieve optimal performance, the PE must maintain a high ionic conductivity and mechanical stability at both high and low relative humidity. The polymer electrolyte also needs to have excellent chemical stability for longevity and robustness. According to the prevailing theory, ionic conduction in polymer electrolytes is facilitated by the large-scale segmental motion of the polymer backbone, and primarily occurs in the amorphous regions of the polymer electrolyte. Crystallinity restricts polymer backbone segmental motion and significantly reduces conductivity. Consequently, polymer electrolytes with high conductivity at room temperature have been sought through polymers which have highly flexible backbones and largely amorphous morphology. Interest in polymer electrolytes has also been increased by potential applications of solid polymer electrolytes in high energy density solid state batteries, gas sensors and electrochromic windows. A conductivity of 10⁻³ S/cm is commonly regarded as a necessary minimum value for practical applications in batteries. At present, polyethylene oxide (PEO)-based systems are the most thoroughly investigated, reaching room temperature conductivities of 10⁻⁷ S/cm in some cross-linked salt-in-polymer systems based on amorphous PEO-polypropylene oxide copolymers. It is widely accepted that amorphous polymers with low glass transition temperatures Tg and a high segmental mobility are important prerequisites for high ionic conductivities. Another necessary condition for high ionic conductivity is a high salt solubility in the polymer, which is most often achieved by donors such as ether oxygen or imide groups on the main chain or on the side groups of the PE. It is also well established that lithium ion coordination takes place predominantly in the amorphous domain, and that the segmental mobility of the polymer is an important factor in determining the ionic mobility. Great attention has been paid to PEO-based amorphous electrolytes obtained by the synthesis of comb-like polymers, attaching short ethylene oxide unit sequences to an existing amorphous polymer backbone. The aim of the presented work is to obtain solid polymer electrolyte membranes using PMHS as a matrix. For this purpose, the hydrosilylation reactions of α,ω-bis(trimethylsiloxy)methylhydrosiloxane with allyl triethylene-glycol monomethyl ether and vinyltriethoxysilane, at a 1:28:7 ratio of the initial compounds, in the presence of Karstedt's catalyst, platinum hydrochloric acid (0.1 M solution in THF) and a platinum-on-carbon catalyst in 50% solution of anhydrous toluene, have been studied. The synthesized oligomers are vitreous liquid products, which are well soluble in organic solvents, with specific viscosity ηsp ≈ 0.05 - 0.06. The synthesized oligomers were analysed with FTIR and 1H, 13C, 29Si NMR spectroscopy. The synthesized polysiloxanes were investigated with wide-angle X-ray, gel-permeation chromatography, and DSC analyses. Solid polymer electrolyte membranes have been obtained via sol-gel processing of polymer systems doped with lithium trifluoromethylsulfonate (triflate) or lithium bis(trifluoromethylsulfonyl)imide. The dependence of ionic conductivity on temperature and salt concentration was investigated, and the activation energies of conductivity for all obtained compounds are calculated.
Keywords: synthesis, PMHS, membrane, electrolyte
Procedia PDF Downloads 257
2293 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for the work of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops can favor their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. Models printed in PLA are limited to minimum sizes of approximately 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, it is not the most suitable technology for artistic casting processes of smaller pieces. One alternative that overcomes the size limitation is selective laser sintering (SLS) printers; another possibility, in which a laser hardens metal powder layer by layer, is called DMLS (Direct Metal Laser Sintering). However, due to its high cost, it is a technology that is difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolutions for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes), and can print models with castable resins that allow subsequent direct artistic casting in precious metals or their adaptation to processes such as electroforming. In this work, the design of a DLP 3D printer is detailed, using backlit LCD screens with ultraviolet light. Its development is totally open source and it is proposed as a kit made up of electronic components, based on Arduino, and mechanical components that are easy to find on the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result costs less than 500 Euros, offers high resolution, and is an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out different comparative analyses, which will allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
Procedia PDF Downloads 142
2292 Bio-Desalination and Bioremediation of Agroindustrial Wastewaters Using Yarrowia Lipolytica
Authors: Selma Hamimed, Abdelwaheb Chatti
Abstract:
The current study deals with the biological treatment of saline wastewaters generated by various agro-food industries using Yarrowia lipolytica. The ability of this yeast was studied on a mixture of olive mill wastewater and tuna wash processing wastewater. Results showed that a high proportion of olive mill wastewater in the mixture (75:25) is the most suitable for the highest Y. lipolytica biomass production, reaching 11.3 g L⁻¹ after seven days. In addition, results showed significant removal of chemical oxygen demand (COD) and phosphorous of 97.49% and 98.90%, respectively. On the other hand, Y. lipolytica was found to be effective in desalinating all mixtures, reaching a removal of 92.21%. Moreover, the analytical results using Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM) confirmed the biosorption of NaCl on the surface of the yeast in the form of nanocrystals with a size of 47.3 nm.
Keywords: nanocrystallization of NaCl, desalination, wastewater treatment, Yarrowia lipolytica
Procedia PDF Downloads 187
2291 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most frequently observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters such as geometry effects. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool in order to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of fireball dynamics is based on single-step combustion using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux and lifetime) issued from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various steps of the BLEVE phenomenon from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement with the BAM experiment data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company located in the Hassi R'Mel gas field (the largest gas field in Algeria).
Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
Procedia PDF Downloads 186
2290 Genetic Algorithm Optimization of Microcantilever Based Resonator
Authors: Manjula Sutagundar, B. G. Sheeparamatti, D. S. Jangamshetti
Abstract:
Micro Electro Mechanical Systems (MEMS) resonators have shown the potential to replace quartz crystal technology for sensing and high frequency signal processing applications because of inherent advantages like small size, high quality factor, low cost, and compatibility with integrated circuit chips. This paper presents the optimization, modelling and simulation of a micro cantilever resonator. The objective of the work is to optimize the dimensions of a micro cantilever resonator for a specified range of resonant frequency and a specific quality factor. Optimization is carried out using a genetic algorithm implemented in MATLAB. The micro cantilever resonator is modelled in CoventorWare using the optimized dimensions obtained from the genetic algorithm, and the modelled cantilever is analysed for its resonance frequency.
Keywords: MEMS resonator, genetic algorithm, modelling and simulation, optimization
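The abstract does not give the fitness function or algorithm settings, so the following is a minimal sketch, in Python rather than the authors' MATLAB, of how such an optimization might look. It assumes the analytical first-mode frequency of a rectangular cantilever, f = (1.875²/2π)·(t/L²)·√(E/12ρ), optimizes only toward a hypothetical target frequency (the quality-factor term is omitted), and uses illustrative silicon properties and dimension bounds.

```python
import numpy as np

E, RHO = 169e9, 2330.0                 # assumed Si Young's modulus (Pa) and density (kg/m^3)
F_TARGET = 50e3                        # hypothetical target resonant frequency (Hz)
BOUNDS = np.array([[50e-6, 500e-6],    # length L (m)
                   [1e-6, 10e-6]])     # thickness t (m)

def resonant_freq(L, t):
    # first flexural mode of a rectangular cantilever (Euler-Bernoulli beam)
    return (1.875**2 / (2 * np.pi)) * (t / L**2) * np.sqrt(E / (12 * RHO))

def fitness(pop):
    return np.abs(resonant_freq(pop[:, 0], pop[:, 1]) - F_TARGET)

rng = np.random.default_rng(0)
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(60, 2))
for _ in range(200):
    cost = fitness(pop)
    parents = pop[np.argsort(cost)[:30]]                 # selection: keep the best half
    kids = (parents[rng.integers(0, 30, 30)] +
            parents[rng.integers(0, 30, 30)]) / 2        # arithmetic crossover
    kids += rng.normal(0, 0.02, kids.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])  # mutation
    pop = np.clip(np.vstack([parents, kids]), BOUNDS[:, 0], BOUNDS[:, 1])

best = pop[np.argmin(fitness(pop))]
print(f"L = {best[0]*1e6:.1f} um, t = {best[1]*1e6:.2f} um, "
      f"f = {resonant_freq(*best)/1e3:.2f} kHz")
```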
Procedia PDF Downloads 550
2289 Phonological Encoding and Working Memory in Kannada Speaking Adults Who Stutter
Authors: Nirmal Sugathan, Santosh Maruthy
Abstract:
Background: A considerable number of studies have shown that phonological encoding (PE) and working memory (WM) skills operate differently in adults who stutter (AWS). In order to tap these skills, several paradigms have been employed, such as phonological priming, phoneme monitoring, and nonword repetition tasks. This study, however, utilizes a word jumble paradigm to assess both PE and WM using different modalities, which may give a better understanding of phonological processing deficits in AWS. Aim: The present study investigated PE and WM abilities in conjunction with lexical access in AWS using jumbled words. The study also aimed at investigating the effect of an increase in cognitive load on phonological processing in AWS by comparing speech reaction time (SRT) and accuracy scores across various syllable lengths. Method: Participants were 11 AWS (age range 19-26) and 11 adults who do not stutter (AWNS) (age range 19-26) matched for age, gender and handedness. Stimuli: Ninety 3-, 4-, and 5-syllable jumbled words (JWs) (n=30 per syllable length category) constructed from Kannada words served as stimuli for the jumbled word paradigm. In order to generate the jumbled words (JWs), the syllables in the real words were randomly transposed. Procedures: To assess PE, the JWs were presented visually using DMDX software, and for the WM task, JWs were presented auditorily through headphones. The participants were asked to silently manipulate the jumbled words to form a Kannada real word and respond verbally. The responses for both tasks were audio recorded using the record function in DMDX software, and the recorded responses were analyzed using PRAAT software to calculate the SRT. Results: SRT: Mann-Whitney test results demonstrated that AWS performed significantly slower on both tasks (p < 0.001), as indicated by increased SRT. Also, AWS presented with increased SRT on both tasks in all syllable length conditions (p < 0.001). Effect of syllable length: The Wilcoxon signed rank test revealed that, on the task assessing PE, the SRTs of 4-syllable JWs were significantly higher than those of 3-syllable words in both AWS (Z = -2.93, p = .003) and AWNS (Z = -2.41, p = .003). However, the findings for 4- and 5-syllable words were not significant. Task accuracy: The accuracy scores were calculated for the three syllable length conditions for both the PE and WM tasks and were compared across the groups using the Mann-Whitney test. The results indicated that the accuracy scores of AWS were significantly below those of AWNS in all three syllable conditions for both tasks (p < 0.001). Conclusion: The above findings suggest that PE and WM skills are compromised in AWS, as indicated by increased SRT. Also, AWS were progressively less accurate in descrambling JWs of increasing syllable length, which may be interpreted as indicating that, rather than existing as a uniform deficiency, PE and WM deficits emerge when the cognitive load is increased. AWNS exhibited increased SRT and increased accuracy for JWs of longer syllable length, whereas AWS did not benefit from the increased reaction time; thus AWS had to compromise on both SRT and accuracy while solving JWs of longer syllable length.
Keywords: adults who stutter, phonological ability, working memory, encoding, jumbled words
Procedia PDF Downloads 240
2288 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images within classical image processing and machine vision frameworks have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has been adapted to the statistics of real-world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. Also, we investigate the response of the feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the selectivity of the feature detectors to phase in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
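As a rough illustration of the ICA side of this kind of analysis, the sketch below learns ICA components from random patches of a natural image using scikit-learn's FastICA and inspects a simple higher-order statistic of the activations. The image, patch size, and number of components are placeholders; the paper's actual ICA/ISA training and grating experiments are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_extraction.image import extract_patches_2d
from skimage import data

image = data.camera().astype(float)              # stand-in natural image
patches = extract_patches_2d(image, (8, 8), max_patches=20000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)               # remove each patch's DC component

ica = FastICA(n_components=49, random_state=0, max_iter=500)
sources = ica.fit_transform(X)                   # component activations per patch
filters = ica.components_.reshape(-1, 8, 8)      # Gabor-like feature detectors

# Excess kurtosis of the activations: residual higher-order structure in the components
kurtosis = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
print(filters.shape, kurtosis[:5])
```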
Procedia PDF Downloads 339
2287 Effect of Hot Equal Channel Angular Pressing Process on Mechanical Properties of Commercial Pure Titanium
Authors: Seyed Ata Khalkhkali Sharifi, Gholamhossein Majzoubi, Farhad Abroush
Abstract:
The development of the mechanical properties of pure titanium by means of the ECAP process is reviewed in this paper. In the first step, the experimental samples were prepared as specified in the standards. Then pure grade 2 Ti was processed via equal-channel angular pressing (ECAP) for 2 passes following route A at 400°C. After processing, the microstructural evolution, tensile, fatigue and hardness properties, and wear behavior were investigated. Finally, the effect of the ECAP process on these samples was analyzed. The results showed an improvement in strength values with a slight decrease in ductility. The analysis of 30 points within the sample showed a hardness increase with each pass. It was also concluded that the fatigue properties increased.
Keywords: equal-channel angular pressing, titanium, mechanical behavior, engineering materials and applications
Procedia PDF Downloads 258
2286 CookIT: A Web Portal for the Preservation and Dissemination of Traditional Italian Recipes
Authors: M. T. Artese, G. Ciocca, I. Gagliardi
Abstract:
Food is a social and cultural aspect of every individual's life. Food products, processing, and traditions have been identified as cultural objects carrying the history and identity of social groups. Traditional recipes are passed down from one generation to the next, often strengthening the link with the territory. The paper presents CookIT, a web portal developed to collect Italian traditional recipes related to regional cuisine, with the purpose of disseminating knowledge of typical Italian recipes and the Mediterranean diet, which is a significant part of Italian cuisine. The designed system is complemented by multimodal means of browsing and data retrieval. Stored recipes can be retrieved by integrating and combining a number of different methods and keys, while the results are displayed using classical styles, such as list and mosaic, as well as maps and graphs, with which users can interact using the available keys.
Keywords: collaborative portal, Italian cuisine, intangible cultural heritage, traditional recipes, searching and browsing
Procedia PDF Downloads 149
2285 The Effect of Precipitation on Weed Infestation of Spring Barley under Different Tillage Conditions
Authors: J. Winkler, S. Chovancová
Abstract:
The article deals with the relation between rainfall in selected months and subsequent weed infestation of spring barley. The field experiment was performed at the Mendel University agricultural enterprise in Žabčice, Czech Republic. Weed infestation was measured in spring barley vegetation in the years 2004 to 2012. Barley was grown under three tillage variants: conventional tillage technology (CT), minimization tillage technology (MT), and no tillage (NT). Precipitation was recorded at one-day intervals. Monthly precipitation was calculated from the measured values for the months of October through April. The technique of canonical correspondence analysis was applied for further statistical processing. A total of 41 different weed species were found in the course of the 9-year monitoring period. The results clearly show that precipitation affects the incidence of most weed species in the selected months, but acts differently in the monitored variants of tillage technologies.
Keywords: weeds, precipitation, tillage, weed infestation forecast
Procedia PDF Downloads 498
2284 Rule-Based Expert System for Headache Diagnosis and Medication Recommendation
Authors: Noura Al-Ajmi, Mohammed A. Almulla
Abstract:
With the increased utilization of technology devices around the world, healthcare and medical diagnosis are critical issues that people worry about these days. Doctors do their best to avoid medical errors when diagnosing diseases and to avoid prescribing the wrong medication. Consequently, artificial intelligence applications that can be installed on mobile devices, such as rule-based expert systems, facilitate the task of assisting doctors in several ways. Due to their many advantages, the usage of expert systems has recently increased in the health sciences. This work presents a backward rule-based expert system that can be used for headache diagnosis and medication recommendation. The structure of the system consists of three main modules, namely the input unit, the processing unit, and the output unit.
Keywords: headache diagnosis system, prescription recommender system, expert system, backward rule-based system
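The abstract names a backward rule-based structure without giving the rules, so the toy sketch below shows backward chaining over a few made-up headache rules purely to illustrate the mechanism; it is not the paper's knowledge base, which a clinician would need to define.

```python
# Hypothetical rules: conclusion <- alternative lists of required facts
RULES = {
    "migraine": [["unilateral_pain", "nausea"], ["unilateral_pain", "light_sensitivity"]],
    "tension_headache": [["bilateral_pain", "stress"]],
    "recommend_triptan": [["migraine"]],
    "recommend_rest_hydration": [["tension_headache"]],
}

def prove(goal: str, facts: set) -> bool:
    """Backward chaining: try to prove `goal` from known facts and the rule base."""
    if goal in facts:
        return True
    for conditions in RULES.get(goal, []):
        if all(prove(c, facts) for c in conditions):
            facts.add(goal)          # cache the derived conclusion
            return True
    return False

patient_facts = {"unilateral_pain", "nausea"}
for recommendation in ("recommend_triptan", "recommend_rest_hydration"):
    if prove(recommendation, set(patient_facts)):
        print("Suggested:", recommendation)
```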
Procedia PDF Downloads 215
2283 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
At the level of the National Statistical Institutes, there is a large volume of data which is generally in a format that conditions the method of publication of the information it contains. Each household or business data collection project includes a dissemination platform for its implementation. Thus, the dissemination methods previously used do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to be able to publish all these data on a single platform and offer the option of linking them with other external data sources. An application of the approach will be made to data from major national surveys, such as those on employment, poverty and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistic
Procedia PDF Downloads 174
2282 Understanding New Zealand’s 19th Century Timber Churches: Techniques in Extracting and Applying Underlying Procedural Rules
Authors: Samuel McLennan, Tane Moleta, Andre Brown, Marc Aurel Schnabel
Abstract:
The development of ecclesiastical buildings within New Zealand has produced some unique design characteristics that take influence from both international styles and local building methods. This research looks at how procedural modelling can be used to define such common characteristics and understand how they are shared and developed within different examples of a similar architectural style. This will be achieved through the creation of procedural digital reconstructions of the various timber Gothic churches built during the 19th century in the city of Wellington, New Zealand. 'Procedural modelling' is a digital modelling technique that has been growing in popularity, particularly within the game and film industry, as well as other fields such as industrial design and architecture. Such a design method entails the creation of a parametric 'ruleset' that can be easily adjusted to produce many variations of geometry, rather than the single geometry typically found in traditional CAD software. Key precedents within this area of digital heritage include work by Haegler, Müller, and Gool, Nicholas Webb and Andre Brown, and most notably Mark Burry. What these precedents all share is that the forms of the reconstructed architecture have been generated using computational rules and an understanding of the architects' geometric reasoning. This is also true within this research, as Gothic architecture makes use of only a select range of forms (such as the pointed arch) that can be accurately replicated using the same standard geometric techniques originally used by the architect. The methodology of this research involves firstly establishing a sample group of similar buildings, documenting the existing samples, researching any lost samples to find evidence such as architectural plans, photos, and written descriptions, and then culminating all the findings into a single 3D procedural asset within the software 'Houdini'. The end result will be an adjustable digital model that contains all the architectural components of the sample group, such as the various naves, buttresses, and windows. These components can then be selected and arranged to create visualisations of the sample group. Because timber Gothic churches in New Zealand share many details between designs, the created collection of architectural components can also be used to approximate similar designs not included in the sample group, such as designs found beyond the Wellington region. This creates an initial library of architectural components that can be further expanded to encapsulate as wide a sample size as desired. Such a methodology greatly improves upon the efficiency and adjustability of digital modelling compared to current practices found in digital heritage reconstruction. It also gives greater accuracy to speculative design, as lost structures for which evidence is lacking can be approximated using components from still existing or better-documented examples. This research will also bring attention to the cultural significance these types of buildings have within the local area, addressing the public's general unawareness of architectural history that is identified in the Wellington-based research 'Moving Images in Digital Heritage' by Serdar Aydin et al.
Keywords: digital forensics, digital heritage, gothic architecture, Houdini, procedural modelling
Procedia PDF Downloads 131
2281 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested powerful tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often sufficient to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulation of multi-grain thin films. The work addresses the inhomogeneity in one dimension, but can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junction between two adjacent grains. The effects of the value of the grain boundary barrier, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
Procedia PDF Downloads 214
2280 Processing of Flexible Dielectric Nanocomposites Using Nanocellulose and Recycled Alum Sludge for Wearable Technology Applications
Authors: D. Sun, L. Saw, A. Onyianta, D. O’Rourke, Z. Lu, C. See, C. Wilson, C. Popescu, M. Dorris
Abstract:
With the rapid development of wearable technology (e.g., smartwatches, activity trackers and health monitoring devices), flexible dielectric materials with environmentally friendly, low-cost and high-energy-efficiency characteristics are in increasing demand. In this work, a flexible dielectric nanocomposite was processed by incorporating two components, cellulose nanofibrils and alum sludge, in a polymer matrix. The two components were used as the reinforcement phase as well as for enhancing the dielectric properties; they were processed from waste materials that would otherwise be disposed of in landfills. Alum sludge is a by-product of the water treatment process in which aluminum sulfate is prevalently used as the primary coagulant. According to data from a project partner, Scottish Water, approximately 10,000 tons of alum sludge are generated every year in Scotland as a waste from water treatment works and sent to landfill. The industry has been facing escalating financial and environmental pressure to develop more sustainable strategies to deal with alum sludge wastes. In the available literature, some work on reusing alum sludge has been reported (e.g., aluminum recovery or agriculture and land reclamation). However, little work can be found on applying it to the processing of energy materials (e.g., dielectrics) for enhanced energy density and efficiency. The alum sludge was collected directly from a water treatment plant of Scottish Water and heat-treated and refined before being used in preparing the composites. Cellulose nanofibrils were derived from water hyacinth, an invasive aquatic weed that causes significant ecological issues in tropical regions. The harvested water hyacinth was dried and processed using a cost-effective method, including a chemical extraction followed by a homogenization process, in order to extract the cellulose nanofibrils. The biodegradable elastomer polydimethylsiloxane (PDMS) was used as the polymer matrix, and the nanocomposites were processed by casting the raw materials in Petri dishes. The processed composites were characterized using various methods, including scanning electron microscopy (SEM), rheological analysis, thermogravimetric analysis and X-ray diffraction (XRD) analysis. The SEM results showed that cellulose nanofibrils of approximately 20 nm in diameter and 100 nm in length were obtained, and that the alum sludge particles were approximately 200 µm in diameter. The TGA/DSC analysis showed a weight loss of up to 48% in the raw alum sludge and that its crystallization process started at approximately 800°C. This observation coincides with the XRD result. Other experiments also showed that the composites exhibit comprehensive mechanical and dielectric performance. This work demonstrates a sustainable practice of reusing such waste materials in preparing flexible, lightweight and miniature dielectric materials for wearable technology applications.
Keywords: cellulose, biodegradable, sustainable, alum sludge, nanocomposite, wearable technology, dielectric
Procedia PDF Downloads 85
2279 Comparison of Different Extraction Methods for the Determination of Polyphenols
Authors: Senem Suna
Abstract:
The extraction of bioactive compounds from foods and food products is an important topic and a new trend related to health-promoting effects. As a result of the increasing interest in natural foods, different methods are used for the acquisition of these components, especially polyphenols. However, special attention has to be paid to the selection of proper techniques or processing technologies (supercritical fluid extraction, microwave-assisted extraction, ultrasound-assisted extraction, powdered extract production) for each kind of food in order to obtain maximum benefit as well as the phenolic compounds themselves. In order to meet consumers' demand for healthy food and the management of quality and safety requirements, advanced research and development are needed. In this review, the advantages and disadvantages of different extraction methods, their potential uses in the food industry and the effects of polyphenols are discussed in detail. Consequently, by evaluating the results of several studies, the selection of the most suitable food-specific method was the aim.
Keywords: bioactives, extraction, powdered extracts, supercritical fluid extraction
Procedia PDF Downloads 239
2278 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. In this sense, the quality of the product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors that are likely associated with freshness in animal source foods are temperature and the processing, storage, and transport times. However, the level of deterioration of products depends, in turn, on the characteristics of the bacterial population causing the decomposition or spoiling, such as pH level and toxins. Knowing the growth dynamics of the agents involved in product contamination allows more efficient monitoring of processing. This means better quality and reasonable costs, along with a better estimation of the time and temperature intervals necessary for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that allows measuring the impact of temperature on bacterial growth and of the competition for pH adequacy and the release of bacteriocins, in order to describe this phenomenon and, thus, estimate food product half-life with the least possible risk of deterioration or spoiling. In order to achieve this objective, the authors propose the analysis of a three-dimensional ordinary differential system which includes: logistic bacterial growth extended by the inhibitory action of bacteriocins, including the effect of the medium pH; change in the medium pH levels through an adaptation of the Luedeking-Piret kinetic model; and bacteriocin concentration modeled similarly to the pH levels. These three dimensions are influenced by the temperature at all times. This differential system is then expanded, taking into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent to the modelling, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results indicate that temperature changes in an early stage of transport increased the bacterial population significantly more than if the temperature had increased during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term since, although the population of bacteria decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins at low temperatures decreased slightly, which is consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal-based products, along with their properties, in various transport and storage situations, which led us to state that the optimum for inhibiting bacterial growth is a combination of constant low temperatures and the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
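The abstract names the model ingredients but not the exact equations; the sketch below is one plausible reading of that structure, assuming a logistic growth term inhibited by bacteriocin concentration, Luedeking-Piret-type equations for the pH drop and for bacteriocin production, a crude linear temperature factor on the growth rate, and entirely illustrative parameter values.

```python
from scipy.integrate import solve_ivp

# Illustrative parameters (not taken from the paper)
MU_REF, T_REF = 0.8, 20.0            # reference specific growth rate (1/h) at 20 degC
N_MAX = 1e9                          # carrying capacity (CFU/mL)
K_B = 0.05                           # inhibition by bacteriocin
ALPHA, BETA = 2e-10, 1e-12           # Luedeking-Piret coefficients for pH change
A_B, B_B, D_B = 1e-9, 5e-11, 0.02    # bacteriocin production and decay

def temperature(t):                  # storage/transport temperature profile (degC)
    return 4.0 if t < 24 else 12.0   # e.g. cold storage, then warmer transport

def rhs(t, y):
    N, pH, B = y
    mu = MU_REF * max(temperature(t), 0.0) / T_REF      # crude temperature effect
    growth = mu * N * (1 - N / N_MAX) - K_B * B * N     # logistic + bacteriocin inhibition
    dpH = -(ALPHA * growth + BETA * N)                  # Luedeking-Piret-type pH drop
    dB = A_B * growth + B_B * N - D_B * B               # bacteriocin production and decay
    return [growth, dpH, dB]

sol = solve_ivp(rhs, (0, 72), y0=[1e3, 6.5, 0.0], max_step=0.1)
print(f"final count: {sol.y[0, -1]:.2e} CFU/mL, final pH: {sol.y[1, -1]:.2f}")
```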
Procedia PDF Downloads 135
2277 Continuous-Time and Discrete-Time Singular Value Decomposition of an Impulse Response Function
Authors: Rogelio Luck, Yucheng Liu
Abstract:
This paper proposes the continuous-time singular value decomposition (SVD) of the impulse response function, a special kind of Green's function e⁻⁽ᵗ⁻ᵀ⁾, in order to find a set of singular functions and singular values such that the convolutions of this function with the singular functions on a specified domain are the solutions to the inhomogeneous differential equations for those singular functions. A numerical example is given to verify the proposed method. Besides the continuous-time SVD, a discrete-time SVD is also presented for the impulse response function, which is modeled using a Toeplitz matrix in the discrete system. The proposed method has broad applications in signal processing, dynamic system analysis, acoustic analysis, thermal analysis, as well as macroeconomic modeling.
Keywords: singular value decomposition, impulse response function, Green's function, Toeplitz matrix, Hankel matrix
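A minimal sketch of the discrete-time side of this idea: sample the causal exponential kernel e⁻⁽ᵗ⁻ᵀ⁾, arrange it as a lower-triangular Toeplitz convolution matrix, and take its SVD. The sampling interval, time span, and test input are placeholders, and the continuous-time construction in the paper is not reproduced here.

```python
import numpy as np
from scipy.linalg import toeplitz

dt, t_end = 0.01, 5.0
t = np.arange(0.0, t_end, dt)
h = np.exp(-t) * dt                       # sampled causal kernel e^{-(t - T)} with T = 0

# Lower-triangular Toeplitz matrix so that y = H @ u approximates the convolution integral
H = toeplitz(h, np.zeros_like(h))

U, s, Vt = np.linalg.svd(H)
print("largest singular values:", s[:5])

# Consistency check: convolution via the Toeplitz matrix vs. direct discrete convolution
u = np.sin(2 * np.pi * t)
y_matrix = H @ u
y_direct = np.convolve(h, u)[: len(t)]
print("max difference:", np.max(np.abs(y_matrix - y_direct)))
```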
Procedia PDF Downloads 156
2276 Statistical Physics Model of Seismic Activation Preceding a Major Earthquake
Authors: Daniel S. Brox
Abstract:
Starting from earthquake fault dynamic equations, a correspondence is derived between earthquake occurrence statistics in a seismic region before a major earthquake and the eigenvalue statistics of a differential operator whose bound state eigenfunctions characterize the distribution of stress in the seismic region. By modeling these eigenvalue statistics with a 2D Coulomb gas statistical physics model, the previously reported deviation of seismic activation earthquake occurrence statistics from Gutenberg-Richter statistics in time intervals preceding the major earthquake is derived. It is also explained how statistical physics modeling predicts a finite-dimensional nonlinear dynamic system that describes real-time velocity model evolution in the region undergoing seismic activation, and how this prediction can be tested experimentally.
Keywords: seismic activation, statistical physics, geodynamics, signal processing
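For context, the Gutenberg-Richter statistics referred to above relate earthquake counts to magnitude as log₁₀ N(≥M) = a − bM. The sketch below fits the b-value of a synthetic catalogue with the standard Aki maximum-likelihood estimator; it illustrates the baseline distribution that activation statistics are said to deviate from, not the paper's Coulomb gas model, and the catalogue and completeness magnitude are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
b_true, m_c = 1.0, 2.0                       # true b-value and completeness magnitude
# Under Gutenberg-Richter, magnitudes above m_c are exponentially distributed
mags = m_c + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)

# Aki (1965) maximum-likelihood estimate of the b-value
b_hat = np.log10(np.e) / (mags.mean() - m_c)
print(f"estimated b-value: {b_hat:.3f}")

# Frequency-magnitude curve: cumulative counts N(>=M), and the fitted intercept a
edges = np.arange(m_c, mags.max(), 0.1)
counts = np.array([(mags >= m).sum() for m in edges])
a_hat = np.log10(counts[0]) + b_hat * m_c    # so that log10 N(>=M) = a - b*M
print(f"log10 N(>=3.0) predicted: {a_hat - b_hat * 3.0:.2f}, "
      f"observed: {np.log10((mags >= 3.0).sum()):.2f}")
```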
Procedia PDF Downloads 17
2274 Contrast Enhancement of Masses in Mammograms Using Multiscale Morphology
Authors: Amit Kamra, V. K. Jain, Pragya
Abstract:
Mammography is a widely used technique for breast cancer screening. There are various other techniques for breast cancer screening, but mammography is the most reliable and effective one. The images obtained through mammography are of low contrast, which makes them difficult for radiologists to interpret. Hence, a high quality image is mandatory for processing the image and extracting any kind of information from it. Many contrast enhancement algorithms have been developed over the years. In the present work, an efficient morphology-based technique is proposed for the contrast enhancement of masses in mammographic images. The proposed method is based on multiscale morphology and takes into consideration the scale of the structuring element. The proposed method is compared with other state-of-the-art techniques. The experimental results show that the proposed method is better, both qualitatively and quantitatively, than the other standard contrast enhancement techniques.
Keywords: enhancement, mammography, multi-scale, mathematical morphology
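The abstract does not spell out its enhancement rule, but a common multiscale-morphology scheme adds bright (white top-hat) detail and subtracts dark (black top-hat) detail accumulated over structuring elements of increasing scale. The sketch below implements that generic scheme with OpenCV as an assumed stand-in, not the authors' exact formulation; the scales and file name are placeholders.

```python
import cv2
import numpy as np

def multiscale_tophat_enhance(gray: np.ndarray, scales=(3, 7, 11, 15)) -> np.ndarray:
    """Enhance local contrast by accumulating top-hat detail at several scales."""
    img = gray.astype(np.float32)
    bright = np.zeros_like(img)   # mass-like bright detail
    dark = np.zeros_like(img)     # dark background detail
    for k in scales:
        se = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k, k))
        bright = np.maximum(bright, cv2.morphologyEx(img, cv2.MORPH_TOPHAT, se))
        dark = np.maximum(dark, cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, se))
    enhanced = np.clip(img + bright - dark, 0, 255)
    return enhanced.astype(np.uint8)

# Usage with a placeholder file name
mammogram = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)
if mammogram is not None:
    cv2.imwrite("mammogram_enhanced.png", multiscale_tophat_enhance(mammogram))
```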
Procedia PDF Downloads 423
2274 Searching Linguistic Synonyms through Parts of Speech Tagging
Authors: Faiza Hussain, Usman Qamar
Abstract:
Synonym-based searching is recognized to be a complicated problem, as text mining from the unstructured data of the web is challenging. Finding useful information which matches a user's needs from a bulk of web pages is a cumbersome task. In this paper, a novel and practical synonym retrieval technique is proposed for addressing this problem. For the replacement of semantics, user intent is taken into consideration in realizing the technique. Parts-of-speech tagging is applied for pattern generation from the query, and a thesaurus was formed and used for this experiment. In comparison with non-context-based searching, context-based searching proved to be a more efficient approach when dealing with linguistic semantics. This approach is very beneficial for intent-based searching. Finally, results and future dimensions are presented.
Keywords: natural language processing, text mining, information retrieval, parts-of-speech tagging, grammar, semantics
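As a rough illustration of the part-of-speech step, the sketch below tags a query with NLTK and looks up synonyms in WordNet restricted to the tagged part of speech; WordNet is assumed here only as a stand-in for the custom thesaurus the authors built, and the NLTK resource names may vary between library versions.

```python
import nltk
from nltk.corpus import wordnet as wn

# One-time downloads (resource names assumed for a typical NLTK install)
for pkg in ("punkt", "averaged_perceptron_tagger", "wordnet"):
    nltk.download(pkg, quiet=True)

PENN_TO_WN = {"N": wn.NOUN, "V": wn.VERB, "J": wn.ADJ, "R": wn.ADV}

def synonyms_for_query(query: str) -> dict:
    """Return {word: synonyms}, using the POS tag to constrain the WordNet lookup."""
    tagged = nltk.pos_tag(nltk.word_tokenize(query))
    result = {}
    for word, tag in tagged:
        wn_pos = PENN_TO_WN.get(tag[0])
        if wn_pos is None:
            continue
        lemmas = {l.name().replace("_", " ")
                  for s in wn.synsets(word, pos=wn_pos) for l in s.lemmas()}
        result[word] = sorted(lemmas - {word})
    return result

print(synonyms_for_query("buy a cheap ticket quickly"))
```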
Procedia PDF Downloads 307
2273 Mathematical Modeling of Carotenoids and Polyphenols Content of Faba Beans (Vicia faba L.) during Microwave Treatments
Authors: Ridha Fethi Mechlouch, Ahlem Ayadi, Ammar Ben Brahim
Abstract:
Given the importance of preserving polyphenols and carotenoids during thermal processing, in this study we investigated the variation of these two parameters in faba beans during microwave treatment using different power densities (1, 2, and 3 W/g), and then performed mathematical modeling using non-linear regression analysis to evaluate the model constants. The models were tested against the measured variation of the carotenoid and polyphenol ratios of faba beans to validate the experimental results. Exponential models were found to be suitable for describing the variation of the carotenoid ratio (R² = 0.945, 0.927 and 0.946 for power densities of 1, 2, and 3 W/g, respectively) and of the polyphenol ratio (R² = 0.931, 0.989 and 0.982 for power densities of 1, 2, and 3 W/g, respectively). The effect of the microwave power density Pd (W/g) on the coefficient k of the models was also investigated. The coefficient is highly correlated (R² = 1) and can be expressed as a polynomial function.
Keywords: microwave treatment, power density, carotenoid, polyphenol, modeling
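The abstract states that the models are exponential, with a coefficient k that depends polynomially on power density, without giving the exact equations. A minimal curve-fitting sketch of that structure, with entirely made-up data points and an assumed first-order decay form, might look as follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_model(t, k):
    # retention ratio C(t)/C0 assumed to decay exponentially with treatment time
    return np.exp(-k * t)

t = np.array([0, 2, 4, 6, 8, 10])                  # treatment time (min), illustrative
ratios = {                                         # made-up carotenoid ratios per power density
    1.0: np.array([1.00, 0.90, 0.80, 0.73, 0.66, 0.60]),
    2.0: np.array([1.00, 0.84, 0.70, 0.60, 0.50, 0.43]),
    3.0: np.array([1.00, 0.78, 0.60, 0.47, 0.37, 0.29]),
}

power_densities, ks = [], []
for pd_value, y in ratios.items():
    (k_fit,), _ = curve_fit(exp_model, t, y, p0=[0.1])
    power_densities.append(pd_value)
    ks.append(k_fit)
    print(f"Pd = {pd_value} W/g -> k = {k_fit:.4f} 1/min")

# k expressed as a polynomial function of power density (second order here)
poly = np.polyfit(power_densities, ks, 2)
print("k(Pd) polynomial coefficients:", poly)
```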
Procedia PDF Downloads 259
2272 Development of Ferrous-Aluminum Alloys from Recyclable Material by High Energy Milling
Authors: Arnold S. Freitas Neto, Rodrigo E. Coelho, Erick S. Mendonça
Abstract:
This study aimed to obtain an alloy of iron and aluminum with an atomic proportion of 50% for each constituent. The alloys were obtained by processing recycled aluminum and chips of 1200 series carbon steel in a high-energy mill. For the experiment, the raw materials were processed through high-energy milling before mixing the substances. Subsequently, the mixture of 1200 series carbon steel and aluminum powder was subjected to a further milling process. Thereafter, hot compression was performed in a closed die in order to obtain the samples. The pieces underwent heat treatments, sintering and aging. Lastly, the composition and hardness of the samples were analyzed. In this paper, the results are compared with previous studies, which used high-purity iron powder instead of carbon steel in the composition.
Keywords: Fe-Al alloys, high energy milling, metallography characterization, powder metallurgy
Procedia PDF Downloads 309
2271 Hit-Or-Miss Transform as a Tool for Similar Shape Detection
Authors: Osama Mohamed Elrajubi, Idris El-Feghi, Mohamed Abu Baker Saghayer
Abstract:
This paper describes the identification of specific shapes within binary images using the morphological Hit-or-Miss Transform (HMT). The Hit-or-Miss transform is a general binary morphological operation that can be used to search for particular patterns of foreground and background pixels in an image. It is in fact a basic operation of binary morphology, since almost all other binary morphological operators are derived from it. The input of this method is a binary image and a structuring element (a template to be searched for in the binary image), while the output is another binary image. In this paper, a modification of the Hit-or-Miss transform is proposed in which the accuracy of the algorithm is adjusted according to the similarity between the template and the sought shape. The implementation of this method has been done in the C language. The algorithm has been tested on several images, and the results have shown that this new method can be used for similar shape detection.
Keywords: hit-or-miss operator transform, HMT, binary morphological operation, shape detection, binary images processing
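The standard (exact-match) Hit-or-Miss transform that the paper builds on can be sketched in a few lines; the example below uses SciPy and a small hypothetical pattern rather than the authors' C implementation or their similarity-tolerant modification.

```python
import numpy as np
from scipy.ndimage import binary_hit_or_miss

image = np.zeros((10, 10), dtype=bool)
image[2, 2] = True            # isolated foreground pixel (the pattern we look for)
image[5:8, 5:8] = True        # a 3x3 block that should NOT match

# structure1 marks required foreground pixels, structure2 required background pixels
hit = np.array([[0, 0, 0],
                [0, 1, 0],
                [0, 0, 0]], dtype=bool)
miss = np.array([[1, 1, 1],
                 [1, 0, 1],
                 [1, 1, 1]], dtype=bool)

matches = binary_hit_or_miss(image, structure1=hit, structure2=miss)
print(np.argwhere(matches))   # -> [[2 2]]: only the isolated pixel is detected
```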
Procedia PDF Downloads 332
2270 A Custom Convolutional Neural Network with Hue, Saturation, Value Color for Malaria Classification
Authors: Ghazala Hcini, Imen Jdey, Hela Ltifi
Abstract:
Malaria should be considered and handled as a potential medical catastrophe. One of the most challenging tasks in the field of microscopy image processing arises from differences in test design and the vulnerability of cell classification. In this article, we focused on applying deep learning to classify patients by identifying images of infected and uninfected cells. We performed multiple experiments, including a classification approach using the Hue, Saturation, Value (HSV) color space. HSV is used because of its superior ability to represent image brightness; finally, for classification, a convolutional neural network (CNN) architecture is created. Focus clusters were used to deliver the classification, the selected features were constrained, and several additional noise types were included in the data. The suggested method has a precision of 99.79%, a recall value of 99.55%, and provides 99.96% accuracy.
Keywords: deep learning, convolutional neural network, image classification, color transformation, HSV color, malaria diagnosis, malaria cells images
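A minimal sketch of the pipeline the abstract describes: convert cell images to HSV and train a small CNN binary classifier. The layer sizes, input resolution, and loading convention are assumptions for illustration, not the architecture or results reported in the paper.

```python
import cv2
import numpy as np
import tensorflow as tf

def load_hsv(path: str, size=(64, 64)) -> np.ndarray:
    """Read an image, resize it, and convert it to normalized HSV channels."""
    bgr = cv2.imread(path)
    hsv = cv2.cvtColor(cv2.resize(bgr, size), cv2.COLOR_BGR2HSV)
    return hsv.astype(np.float32) / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),         # H, S, V channels
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # infected vs. uninfected
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])
model.summary()
# model.fit(X_train, y_train, ...) would follow once HSV images and labels are loaded
```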
Procedia PDF Downloads 88
2269 Human Posture Estimation Based on Multiple Viewpoints
Authors: Jiahe Liu, HongyangYu, Feng Qian, Miao Luo
Abstract:
This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information to obtain a set of high-confidence human key points. These were used as the input to a Spatio-Temporal Graph Convolutional Network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and feeding the result into the spatio-temporal graph convolution model, this study provides an effective method for improving the accuracy of human posture estimation and offers strong support for further research and application in related fields.
Keywords: multi-view, pose estimation, ST-GCN, joint fusion
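Neither the fusion weights nor the tensor layout are given in the abstract; the sketch below shows one simple way to combine per-view keypoint estimates by confidence and pack the result into the (N, C, T, V, M) layout commonly used for ST-GCN inputs. The shapes, the weighting rule, and the random data are assumptions, not the MvP algorithm itself.

```python
import numpy as np

def fuse_views(keypoints: np.ndarray, scores: np.ndarray) -> np.ndarray:
    """Confidence-weighted fusion of per-view keypoints.

    keypoints: (views, frames, joints, 3) estimated 3D joint positions per camera view
    scores:    (views, frames, joints)    per-joint confidence in each view
    returns:   (frames, joints, 3)        fused skeleton sequence
    """
    w = scores / np.clip(scores.sum(axis=0, keepdims=True), 1e-6, None)
    return (keypoints * w[..., None]).sum(axis=0)

rng = np.random.default_rng(0)
views, frames, joints = 4, 300, 17
kps = rng.normal(size=(views, frames, joints, 3))
conf = rng.uniform(0.2, 1.0, size=(views, frames, joints))

fused = fuse_views(kps, conf)                         # (T, V, 3)

# Pack into the (N, C, T, V, M) layout: batch, channels, frames, joints, persons
stgcn_input = fused.transpose(2, 0, 1)[None, ..., None]
print(stgcn_input.shape)                              # (1, 3, 300, 17, 1)
```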
Procedia PDF Downloads 70
2268 Computational Analysis on Thermal Performance of Chip Package in Electro-Optical Device
Authors: Long Kim Vu
Abstract:
The central processing unit in electro-optical devices is a field-programmable gate array (FPGA) chip package that allows flexible, reconfigurable computing, but at the cost of energy consumption. Because the chip package is placed in sealed devices built to the IP67 waterproof standard, there is no air circulation and heat dissipation is a challenge. In this paper, the author successfully modeled a chip package with various interposer materials such as silicon, glass and organics. Computational fluid dynamics (CFD) was utilized to analyze the thermal performance of the chip package while considering the comprehensive heat transfer modes of conduction, convection and radiation, yielding an equivalent heat dissipation model. The logic chip temperature varying with time is compared between the simulation and experimental results, showing excellent correlation and proving the soundness of the chip modeling and simulation method.
Keywords: CFD, FPGA, heat transfer, thermal analysis
Procedia PDF Downloads 184
2267 Post Growth Annealing Effect on Deep Level Emission and Raman Spectra of Hydrothermally Grown ZnO Nanorods Assisted by KMnO4
Authors: Ashish Kumar, Tejendra Dixit, I. A. Palani, Vipul Singh
Abstract:
Zinc oxide, with its interesting properties such as a large band gap (3.37 eV), high exciton binding energy (60 meV) and intense UV absorption, has been studied in the literature for various applications, viz. optoelectronics, biosensors, UV photodetectors, etc. The performance of ZnO devices is highly influenced by the morphology, size and crystallinity of the ZnO active layer and by the processing conditions. Recently, our group has shown the influence of the in situ addition of KMnO4 to the precursor solution during the hydrothermal growth of ZnO nanorods (NRs) on their near band edge (NBE) emission. In this paper, we have investigated the effect of post-growth annealing on the variations in the NBE and deep level (DL) emissions of the as-grown ZnO nanorods. The observed results have been explained on the basis of X-ray diffraction (XRD) and Raman spectroscopic analysis, which clearly show improved crystallinity and quantum confinement in the ZnO nanorods.
Keywords: ZnO, nanorods, hydrothermal, KMnO4
Procedia PDF Downloads 400