Search results for: extrusion processing
2293 A Highly Accurate Computer-Aided Diagnosis (CAD) System for the Diagnosis of Breast Cancer by Using Thermographic Analysis
Authors: Mahdi Bazarganigilani
Abstract:
Computer-aided diagnosis (CAD) systems can play a crucial role in diagnosing serious diseases such as breast cancer at the earliest stage. In this paper, a CAD system for the diagnosis of breast cancer was introduced and evaluated. The system was developed using spatio-temporal analysis of a set of consecutive thermographic images by employing wavelet transformation. From this analysis, a highly accurate machine learning model based on random forest was obtained. The final results showed a promising accuracy of 91% in terms of the F1 measure on sample data from 200 patients. The CAD system was further extended to obtain a detailed analysis of the effect of smaller sub-areas of each breast on the occurrence of cancer.
Keywords: computer-aided diagnosis systems, thermographic analysis, spatio-temporal analysis, image processing, machine learning
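The abstract does not include code, but the pipeline it describes (wavelet features from consecutive thermal frames, a random forest classifier, and F1 scoring) can be sketched as below. Everything here is an illustrative assumption: synthetic frames and labels, a simple wavelet-energy feature, and PyWavelets plus scikit-learn as stand-in libraries, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the described pipeline: wavelet features
# extracted from a sequence of thermographic frames, fed to a random forest,
# and scored with the F1 measure. All data here are synthetic stand-ins.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def wavelet_features(frames, wavelet="db4", level=2):
    """Summarise a (time, height, width) image sequence by the energy of its
    wavelet sub-bands, averaged over time (one possible spatio-temporal feature)."""
    feats = []
    for frame in frames:
        coeffs = pywt.wavedec2(frame, wavelet=wavelet, level=level)
        energies = [np.mean(np.square(coeffs[0]))]
        for detail in coeffs[1:]:
            energies.extend(np.mean(np.square(c)) for c in detail)
        feats.append(energies)
    return np.mean(feats, axis=0)

rng = np.random.default_rng(0)
# Hypothetical dataset: 200 "patients", each with 8 consecutive 64x64 thermographic frames.
X = np.array([wavelet_features(rng.normal(size=(8, 64, 64))) for _ in range(200)])
y = rng.integers(0, 2, size=200)  # placeholder labels (cancer / no cancer)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("F1 =", f1_score(y_te, clf.predict(X_te)))
```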
Procedia PDF Downloads 210
2292 Eco-Nanofiltration Membranes: Nanofiltration Membrane Technology Utilization-Based Fiber Pineapple Leaves Waste as Solutions for Industrial Rubber Liquid Waste Processing and Fertilizer Crisis in Indonesia
Authors: Andi Setiawan, Annisa Ulfah Pristya
Abstract:
The Indonesian rubber plantation area has reached 2.9 million hectares, with production reaching 1.38 million. High rubber productivity is directly proportional to the amount of waste produced by the rubber processing industry, and this waste has a negative impact in the form of environmental pollution when it is not treated optimally. Rubber industrial wastewater contains high levels of nitrogen compounds (nitrate and ammonia) and phosphate compounds, which cause water pollution and odor problems due to the high ammonia content. On the other hand, demand for NPK fertilizers in Indonesia continues to increase from year to year and requires ammonia and phosphate as raw materials. Domestic demand amounts to 400,000 tons of ammonia per year, and Indonesia imports 200,000 tons of ammonia per year valued at IDR 4.2 trillion. Likewise, the shortfall of phosphoric acid means that as much as 225 thousand tons per year must be imported from Jordan, Morocco, South Africa, the Philippines, and India. Until now, rubber wastewater treatment has generally been done by holding the waste in tanks, allowing it to settle, filtering it, and releasing the remainder into the environment. However, this method is inefficient and requires high energy costs because it involves many stages before producing clean water that can be discharged into the river. On the other hand, Indonesia has a large potential of pineapple fruit that can be harvested throughout the year across the country. In 2010, pineapple production in Indonesia reached 1,406,445 tons, or about 9.36 percent of the total fruit production in Indonesia. Increased productivity is directly proportional to the amount of pineapple leaf waste, which is generated continuously and usually just dumped on the ground or disposed of with other waste at the final disposal site. Eco-nanofiltration membranes based on pineapple leaf fiber waste can solve these environmental problems efficiently. Nanofiltration is a pressure-driven process in which transport through the membrane can be either convective or diffusive for each molecule. Nanofiltration membranes can filter water at the nanometer scale so as to separate, from the processed waste, residues of economic value that are richer in N and P and can serve as raw materials for the manufacture of NPK fertilizer to overcome the crisis in Indonesia. The raw material used to manufacture the eco-nanofiltration membrane is cellulose from pineapple fiber, which is processed into cellulose acetate; this material is biodegradable and only requires a membrane change every 6 months. The expected output is a green eco-technology, so that nanofiltration membranes not only treat rubber industry waste in an effective, efficient, and environmentally friendly way but also lower the cost of waste treatment compared to conventional methods.
Keywords: biodegradable, cellulose diacetate, fertilizers, pineapple, rubber
Procedia PDF Downloads 447
2291 Experimental Study on Dehumidification Performance of Supersonic Nozzle
Authors: Esam Jassim
Abstract:
Supersonic nozzles are commonly used to purify natural gas in gas processing technology. As an innovative technology, they are employed to overcome the deficits of traditional methods, drawing on gas dynamics, thermodynamics, and fluid dynamics theory. An indoor test rig was built to study the dehumidification process of a moist fluid; humid air was chosen for the study. The working fluid circulated in an open loop with provision for filtering, metering, and humidifying. A stainless steel supersonic separator was constructed together with the C-D nozzle system. The results show that dehumidification is enhanced as the NPR increases. This is due to the high turbulence intensity caused by the shock formation in the divergent section. Such disturbance strengthens the centrifugal force, pushing more particles toward the near-wall region. In turn, the pressure recovery factor, defined as the ratio of the outlet static pressure of the fluid to its inlet value, decreases with NPR.
Keywords: supersonic nozzle, dehumidification, particle separation, nozzle geometry
Procedia PDF Downloads 339
2290 Reversible and Adaptive Watermarking for MRI Medical Images
Authors: Nisar Ahmed Memon
Abstract:
A new medical image watermarking scheme delivering high embedding capacity is presented in this paper. The Integer Wavelet Transform (IWT), a companding technique, and adaptive thresholding are used in this scheme. The proposed scheme embeds the hidden information, recovers it, and restores the input image to its pristine state at the receiving end. Magnetic Resonance Imaging (MRI) images are used for experimental purposes. The scheme first segments the MRI medical image into non-overlapping blocks and then inserts the watermark into the high-frequency wavelet coefficients of each block. The scheme uses block-based watermarking with iterative optimization of the companding threshold in order to avoid histogram pre- and post-processing. Results show that the proposed scheme performs better than other reversible medical image watermarking schemes available in the literature for MRI medical images.
Keywords: adaptive thresholding, companding technique, data authentication, reversible watermarking
Procedia PDF Downloads 296
2289 Unsupervised Learning with Self-Organizing Maps for Named Entity Recognition in the CONLL2003 Dataset
Authors: Assel Jaxylykova, Alexander Pak
Abstract:
This study utilized a Self-Organizing Map (SOM) for unsupervised learning on the CONLL-2003 dataset for Named Entity Recognition (NER). The process involved encoding words into 300-dimensional vectors using FastText. These vectors were input into a SOM grid, where training adjusted the node weights to minimize distances. The SOM provided a topological representation for identifying and clustering named entities, demonstrating its efficacy without labeled examples. Results showed an F1-measure of 0.86, highlighting the SOM's viability. Although some methods achieve higher F1 measures, the SOM eliminates the need for labeled data, offering a scalable and efficient alternative. The SOM's ability to uncover hidden patterns provides insights that could enhance existing supervised methods. Further investigation into potential limitations and optimization strategies is suggested to maximize benefits.
Keywords: named entity recognition, natural language processing, self-organizing map, CONLL-2003, semantics
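A minimal sketch of the described workflow is given below, assuming MiniSom as the SOM implementation and random vectors as stand-ins for the 300-dimensional FastText embeddings; the authors' actual grid size, training schedule, and clustering step are not specified in the abstract.

```python
# Illustrative sketch (assumptions flagged): word vectors are random stand-ins for the
# FastText embeddings mentioned in the abstract, and MiniSom is used as one common SOM
# implementation; this is not the authors' exact setup.
import numpy as np
from minisom import MiniSom  # pip install minisom

words = ["London", "Monday", "Reuters", "wheat", "Clinton", "tonnes"]
rng = np.random.default_rng(0)
vectors = rng.normal(size=(len(words), 300))   # stand-in for 300-d FastText vectors

som = MiniSom(x=10, y=10, input_len=300, sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(vectors)
som.train_random(vectors, num_iteration=1000)  # adjust node weights to minimise distances

# Each word maps to its best-matching unit; nearby units suggest a shared (entity) cluster.
for word, vec in zip(words, vectors):
    print(word, "->", som.winner(vec))
```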
Procedia PDF Downloads 46
2288 Analysis of EEG Signals Using Wavelet Entropy and Approximate Entropy: A Case Study on Depression Patients
Authors: Subha D. Puthankattil, Paul K. Joseph
Abstract:
Analyzing the brain signals of patients suffering from depression may reveal signal parameters that are quite different from those of normal controls. The present study adopts two different approaches for the analysis of EEG signals acquired from depression patients and age- and sex-matched normal controls: a time-frequency domain method and a nonlinear method. The time-frequency domain analysis is realized using wavelet entropy, and approximate entropy is employed for the nonlinear analysis. The ability of the signal processing technique and the nonlinear method to differentiate the physiological aspects of the brain state is revealed using wavelet entropy and approximate entropy.
Keywords: EEG, depression, wavelet entropy, approximate entropy, relative wavelet energy, multiresolution decomposition
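The two measures named here can be sketched as follows. This is a generic illustration on a synthetic signal, not the authors' code, and the wavelet family, decomposition level, and ApEn parameters (m, r) are assumed values.

```python
# Minimal sketch: wavelet entropy from the relative wavelet energy of a multiresolution
# decomposition, plus approximate entropy, applied to a synthetic stand-in for an EEG channel.
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()          # relative wavelet energy of each sub-band
    return -np.sum(p * np.log(p))

def approximate_entropy(signal, m=2, r_factor=0.2):
    x = np.asarray(signal, dtype=float)
    N, r = len(x), r_factor * np.std(x)
    def phi(m):
        templates = np.array([x[i:i + m] for i in range(N - m + 1)])
        counts = [np.mean(np.max(np.abs(templates - t), axis=1) <= r) for t in templates]
        return np.mean(np.log(counts))
    return phi(m) - phi(m + 1)

t = np.linspace(0, 4, 1024)
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
print("Wavelet entropy:", wavelet_entropy(eeg_like))
print("Approximate entropy:", approximate_entropy(eeg_like))
```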
Procedia PDF Downloads 332
2287 Multi Agent System Architecture Oriented Prometheus Methodology Design for Reverse Logistics
Authors: F. Lhafiane, A. Elbyed, M. Bouchoum
Abstract:
The design of reverse logistics networks has attracted growing attention under stringent pressure from both environmental awareness and business sustainability. Reverse logistics activities, which include the return, remanufacturing, disassembly, and disposal of products, can be quite complex to manage. In addition, demand can be difficult to predict, and decision making is one of the challenging tasks. This complexity has amplified the need to develop an integrated architecture for product return as an enterprise system. The main purpose of this paper is to design a multi agent system (MAS) architecture using the Prometheus methodology to efficiently manage reverse logistics processes. The proposed MAS architecture includes five types of agents: Gatekeeping Agent, Collection Agent, Sorting Agent, Processing Agent, and Disposal Agent, which act respectively during the five steps of the reverse logistics network.
Keywords: reverse logistics, multi agent system, prometheus methodology
Procedia PDF Downloads 471
2286 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions
Authors: Gaurangi Saxena, Ravindra Saxena
Abstract:
Lately, cloud computing has been used to enhance the ability to attain corporate goals more effectively and efficiently at lower cost. This computing paradigm has emerged as a powerful tool for the optimum utilization of resources and for gaining competitiveness through cost reduction, achieving business goals with greater flexibility. Realizing the importance of this technique, most of the well-known companies in the computer industry, such as Microsoft, IBM, Google, and Apple, are spending millions of dollars researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that by using the right middleware, a cloud computing system can execute all the programs a normal computer could run. Potentially, everything from the simplest generic word processing software to highly specialized programs customized for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments; they also support interactive user-facing applications such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm: it draws on existing technologies and approaches, such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they do not have it available on site. Cloud computing gives these companies the option of storing data on someone else's hardware, removing the need for physical space on the front end. Prominent service providers such as Amazon, Google, SUN, IBM, Oracle, and Salesforce are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases, and applications. Application services can include email, office applications, finance, video, audio, and data processing. By using a cloud computing system, a company can improve its customer relationship management. A CRM cloud computing system can be highly useful in delivering to a sales team a blend of unique functionalities to improve agent/customer interactions. This paper first defines cloud computing as a tool for running business activities more effectively and efficiently at a lower cost, and then distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss applications of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to its use in CRM. The study concludes that a CRM cloud computing platform helps a company track any data, such as orders, discounts, references, competitors, and many more. By using CRM cloud computing, companies can improve their customer interactions and, by serving customers more efficiently at a lower cost, gain a competitive advantage.
Keywords: cloud computing, competitive advantage, customer relationship management, grid computing
Procedia PDF Downloads 312
2285 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods
Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard
Abstract:
Non-invasive samples are an alternative to collecting genetic samples directly: they are collected without manipulating the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping; these errors delayed the widespread use of non-invasive genetic information for some years. One possibility to limit genotyping errors is to use analysis methods that can accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons between them. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three different matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes, and two different algorithms (Capwire and BayesN) for population estimation. The three matching algorithms showed different patterns of results. ETLM produced a lower number of unique individuals and recaptures. A similarity between the genotypes matched by Colony and Cervus was observed, which is not a surprise given the similarity of those methods' pairwise likelihood and clustering algorithms. The matching of ETLM showed almost no similarity with the genotypes matched by the other methods. The different clustering algorithm and error model of ETLM seem to lead to a more stringent selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently depending on the dataset; there was a consensus between the different estimators for only one dataset. BayesN showed both higher and lower estimations when compared with Capwire. BayesN does not consider the total number of recaptures as Capwire does, only the recapture events, which makes the estimator sensitive to data heterogeneity. Heterogeneity here means different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An extended analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be the most appropriate for general use, considering the balance between time, interface, and robustness. The heterogeneity of the recaptures strongly affected the BayesN estimations, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.
Keywords: algorithms, genetics, matching, population
Procedia PDF Downloads 143
2284 LuMee: A Centralized Smart Protector for School Children who are Using Online Education
Authors: Lumindu Dilumka, Ranaweera I. D., Sudusinghe S. P., Sanduni Kanchana A. M. K.
Abstract:
This study was motivated by the challenges experienced by parents and guardians in ensuring the safety of children in cyberspace. In the last two or three years, online education has become very popular all over the world due to the COVID-19 pandemic. Therefore, parents, guardians, and teachers must ensure the safety of children in cyberspace. Children are more likely to go astray, plenty of online programs are waiting to lead them onto the wrong track, and children engaging in online education can be distracted at any moment. Therefore, parents should keep a close check on their children's online activity. Apart from that, due to their unawareness, children tend to share sensitive information, with the risk of becoming victims of phishing attacks by outsiders. These problems can be overcome through the proposed web-based system. We use feature extraction, web tracking and analysis mechanisms, image processing, and named entity recognition to implement this web-based system.
Keywords: online education, cyber bullying, social media, face recognition, web tracker, privacy data
Procedia PDF Downloads 89
2283 Endocardial Ultrasound Segmentation using Level Set method
Authors: Daoudi Abdelaziz, Mahmoudi Saïd, Chikh Mohamed Amine
Abstract:
This paper presents a fully automatic segmentation method for the left ventricle at end-systole (ES) and end-diastole (ED) in ultrasound images by means of an implicit deformable model (level set) based on the geodesic active contour model. A pre-processing Gaussian smoothing stage is applied to the image, which is essential for a good segmentation. Before the segmentation phase, we automatically locate the area of the left ventricle using a detection approach based on the Hough transform. The result obtained is then used to automate the initialization of the level set model. This initial curve (zero level set) deforms to search for the endocardial border in the image. Finally, a quantitative evaluation was performed on a data set composed of 15 subjects, with a comparison to ground truth (manual segmentation).
Keywords: level set method, Hough transform, Gaussian smoothing, left ventricle, ultrasound images
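A possible realisation of this pipeline with scikit-image is sketched below; the library choice, the sample image, and all parameter values (radius range, number of iterations, balloon force) are assumptions for illustration rather than the paper's implementation.

```python
# Sketch of the described pipeline: Gaussian smoothing, Hough-transform localisation
# of the cavity, then a geodesic-active-contour level set grown from the detected circle.
import numpy as np
from skimage import data, img_as_float
from skimage.filters import gaussian
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks
from skimage.segmentation import (morphological_geodesic_active_contour,
                                  inverse_gaussian_gradient, disk_level_set)

image = img_as_float(data.coins())            # placeholder for an echocardiographic frame
smoothed = gaussian(image, sigma=2)           # pre-processing Gaussian smoothing

# Automatic initialisation: detect the most likely circle with the Hough transform.
edges = canny(smoothed, sigma=2)
radii = np.arange(15, 40)
hough = hough_circle(edges, radii)
_, cx, cy, rad = hough_circle_peaks(hough, radii, total_num_peaks=1)

# Zero level set initialised as a disk at the detected centre, then deformed
# towards the target border by the geodesic active contour.
init = disk_level_set(image.shape, center=(cy[0], cx[0]), radius=rad[0])
gimage = inverse_gaussian_gradient(smoothed)
segmentation = morphological_geodesic_active_contour(gimage, 100, init_level_set=init,
                                                     smoothing=1, balloon=-1)
print("Segmented pixels:", int(segmentation.sum()))
```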
Procedia PDF Downloads 465
2282 Fiber Orientation Measurements in Reinforced Thermoplastics
Authors: Ihsane Modhaffar
Abstract:
Fiber orientation is essential for the physical properties of composite materials. The theoretical parameters of a given reinforcement are usually known and widely used to predict the behavior of the material. In this work, we propose an image processing approach to estimate the true principal directions and fiber orientation during injection molding of short fiber reinforced thermoplastics. Generally, a group of fibers is described in terms of a probability distribution function or an orientation tensor. Numerical techniques for the prediction of fiber orientation are also considered for concentrated suspensions. The flow was considered to be incompressible and to behave as a Newtonian fluid containing suspensions of short fibers. The governing equations of this problem are the continuity, momentum, and energy equations. The obtained results were compared with available experimental findings, and good agreement between the numerical results and the experimental data was achieved.
Keywords: injection, composites, short-fiber reinforced thermoplastics, fiber orientation, incompressible fluid, numerical simulation
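For readers unfamiliar with the orientation tensor mentioned here, the sketch below shows how the second-order tensor is formed from measured fiber angles; the angles are hypothetical and the planar (2D) description is a simplifying assumption.

```python
# Illustrative sketch: the second-order orientation tensor a_ij = <p_i p_j> commonly used
# to describe a group of fibers, computed for hypothetical in-plane fiber angles that
# could be measured from a micrograph by image processing.
import numpy as np

angles = np.deg2rad([5.0, 12.0, -8.0, 30.0, 2.0, -15.0])      # assumed fiber angles
p = np.column_stack([np.cos(angles), np.sin(angles)])          # unit orientation vectors
a = np.einsum("ni,nj->ij", p, p) / len(angles)                 # orientation tensor <p p>

eigvals, eigvecs = np.linalg.eigh(a)
print("Orientation tensor:\n", a)
print("Principal direction:", eigvecs[:, np.argmax(eigvals)])  # dominant fiber direction
```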
Procedia PDF Downloads 532
2281 Collect Meaningful Information about Stock Markets from the Web
Authors: Saleem Abuleil, Khalid S. Alsamara
Abstract:
Events represent a significant source of information on the web; they deliver information about events that occurred around the world in all kinds of subjects and areas. These events can be collected and organized to provide valuable and useful information for decision makers, researchers, and any person seeking knowledge. In this paper, we discuss ongoing research targeting the stock markets domain: observing and recording changes (events) when they happen, collecting them, understanding the meaning of each one, and organizing the information along with its meaning in a well-structured format. Using the Semantic Role Labeling (SRL) technique, we identify four factors for each event: the verb of action and the three roles associated with it, namely entity name, attribute, and attribute value. We have generated a set of rules and techniques to support our approach to analyzing and understanding the meaning of the events taking place in stock markets.
Keywords: natural language processing, Arabic language, event extraction and understanding, semantic role labeling, stock market
Procedia PDF Downloads 393
2280 Study on Fabrication of Surface Functional Micro and Nanostructures by Femtosecond Laser
Authors: Shengzhu Cao, Hui Zhou, Gan Wu, Lanxi Wang, Kaifeng Zhang, Rui Wang, Hu Wang
Abstract:
Functional micro and nanostructures can endow material surfaces with unique properties such as super-absorptance, hydrophobicity, and drag reduction. Recently, femtosecond laser ablation has been demonstrated to be a promising technology for fabricating surface functional micro and nanostructures. In this paper, using the femtosecond laser ablation processing technique, we fabricated functional micro and nanostructures on Ti and Al alloy surfaces; test results showed that the processed surfaces have 82%~96% absorptance over a broad wavelength range from the ultraviolet to the infrared. The surface functional properties, which are determined by the micro and nanostructures, can be modulated by varying the laser parameters. These functional surfaces may find applications in areas such as photonics, plasmonics, spaceborne devices, thermal radiation sources, solar energy absorbers, and biomedicine.
Keywords: surface functional, micro and nanostructures, femtosecond laser, ablation
Procedia PDF Downloads 369
2279 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and the associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for the work of art. The surplus value derived from the prestige of the author, the exclusivity of the product, or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops can favor their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. Models printed in PLA are limited to approximate minimum sizes of 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, it is not the most suitable technology for artistic casting of smaller pieces. One alternative to overcome the size limitation is printers of the SLS type (selective laser sintering); another possibility is DMLS (Direct Metal Laser Sintering), in which a laser hardens metal powder layer by layer. However, due to its high cost, this technology is difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes) and can print models with castable resins that allow subsequent direct artistic casting in precious metals or adaptation to processes such as electroforming. In this work, the design of a DLP 3D printer is detailed, using LCD screens backlit with ultraviolet light. Its development is totally open source and is proposed as a kit made up of electronic components, based on Arduino, and mechanical components that are easy to source on the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result costs less than 500 Euros, offers high resolution, and has an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out different comparative analyses that will allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
Procedia PDF Downloads 142
2278 Bio-Desalination and Bioremediation of Agroindustrial Wastewaters Using Yarrowia Lipolytica
Authors: Selma Hamimed, Abdelwaheb Chatti
Abstract:
The current study deals with the biological treatment of saline wastewaters generated by various agro-food industries using Yarrowia lipolytica. The ability of this yeast was studied on a mixture of olive mill wastewater and tuna wash processing wastewater. Results showed that a high proportion of olive mill wastewater in the mixture (about 75:25) is the most suitable for the highest Y. lipolytica biomass production, reaching 11.3 g L⁻¹ after seven days. In addition, results showed significant removal of chemical oxygen demand (COD) and phosphorus, of 97.49% and 98.90%, respectively. On the other hand, Y. lipolytica was found to be effective in desalinating all mixtures, reaching a removal of 92.21%. Moreover, the analytical results using Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM) confirmed the biosorption of NaCl on the surface of the yeast in the form of nanocrystals with a size of 47.3 nm.
Keywords: nanocrystallization of NaCl, desalination, wastewater treatment, yarrowia lipolytica
Procedia PDF Downloads 187
2277 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most often observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters such as geometry effects. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool in order to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of the fireball dynamics is based on single-step combustion using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux, and lifetime) from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various steps of the BLEVE phenomenon from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement with the BAM experimental data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company located in the Hassi R'Mel gas field (the largest gas field in Algeria).
Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
Procedia PDF Downloads 186
2276 Genetic Algorithm Optimization of Microcantilever Based Resonator
Authors: Manjula Sutagundar, B. G. Sheeparamatti, D. S. Jangamshetti
Abstract:
Micro Electro Mechanical Systems (MEMS) resonators have shown the potential to replace quartz crystal technology for sensing and high frequency signal processing applications because of inherent advantages such as small size, high quality factor, low cost, and compatibility with integrated circuit chips. This paper presents the optimization, modelling, and simulation of a micro cantilever resonator. The objective of the work is to optimize the dimensions of a micro cantilever resonator for a specified range of resonant frequency and a specific quality factor. Optimization is carried out using a genetic algorithm implemented in MATLAB. The micro cantilever resonator is then modelled in CoventorWare using the optimized dimensions obtained from the genetic algorithm, and the modelled cantilever is analysed for its resonance frequency.
Keywords: MEMS resonator, genetic algorithm, modelling and simulation, optimization
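A compact illustration of this idea is given below: a small genetic algorithm, written in Python rather than the authors' MATLAB, searches cantilever dimensions for an assumed 1 MHz target frequency using the textbook first-mode formula for a rectangular cantilever; the material properties and design bounds are assumptions, not values from the paper.

```python
# Genetic-algorithm sketch (not the authors' code) that searches cantilever dimensions
# for a target resonant frequency, using the first-mode estimate
# f = (3.516 / (2*pi)) * (t / L^2) * sqrt(E / (12 * rho)) for a rectangular beam.
import numpy as np

E, RHO = 169e9, 2330.0          # assumed silicon Young's modulus (Pa) and density (kg/m^3)
F_TARGET = 1.0e6                # assumed target resonant frequency: 1 MHz
BOUNDS = np.array([[50e-6, 500e-6],   # length L (m)
                   [5e-6, 50e-6],     # width  w (m)  (does not affect f, kept for realism)
                   [0.5e-6, 5e-6]])   # thickness t (m)

def frequency(dims):
    L, _, t = dims
    return (3.516 / (2 * np.pi)) * (t / L**2) * np.sqrt(E / (12 * RHO))

def fitness(dims):
    return -abs(frequency(dims) - F_TARGET)       # maximise (i.e. minimise frequency error)

rng = np.random.default_rng(0)
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(60, 3))

for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    # binary tournament selection
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # uniform crossover and Gaussian mutation, clipped to the design bounds
    mates = parents[rng.permutation(len(parents))]
    mask = rng.random(parents.shape) < 0.5
    children = np.where(mask, parents, mates)
    children += rng.normal(0, 0.02, children.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
    pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("L, w, t =", best, "-> f =", frequency(best) / 1e6, "MHz")
```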
Procedia PDF Downloads 550
2275 Phonological Encoding and Working Memory in Kannada Speaking Adults Who Stutter
Authors: Nirmal Sugathan, Santosh Maruthy
Abstract:
Background: A considerable number of studies have shown that phonological encoding (PE) and working memory (WM) skills operate differently in adults who stutter (AWS). In order to tap these skills, several paradigms have been employed, such as phonological priming, phoneme monitoring, and nonword repetition tasks. This study, however, utilizes a word jumble paradigm to assess both PE and WM using different modalities, which may give a better understanding of phonological processing deficits in AWS. Aim: The present study investigated PE and WM abilities in conjunction with lexical access in AWS using jumbled words. The study also aimed at investigating the effect of an increase in cognitive load on phonological processing in AWS by comparing the speech reaction time (SRT) and accuracy scores across syllable lengths. Method: Participants were 11 AWS (age range 19-26) and 11 adults who do not stutter (AWNS) (age range 19-26) matched for age, gender, and handedness. Stimuli: Ninety 3-, 4-, and 5-syllable jumbled words (JWs) (n=30 per syllable length category) constructed from Kannada words served as stimuli for the jumbled word paradigm. To generate the jumbled words (JWs), the syllables in the real words were randomly transposed. Procedure: To assess PE, the JWs were presented visually using DMDX software, and for the WM task, JWs were presented auditorily through headphones. The participants were asked to silently manipulate the jumbled words to form a Kannada real word and respond verbally once. The responses for both tasks were audio recorded using the record function in DMDX software, and the recorded responses were analyzed using the PRAAT software to calculate the SRT. Results: SRT: Mann-Whitney test results demonstrated that AWS performed significantly slower on both tasks (p < 0.001), as indicated by increased SRT. Also, AWS presented with increased SRT on both tasks in all syllable length conditions (p < 0.001). Effect of syllable length: A Wilcoxon signed rank test revealed that, on the task assessing PE, the SRT of 4-syllable JWs was significantly higher in both AWS (Z= -2.93, p=.003) and AWNS (Z= -2.41, p=.003) when compared to 3-syllable words. However, the findings for 4- and 5-syllable words were not significant. Task accuracy: The accuracy scores were calculated for the three syllable length conditions for both PE and WM tasks and were compared across the groups using the Mann-Whitney test. The results indicated that the accuracy scores of AWS were significantly below those of AWNS in all three syllable conditions for both tasks (p < 0.001). Conclusion: The above findings suggest that PE and WM skills are compromised in AWS, as indicated by increased SRT. Also, AWS were progressively less accurate in descrambling JWs of increasing syllable length, which may be interpreted as meaning that, rather than existing as a uniform deficiency, PE and WM deficits emerge when the cognitive load is increased. AWNS exhibited increased SRT and increased accuracy for JWs of longer syllable length, whereas AWS did not benefit from the increased reaction time; thus AWS had to compromise on both SRT and accuracy while solving JWs of longer syllable length.
Keywords: adults who stutter, phonological ability, working memory, encoding, jumbled words
Procedia PDF Downloads 240
2274 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real world images within classical image processing and machine vision frameworks have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has been adapted to the statistics of real world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. We also investigate the response of the feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
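As a rough illustration of the ICA half of such an analysis (not the authors' setup), the sketch below runs FastICA on patches of a standard test image; the patch size, number of components, and the image itself are arbitrary choices.

```python
# Minimal sketch: FastICA applied to patches from a natural image, yielding Gabor-like
# feature detectors whose responses can then be analysed statistically.
import numpy as np
from skimage import data, img_as_float
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import FastICA

image = img_as_float(data.camera())
patches = extract_patches_2d(image, patch_size=(12, 12), max_patches=5000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)              # remove the DC component of each patch

ica = FastICA(n_components=49, random_state=0, max_iter=500)
sources = ica.fit_transform(X)                  # component responses per patch
filters = ica.components_.reshape(-1, 12, 12)   # learned feature detectors

# Kurtosis of the responses indicates the sparsity/non-Gaussianity of each component.
kurtosis = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
print("Mean response kurtosis:", kurtosis.mean())
```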
Procedia PDF Downloads 339
2273 Effect of Hot Equal Channel Angular Pressing Process on Mechanical Properties of Commercial Pure Titanium
Authors: Seyed Ata Khalkhkali Sharifi, Gholamhossein Majzoubi, Farhad Abroush
Abstract:
The development of the mechanical properties of pure titanium by means of the ECAP process is reviewed in this paper. In the first step, the experimental samples were prepared as specified in the standards. Pure grade 2 Ti was then processed via equal-channel angular pressing (ECAP) for 2 passes following route A at 400°C. After processing, the microstructural evolution, tensile, fatigue, and hardness properties, and the wear behavior were investigated, and the effect of the ECAP process on these samples was analyzed. The results showed an improvement in strength values with a slight decrease in ductility. Analysis of 30 points within the sample showed a hardness increase with each pass. It was also concluded that the fatigue properties increased.
Keywords: equal-channel angular pressing, titanium, mechanical behavior, engineering materials and applications
Procedia PDF Downloads 258
2272 CookIT: A Web Portal for the Preservation and Dissemination of Traditional Italian Recipes
Authors: M. T. Artese, G. Ciocca, I. Gagliardi
Abstract:
Food is a social and cultural aspect of every individual. Food products, processing, and traditions have been identified as cultural objects carrying the history and identity of social groups. Traditional recipes are passed down from one generation to the next, often strengthening the link with the territory. The paper presents CookIT, a web portal developed to collect traditional Italian regional recipes, with the purpose of disseminating knowledge of typical Italian recipes and the Mediterranean diet, which is a significant part of Italian cuisine. The system is complemented with multimodal means of browsing and data retrieval. Stored recipes can be retrieved by integrating and combining a number of different methods and keys, while the results are displayed using classical styles, such as list and mosaic, as well as maps and graphs, with which users can interact using the available keys.
Keywords: collaborative portal, Italian cuisine, intangible cultural heritage, traditional recipes, searching and browsing
Procedia PDF Downloads 149
2271 The Effect of Precipitation on Weed Infestation of Spring Barley under Different Tillage Conditions
Authors: J. Winkler, S. Chovancová
Abstract:
The article deals with the relation between rainfall in selected months and subsequent weed infestation of spring barley. The field experiment was performed at the Mendel University agricultural enterprise in Žabčice, Czech Republic. Weed infestation was measured in spring barley stands in the years 2004 to 2012. Barley was grown under three tillage variants: conventional tillage technology (CT), minimum tillage technology (MT), and no tillage (NT). Precipitation was recorded at one-day intervals, and monthly precipitation was calculated from the measured values for the months of October through April. The technique of canonical correspondence analysis was applied for further statistical processing. In total, 41 different weed species were found in the course of the 9-year monitoring period. The results clearly show that precipitation in the selected months affects the incidence of most weed species, but acts differently across the monitored tillage variants.
Keywords: weeds, precipitation, tillage, weed infestation forecast
Procedia PDF Downloads 498
2270 Rule-Based Expert System for Headache Diagnosis and Medication Recommendation
Authors: Noura Al-Ajmi, Mohammed A. Almulla
Abstract:
With the increased use of technology devices around the world, healthcare and medical diagnosis are critical issues that people worry about these days. Doctors do their best to avoid medical errors when diagnosing diseases and to avoid prescribing the wrong medication. Consequently, artificial intelligence applications that can be installed on mobile devices, such as rule-based expert systems, can assist doctors in several ways. Due to their many advantages, the use of expert systems in the health sciences has increased recently. This work presents a backward-chaining rule-based expert system that can be used for headache diagnosis and medication recommendation. The structure of the system consists of three main modules, namely the input unit, the processing unit, and the output unit.
Keywords: headache diagnosis system, prescription recommender system, expert system, backward rule-based system
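The backward-chaining inference the paper relies on can be illustrated with a toy rule base; the rules, symptom names, and recommendations below are invented placeholders for illustration only, not medical knowledge from the paper.

```python
# Toy backward-chaining sketch of the kind of rule base such a system uses.
# Goals are proven by recursively proving the premises of rules that conclude them.
RULES = {
    "migraine": [{"unilateral_pain", "nausea", "light_sensitivity"}],
    "tension_headache": [{"bilateral_pain", "pressing_quality"}],
    "recommend_triptan": [{"migraine"}],
    "recommend_rest_and_analgesic": [{"tension_headache"}],
}

def backward_chain(goal, facts, depth=0):
    """Prove `goal` from known facts by chaining backwards through RULES."""
    if goal in facts:                        # already known from the user's answers
        return True
    for premises in RULES.get(goal, []):
        print("  " * depth + f"trying rule: {sorted(premises)} -> {goal}")
        if all(backward_chain(p, facts, depth + 1) for p in premises):
            facts.add(goal)                  # cache the derived fact
            return True
    return False

observed = {"unilateral_pain", "nausea", "light_sensitivity"}
for recommendation in ("recommend_triptan", "recommend_rest_and_analgesic"):
    if backward_chain(recommendation, set(observed)):
        print("Recommendation:", recommendation)
```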
Procedia PDF Downloads 215
2269 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
National statistical institutes hold a large volume of data, generally in formats that constrain how the information they contain is published. Each household or business data collection project includes a dissemination platform for its implementation. The dissemination methods used so far do not promote rapid access to information and, in particular, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to be able to publish all these data on a single platform and to offer the option of linking them with other external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistics
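As an example of what publishing in a format intended for the Semantic Web can look like, the sketch below builds a few RDF triples with rdflib; the vocabulary choice (RDF Data Cube), the base URI, and the figures are illustrative assumptions, not the authors' ontology.

```python
# Small rdflib sketch (one possible toolchain, not necessarily the authors'): a survey
# observation modelled with RDF Data Cube terms and serialized as Turtle for publication.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

QB = Namespace("http://purl.org/linked-data/cube#")
EX = Namespace("http://example.org/senegal-stats/")   # hypothetical base URI

g = Graph()
g.bind("qb", QB)
g.bind("ex", EX)

obs = EX["obs/employment-2013-dakar"]
g.add((obs, RDF.type, QB.Observation))
g.add((obs, EX.indicator, Literal("employment_rate")))
g.add((obs, EX.region, EX["region/Dakar"]))
g.add((obs, EX.year, Literal(2013, datatype=XSD.gYear)))
g.add((obs, EX.value, Literal(42.7, datatype=XSD.decimal)))  # placeholder value

print(g.serialize(format="turtle"))
```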
Procedia PDF Downloads 175
2268 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often enough to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulations of multi-grain thin films. The work addresses the inhomogeneity in one dimension, but can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junction between two adjacent grains. The effects of the grain boundary barrier height, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
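For context, the trap-free space-charge-limited-conduction benchmark against which such simulations are often compared is the Mott-Gurney law; the short evaluation below uses assumed material parameters and is not output from the TCAD simulations reported here.

```python
# Worked evaluation of the trap-free SCLC (Mott-Gurney) law
#   J = (9/8) * eps0 * eps_r * mu * V^2 / L^3
# for a homogeneous film. All parameter values are illustrative assumptions.
import numpy as np

EPS0 = 8.854e-12            # vacuum permittivity, F/m
eps_r = 3.0                 # assumed relative permittivity of the organic film
mu = 1e-8                   # assumed carrier mobility, m^2/(V s)
L = 100e-9                  # assumed film thickness, m

V = np.linspace(0.1, 10, 5)                       # applied bias, V
J = 9 / 8 * EPS0 * eps_r * mu * V**2 / L**3       # current density, A/m^2
for v, j in zip(V, J):
    print(f"V = {v:5.2f} V  ->  J = {j:.3e} A/m^2")
```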
Procedia PDF Downloads 214
2267 Processing of Flexible Dielectric Nanocomposites Using Nanocellulose and Recycled Alum Sludge for Wearable Technology Applications
Authors: D. Sun, L. Saw, A. Onyianta, D. O’Rourke, Z. Lu, C. See, C. Wilson, C. Popescu, M. Dorris
Abstract:
With the rapid development of wearable technology (e.g., smartwatches, activity trackers, and health monitoring devices), flexible dielectric materials that are environmentally friendly, low-cost, and energy-efficient are in increasing demand. In this work, a flexible dielectric nanocomposite was processed by incorporating two components, cellulose nanofibrils and alum sludge, in a polymer matrix. The two components were used as the reinforcement phase as well as to enhance the dielectric properties; both were processed from waste materials that would otherwise be disposed of in landfills. Alum sludge is a by-product of the water treatment process, in which aluminum sulfate is prevalently used as the primary coagulant. According to data from a project partner, Scottish Water, approximately 10k tons of alum sludge are generated every year in Scotland as waste from water treatment works and sent to landfill. The industry has been facing escalating financial and environmental pressure to develop more sustainable strategies for dealing with alum sludge waste. In the available literature, some work on reusing alum sludge has been reported (e.g., aluminum recovery or agriculture and land reclamation). However, little work can be found on applying it to the processing of energy materials (e.g., dielectrics) for enhanced energy density and efficiency. The alum sludge was collected directly from a water treatment plant of Scottish Water and was heat-treated and refined before being used in preparing the composites. Cellulose nanofibrils were derived from water hyacinth, an invasive aquatic weed that causes significant ecological issues in tropical regions. The harvested water hyacinth was dried and processed using a cost-effective method, including a chemical extraction followed by a homogenization process, in order to extract cellulose nanofibrils. The biodegradable elastomer polydimethylsiloxane (PDMS) was used as the polymer matrix, and the nanocomposites were processed by casting the raw materials in Petri dishes. The processed composites were characterized using various methods, including scanning electron microscopy (SEM), rheological analysis, and thermogravimetric and X-ray diffraction analysis. The SEM results showed that cellulose nanofibrils of approximately 20 nm in diameter and 100 nm in length were obtained, and that the alum sludge particles were approximately 200 µm in diameter. The TGA/DSC analysis showed a weight loss of up to 48% in the raw alum sludge and indicated that its crystallization process starts at approximately 800°C; this observation coincides with the XRD result. Other experiments also showed that the composites exhibit good overall mechanical and dielectric performance. This work shows that reusing such waste materials to prepare flexible, lightweight, and miniature dielectric materials for wearable technology applications is a sustainable practice.
Keywords: cellulose, biodegradable, sustainable, alum sludge, nanocomposite, wearable technology, dielectric
Procedia PDF Downloads 85
2266 Comparison of Different Extraction Methods for the Determination of Polyphenols
Authors: Senem Suna
Abstract:
The extraction of bioactive compounds from foods and food products is an important topic and a new trend related to health-promoting effects. As a result of the increasing interest in natural foods, different methods are used for obtaining these components, especially polyphenols. However, special attention has to be paid to the selection of proper techniques or processing technologies (supercritical fluid extraction, microwave-assisted extraction, ultrasound-assisted extraction, powdered extract production) for each kind of food in order to obtain the maximum benefit and yield of phenolic compounds. In order to meet consumer demand for healthy food and the management of quality and safety requirements, advanced research and development are needed. In this review, the advantages and disadvantages of different extraction methods, their opportunities for use in the food industry, and the effects of polyphenols are discussed in detail. Consequently, by evaluating the results of several studies, the aim was to select the most suitable food-specific method.
Keywords: bioactives, extraction, powdered extracts, supercritical fluid extraction
Procedia PDF Downloads 239
2265 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. The quality of a product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors most likely associated with freshness in animal source foods are temperature and the processing, storage, and transport times. However, the level of deterioration of products also depends on the characteristics of the bacterial population causing decomposition or spoilage, such as the pH level and toxins. Knowing the growth dynamics of the agents involved in product contamination allows monitoring for more efficient processing. This means better quality and reasonable costs, along with a better estimation of the time and temperature intervals required for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that allows measuring the impact of temperature on bacterial growth, and of the competition for pH adequacy and the release of bacteriocins, in order to describe this phenomenon and thus estimate the product shelf life with the least possible risk of deterioration or spoilage. To achieve this objective, the authors propose the analysis of a three-dimensional system of ordinary differential equations which includes: logistic bacterial growth extended by the inhibitory action of bacteriocins, including the effect of the medium pH; the change in the medium pH level through an adaptation of the Luedeking-Piret kinetic model; and the bacteriocin concentration, modeled similarly to the pH level. These three dimensions are influenced by temperature at all times. This differential system is then expanded to take into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent to the modeled scenarios, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results show that temperature changes in an early stage of transport increased the bacterial population significantly more than if the temperature had increased during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term, since, although the population of bacteria decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins at low temperatures decreased slightly, which is consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal based products, along with their properties, in various transport and storage situations, which leads us to state that the optimum for inhibiting bacterial growth is the combination of constant low temperatures and the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
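A minimal numerical sketch of a model of this type is shown below, with all coefficients invented for illustration: logistic growth modulated by temperature and pH, bacteriocin inhibition and decay, and a Luedeking-Piret-style pH change, integrated with SciPy.

```python
# Illustrative three-dimensional ODE system (coefficients are assumptions, not the paper's):
# bacteria N, medium pH, and bacteriocin B, with a temperature profile that changes mid-transport.
import numpy as np
from scipy.integrate import solve_ivp

def temperature(t):                      # storage/transport profile: cold chain broken at t = 24 h
    return 4.0 if t < 24 else 12.0

def rhs(t, y):
    N, pH, B = y
    T = temperature(t)
    mu = 0.05 * np.exp(0.1 * (T - 4))    # assumed temperature dependence of the growth rate
    growth = mu * N * (1 - N / 1e9) * (pH - 4.0) / 3.0   # logistic growth, slower at low pH
    dN = growth - 0.002 * B * N          # bacteriocin inhibition
    dpH = -(0.3 * growth + 0.001 * N) / 1e9              # Luedeking-Piret-style acidification
    dB = -0.02 * B                       # bacteriocin degradation over time
    return [dN, dpH, dB]

y0 = [1e3, 6.8, 50.0]                    # initial bacteria count, pH, pulsed bacteriocin dose
sol = solve_ivp(rhs, (0, 72), y0, max_step=0.5)
print("Population after 72 h: %.2e CFU" % sol.y[0, -1])
```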
Procedia PDF Downloads 135
2264 Continuous-Time and Discrete-Time Singular Value Decomposition of an Impulse Response Function
Authors: Rogelio Luck, Yucheng Liu
Abstract:
This paper proposes the continuous-time singular value decomposition (SVD) of the impulse response function, a special kind of Green's function of the form e^{-(t-τ)}, in order to find a set of singular functions and singular values such that the convolutions of this function with the set of singular functions on a specified domain are the solutions to the inhomogeneous differential equations for those singular functions. A numerical example is given to verify the proposed method. Besides the continuous-time SVD, a discrete-time SVD is also presented for the impulse response function, which is modeled using a Toeplitz matrix in the discrete system. The proposed method has broad applications in signal processing, dynamic system analysis, acoustic analysis, and thermal analysis, as well as macroeconomic modeling.
Keywords: singular value decomposition, impulse response function, Green's function, Toeplitz matrix, Hankel matrix
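The discrete-time side of the proposal can be illustrated directly: sample the kernel e^{-(t-τ)}, arrange it as a Toeplitz convolution matrix, and factor it with the SVD. The grid, time span, and test function below are arbitrary illustrative choices.

```python
# Numerical sketch of the discrete-time case: the causal kernel exp(-(t - tau)) sampled
# on a grid, arranged as a lower-triangular Toeplitz convolution matrix, and factored by SVD.
import numpy as np
from scipy.linalg import toeplitz

n, T = 200, 5.0
t = np.linspace(0, T, n)
dt = t[1] - t[0]

# Causal convolution kernel h(t - tau) = exp(-(t - tau)) for t >= tau, 0 otherwise.
first_col = np.exp(-t)
first_row = np.zeros(n)
first_row[0] = first_col[0]
H = toeplitz(first_col, first_row) * dt       # discrete convolution operator

U, s, Vt = np.linalg.svd(H)
print("Largest singular values:", np.round(s[:5], 4))

# The singular vectors let us apply and invert the convolution operator:
f = np.sin(2 * np.pi * t / T)                 # a test forcing function
y = H @ f                                     # response = convolution with the kernel
f_rec = Vt.T @ ((U.T @ y) / s)                # reconstruction from the SVD factors
print("Reconstruction error:", np.max(np.abs(f - f_rec)))
```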
Procedia PDF Downloads 156