Search results for: extrusion processing
3613 The Amount of Information Processing and Balance Performance in Children: The Dual-Task Paradigm
Authors: Chin-Chih Chiou, Tai-Yuan Su, Ti-Yu Chen, Wen-Yu Chiu, Chungyu Chen
Abstract:
The purpose of this study was to investigate the effect on reaction time (RT) and balance performance as the number of stimulus-response choices increases, comparing the 0-bit and 1-bit information-processing conditions based on Hick’s law, using a dual-task design. Eighteen children (age: 9.38 ± 0.27 years) were recruited as participants and asked to perform RT and balance tasks separately and simultaneously under the following five conditions: simple RT (0-bit decision), choice RT (1-bit decision), single balance control, balance control with simple RT, and balance control with choice RT. A Biodex 950-300 balance system and a You-Shang response timer were used to record and analyze postural stability and information processing speed (RT), respectively. Repeated measures one-way ANOVA with HSD post-hoc test and 2 (balance) × 2 (amount of information processing) repeated measures two-way ANOVA were used to test the parameters of balance performance and RT (α = .05). The results showed that the overall stability index in the 1-bit decision was lower than in the 0-bit decision, and the mean deflection in the 1-bit decision was lower than in the single balance condition. Simple RTs were faster than choice RTs in both the single-task and dual-task conditions. This indicates that the chronometric approach of RT can be used to infer the attention requirement of the secondary task. However, this study did not find that children’s balance performance was impaired by increasing the amount of information processing.
Keywords: capacity theory, reaction time, Hick’s law, balance
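For readers unfamiliar with Hick’s law, the 0-bit and 1-bit conditions correspond to one and two equally likely response alternatives. A minimal sketch of the relationship is shown below; the intercept and slope values are illustrative assumptions, not figures from the study.

```python
import math

def hick_information_bits(n_alternatives: int) -> float:
    """Information content (bits) of choosing among n equally likely alternatives."""
    return math.log2(n_alternatives)

def predicted_rt(n_alternatives: int, a: float = 0.25, b: float = 0.15) -> float:
    """Hick-Hyman law: RT grows linearly with the information to be processed.
    a = base (simple) RT in seconds, b = seconds per bit; both values are placeholders."""
    return a + b * hick_information_bits(n_alternatives)

print(predicted_rt(1))  # 0-bit condition (simple RT)
print(predicted_rt(2))  # 1-bit condition (choice RT)
```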
Procedia PDF Downloads 451
3612 Embedded Acoustic Signal Processing System Using OpenMP Architecture
Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad
Abstract:
In this paper, the Altera DE1-SoC FPGA board is utilized as a distinguished tool for the nondestructive characterization of an aluminum circular cylindrical shell of radius ratio b/a (a: outer radius; b: inner radius). The acoustic backscattered signal processing system has been developed using the OpenMP architecture. The design is built in three blocks; it is implemented per functional block, in a heterogeneous Intel-Altera system running under Linux. The reference data used to assess the performance of the SoC FPGA are computed by the analytical method. The SoC FPGA implementation yields the backscattering form function and resonance spectra. The A0 and S0 modes of propagation in the tube are shown. The findings are then compared to those obtained from the Matlab simulation of the analytical method, and good agreement is observed. Moreover, the SoC FPGA-based system computes the acoustic spectra up to 5 times faster than the Matlab implementation using almost the same data. The FPGA-based implementation of the processing algorithms achieves a correlation coefficient R of about 0.962 and an absolute error of about 5×10⁻⁵.
Keywords: OpenMP, signal processing system, acoustic backscattering, nondestructive characterization, thin tubes
Procedia PDF Downloads 92
3611 Massively-Parallel Bit-Serial Neural Networks for Fast Epilepsy Diagnosis: A Feasibility Study
Authors: Si Mon Kueh, Tom J. Kazmierski
Abstract:
About 1% of the world population suffers from the hidden disability known as epilepsy, and major developing countries are not fully equipped to counter this problem. In order to reduce the inconvenience and danger of epilepsy, different methods have been researched using artificial neural network (ANN) classification to distinguish epileptic waveforms from normal brain waveforms. This paper outlines the aim of achieving massive ANN parallelization through dedicated hardware using bit-serial processing. The design of a bit-serial Neural Processing Element (NPE) is presented, which implements the functionality of a complete neuron with variable accuracy. The proposed design has been tested taking into consideration the non-idealities of a hardware ANN. The NPE consists of a bit-serial multiplier, which uses only 16 logic elements on an Altera Cyclone IV FPGA, a bit-serial ALU, and a look-up table. Arrays of NPEs can be driven by a single controller which executes the neural processing algorithm. In conclusion, the proposed compact NPE design allows the construction of complex hardware ANNs that can be implemented in portable equipment that suits the needs of a single epileptic patient in his or her daily activities and predicts impending tonic-clonic seizures.
Keywords: Artificial Neural Networks (ANN), bit-serial neural processor, FPGA, Neural Processing Element (NPE)
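To illustrate the bit-serial idea behind the NPE, the sketch below models a shift-add multiplier that examines one multiplier bit per cycle. It is a software analogy under stated assumptions (unsigned operands, fixed width), not the authors' FPGA design.

```python
def bit_serial_multiply(a: int, b: int, width: int = 8) -> int:
    """Multiply two unsigned integers one bit at a time (shift-add),
    mimicking the serial data flow of a hardware bit-serial multiplier."""
    product = 0
    for i in range(width):
        if (b >> i) & 1:           # examine one multiplier bit per "cycle"
            product += a << i      # conditionally add the shifted multiplicand
    return product

assert bit_serial_multiply(13, 11) == 143
```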
Procedia PDF Downloads 321
3610 Solutions for Food-Safe 3D Printing
Authors: Geremew Geidare Kailo, Igor Gáspár, András Koris, Ivana Pajčin, Flóra Vitális, Vanja Vlajkov
Abstract:
Three-dimensional (3D) printing, a very popular additive manufacturing technology, has recently undergone rapid growth and has replaced conventional technology in applications from prototyping to the production of end-user parts and products. 3D printing technology involves a digital manufacturing machine that produces three-dimensional objects according to designs created by the user via 3D modeling or computer-aided design/manufacturing (CAD/CAM) software. The most popular 3D printing system is Fused Deposition Modeling (FDM), also called Fused Filament Fabrication (FFF). A 3D-printed object is considered food safe if it can have direct contact with the food without any toxic effects, even after cleaning, storing, and reusing the object. This work analyzes the processing timeline of the filament (material for 3D printing) from unboxing to extrusion through the nozzle. It is an important task to analyze the growth of bacteria on the 3D-printed surface and in gaps between the layers. By default, a 3D-printed object is not food safe after longer usage and direct contact with food (even when food-safe filaments are used), but there are solutions for this problem. The aim of this work was to evaluate the 3D-printed object from different perspectives of food safety. Firstly, antimicrobial 3D printing filaments were tested from a food safety aspect, since a 3D-printed object in the food industry may have direct contact with food. Therefore, the main purpose of the work is to reduce the microbial load on the surface of a 3D-printed part. Coating with epoxy resin was investigated, too, to see its effect on mechanical strength, thermal resistance, surface smoothness and food safety (cleanability). Another aim of this study was to test new temperature-resistant filaments and the effect of high temperature on 3D-printed materials to see if they can be cleaned with boiling or similar high-temperature treatment. This work proved that all three mentioned methods could improve the food safety of the 3D-printed object, but the size of this effect varies. The best result was obtained with the epoxy resin coating: the object was cleanable like any other injection-molded plastic object with a smooth surface. Very good results were also obtained by boiling the objects, and nowadays more and more special filaments have a food-safe certificate and can withstand boiling temperatures. Using antibacterial filaments reduced bacterial colonies to 1/5, but the biggest advantage of this method is that it doesn’t require any post-processing; the object is ready out of the 3D printer. Acknowledgements: The research was supported by the Hungarian and Serbian bilateral scientific and technological cooperation project funded by the Hungarian National Office for Research, Development and Innovation (NKFI, 2019-2.1.11-TÉT-2020-00249) and the Ministry of Education, Science and Technological Development of the Republic of Serbia. The authors acknowledge the Hungarian University of Agriculture and Life Sciences’ Doctoral School of Food Science for the support in this study.
Keywords: food safety, 3D printing, filaments, microbial, temperature
Procedia PDF Downloads 142
3609 The Impact of Artificial Intelligence on Food Industry
Authors: George Hanna Abdelmelek Henien
Abstract:
Quality and safety issues are common in Ethiopia's food processing industry, which can negatively impact consumers' health and livelihoods. The country is known for its various agricultural products that are important to the economy. However, inadequate food quality and safety policies and management practices in the food processing industry have led to many health problems, foodborne illnesses and economic losses. This article aims to show the causes and consequences of food safety and quality problems in the food processing industry in Ethiopia and to discuss possible solutions. One of the main causes of food quality and safety problems in Ethiopia's food processing industry is the lack of adequate regulation and enforcement mechanisms. Inadequate food safety and quality policies have led to inefficiencies in food production. Additionally, the failure to monitor and enforce existing regulations has created an opportunity for unscrupulous companies to engage in harmful practices that endanger the lives of citizens. The impact on food quality and safety is significant: loss of life, high medical costs, and loss of consumer confidence in the food processing industry. Foodborne diseases such as diarrhoea, typhoid and cholera are common in Ethiopia, and food quality and safety play an important role in their prevention. Additionally, food recalls due to contamination often cause significant economic losses in the food processing industry. To solve these problems, the Ethiopian government began taking measures to improve food quality and safety in the food processing industry. One of the most prominent initiatives is the Ethiopian Food and Drug Administration (EFDA), which was established in 2010 to monitor and control the quality and safety of food and beverage products in the country. The EFDA has implemented many measures to improve food safety, such as carrying out routine inspections, monitoring the import of food products and implementing labeling requirements. Another solution that can improve food quality and safety in the food processing industry in Ethiopia is the implementation of a food safety management system (FSMS). An FSMS is a set of procedures and policies designed to identify, assess and control food safety risks during food processing. Implementing an FSMS can help companies in the food processing industry identify and address potential risks before they harm consumers. Additionally, implementing an FSMS can help companies comply with current safety and security regulations. Consequently, improving the food safety policy and management system in Ethiopia's food processing industry is important to protect people's health and improve the country's economy. This requires addressing the root causes of food quality and safety problems and implementing practical solutions, such as establishing regulatory bodies and implementing food safety management systems.
Keywords: food quality, food safety, policy, management system, food processing industry, food traceability, industry 4.0, internet of things, block chain, best worst method, marcos
Procedia PDF Downloads 63
3608 Processing and Characterization of Oxide Dispersion Strengthened (ODS) Fe-14Cr-3W-0.5Ti-0.3Y₂O₃ (14YWT) Ferritic Steel
Authors: Farha Mizana Shamsudin, Shahidan Radiman, Yusof Abdullah, Nasri Abdul Hamid
Abstract:
Oxide dispersion strengthened (ODS) ferritic steels are amongst the most promising candidates for large-scale structural materials to be applied in next-generation fission and fusion nuclear power reactors. This kind of material is relatively stable at high temperature, possesses remarkable mechanical properties, and offers comparatively good resistance to neutron radiation damage. The superior performance of ODS ferritic steels over their conventional counterparts is attributed to the high number density of nano-sized dispersoids that act as nucleation sites and stable sinks for the many small helium bubbles resulting from irradiation, and also as pinning points against dislocation movement and grain growth. ODS ferritic steels are usually produced by powder metallurgical routes involving mechanical alloying (MA) of Y₂O₃ and pre-alloyed or elemental metallic powders, followed by consolidation by hot isostatic pressing (HIP) or hot extrusion (HE). In this study, Fe-14Cr-3W-0.5Ti-0.3Y₂O₃ (designated as 14YWT) was produced by mechanical alloying followed by hot isostatic pressing (HIP). The crystal structure and morphology of the sample were characterized using X-ray diffraction (XRD) and field emission scanning electron microscopy (FESEM), respectively. Magnetic measurements at room temperature were carried out using a vibrating sample magnetometer (VSM). FESEM micrographs revealed a homogeneous microstructure of fine grains less than 650 nm in size. Ultra-fine dispersoids between 5 and 19 nm in size were observed, homogeneously distributed within the BCC matrix. EDS mapping reveals that the dispersoids contain Y-Ti-O nanoclusters, and the VSM magnetization curve shows that the sample approaches the behavior of soft ferromagnetic materials. In conclusion, ODS Fe-14Cr-3W-0.5Ti-0.3Y₂O₃ (14YWT) ferritic steel was successfully produced by the HIP technique in the present study.
Keywords: hot isostatic pressing, magnetization, microstructure, ODS ferritic steel
Procedia PDF Downloads 319
3607 Study of the Montmorillonite Effect on PET/Clay and PEN/Clay Nanocomposites
Authors: F. Zouai, F. Z. Benabid, S. Bouhelal, D. Benachour
Abstract:
Polymer/clay nanocomposites are a relatively important area of research. These reinforced plastics have attracted considerable attention in scientific and industrial fields because a very small amount of clay can significantly improve the properties of the polymer. The polymeric matrices used in this work are two saturated polyesters, i.e., poly(ethylene terephthalate) (PET) and poly(ethylene naphthalate) (PEN). The successful processing of compatible blends based on PET/PEN/clay nanocomposites in one step by reactive melt extrusion is described. Untreated clay was first purified and functionalized ‘in situ’ with a compound based on an organic peroxide/sulfur mixture and tetramethylthiuram disulfide as the activator for sulfur. The PET and PEN materials were first separately mixed in the molten state with the functionalized clay. The PET/4 wt% clay and PEN/7.5 wt% clay compositions showed total exfoliation. These compositions, denoted nPET and nPEN, respectively, were used to prepare new n(PET/PEN) nanoblends in the same mixing batch. The n(PET/PEN) nanoblends were compared to neat PET/PEN blends. The blends and nanocomposites were characterized using various techniques, and their microstructural and nanostructural properties were investigated. Fourier transform infrared spectroscopy (FTIR) results showed that the exfoliation of the tetrahedral clay nanolayers is complete and the octahedral structure totally disappears. It was shown that total exfoliation, confirmed by wide-angle X-ray scattering (WAXS) measurements, contributes to the enhancement of impact strength and tensile modulus. In addition, WAXS results indicated that all samples are amorphous. The differential scanning calorimetry (DSC) study indicated the occurrence of one glass transition temperature Tg, one crystallization temperature Tc and one melting temperature Tm for every composition. This was evidence that both PET/PEN and nPET/nPEN blends are compatible over the entire range of compositions. In addition, the nPET/nPEN blends showed lower Tc and higher Tm values than the corresponding neat PET/PEN blends. In conclusion, the results obtained indicate that n(PET/PEN) blends differ from the pure ones in nanostructure and physical behavior.
Keywords: blends, exfoliation, DRX, DSC, montmorillonite, nanocomposites, PEN, PET, plastograph, reactive melt-mixing
Procedia PDF Downloads 298
3606 Cognitive Dysfunctioning and the Fronto-Limbic Network in Bipolar Disorder Patients: A Fmri Meta-Analysis
Authors: Rahele Mesbah, Nic Van Der Wee, Manja Koenders, Erik Giltay, Albert Van Hemert, Max De Leeuw
Abstract:
Introduction: Patients with bipolar disorder (BD), characterized by depressive and manic episodes, often suffer from cognitive dysfunction. An up-to-date meta-analysis of functional Magnetic Resonance Imaging (fMRI) studies examining cognitive function in BD is lacking. Objective: The aim of the current fMRI meta-analysis is to investigate brain functioning of bipolar patients compared with healthy subjects within three domains of emotion processing, reward processing, and working memory. Method: Differences in brain regions activation were tested within whole-brain analysis using the activation likelihood estimation (ALE) method. Separate analyses were performed for each cognitive domain. Results: A total of 50 fMRI studies were included: 20 studies used an emotion processing (316 BD and 369 HC) task, 9 studies a reward processing task (215 BD and 213 HC), and 21 studies used a working memory task (503 BD and 445 HC). During emotion processing, BD patients hyperactivated parts of the left amygdala and hippocampus as compared to HC’s, but showed hypoactivation in the inferior frontal gyrus (IFG). Regarding reward processing, BD patients showed hyperactivation in part of the orbitofrontal cortex (OFC). During working memory, BD patients showed increased activity in the prefrontal cortex (PFC) and anterior cingulate cortex (ACC). Conclusions: This meta-analysis revealed evidence for activity disturbances in several brain areas involved in the cognitive functioning of BD patients. Furthermore, most of the found regions are part of the so-called fronto-limbic network which is hypothesized to be affected as a result of BD candidate genes' expression.Keywords: cognitive functioning, fMRI analysis, bipolar disorder, fronto-limbic network
Procedia PDF Downloads 462
3605 Spatial Audio Player Using Musical Genre Classification
Authors: Jun-Yong Lee, Hyoung-Gook Kim
Abstract:
In this paper, we propose a smart music player that combines musical genre classification and spatial audio processing. The musical genre is classified based on content analysis of the musical segment detected from the audio stream. In parallel with the classification, spatial audio quality is achieved by adding artificial reverberation in a virtual acoustic space to the input mono sound. Thereafter, the spatial sound is boosted with the given frequency gains based on the musical genre when played back. Experiments measured the accuracy of detecting the musical segment from the audio stream and of its musical genre classification. A listening test was performed on the spatial audio processing based on the virtual acoustic space.
Keywords: automatic equalization, genre classification, music segment detection, spatial audio processing
Procedia PDF Downloads 429
3604 Advancements in Mathematical Modeling and Optimization for Control, Signal Processing, and Energy Systems
Authors: Zahid Ullah, Atlas Khan
Abstract:
This abstract focuses on the advancements in mathematical modeling and optimization techniques that play a crucial role in enhancing the efficiency, reliability, and performance of control, signal processing, and energy systems. In this era of rapidly evolving technology, mathematical modeling and optimization offer powerful tools to tackle the complex challenges faced by control, signal processing, and energy systems. This abstract presents the latest research and developments in mathematical methodologies, encompassing areas such as control theory, system identification, signal processing algorithms, and energy optimization. The abstract highlights the interdisciplinary nature of mathematical modeling and optimization, showcasing their applications in a wide range of domains, including power systems, communication networks, industrial automation, and renewable energy. It explores key mathematical techniques, such as linear and nonlinear programming, convex optimization, stochastic modeling, and numerical algorithms, that enable the design, analysis, and optimization of complex control and signal processing systems. Furthermore, the abstract emphasizes the importance of addressing real-world challenges in control, signal processing, and energy systems through innovative mathematical approaches. It discusses the integration of mathematical models with data-driven approaches, machine learning, and artificial intelligence to enhance system performance, adaptability, and decision-making capabilities. The abstract also underscores the significance of bridging the gap between theoretical advancements and practical applications. It recognizes the need for practical implementation of mathematical models and optimization algorithms in real-world systems, considering factors such as scalability, computational efficiency, and robustness. In summary, this abstract showcases the advancements in mathematical modeling and optimization techniques for control, signal processing, and energy systems. It highlights the interdisciplinary nature of these techniques, their applications across various domains, and their potential to address real-world challenges. The abstract emphasizes the importance of practical implementation and integration with emerging technologies to drive innovation and improve the performance of control, signal processing, and energy systems.
Keywords: mathematical modeling, optimization, control systems, signal processing, energy systems, interdisciplinary applications, system identification, numerical algorithms
Procedia PDF Downloads 112
3603 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study
Authors: Cecile Laval, Harriet Lowe
Abstract:
Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model which establishes the initial innate strategies used by second language learners to connect form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learner’s cognitive strategies. The experiment was designed using a TOBII Pro-TX300 eye-tracker to measure participants’ default strategies when processing French linguistic input and any cognitive changes after receiving Processing Instruction treatment. Participants were drawn from lower intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to instructional treatment. One group received full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, and structured input activities) on the primary target linguistic feature (French past tense imperfective aspect). The second group received Processing Instruction treatment except the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (French past tense imperfective aspect, French Subjunctive used for the expression of doubt, and the French causative construction with Faire) were administered with the eye-tracker. The eye-tracking data showed the positive change in learners’ processing of the French target features after instruction with improvement in the interpretation of the three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature (French past tense imperfective aspect) after treatment. 62.5% of participants made an improvement in the secondary target item (French Subjunctive used for the expression of doubt) and 37.5% of participants made an improvement in the cumulative target feature (French causative construction with Faire). Statistically there was no significant difference between the pre-test and post-test scores in the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogenously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity for the study of unconscious processing decisions made during moment-by-moment comprehension. The visual data from the eye-tracking demonstrates changes in participants’ processing strategies. Gaze plots from pre- and post-tests display participants fixation points changing from focusing on content words to focusing on the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences in both primary and secondary target features. This paper will present the research methodology, design and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. 
It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of second language grammar.
Keywords: eye-tracking, language teaching, processing instruction, second language acquisition
Procedia PDF Downloads 279
3602 Optimisation of Wastewater Treatment for Yeast Processing Effluent Using Response Surface Methodology
Authors: Shepherd Manhokwe, Sheron Shoko, Cuthbert Zvidzai
Abstract:
In the present study, the interactive effects of temperature and cultured bacteria on the performance of a biological treatment system for yeast processing wastewater were investigated. The main objective of this study was to investigate and optimise the operating parameters that reduce organic load and colour. Experiments were conducted based on a Central Composite Design (CCD) and analysed using Response Surface Methodology (RSM). Three dependent parameters were either directly measured or calculated as responses: total Chemical Oxygen Demand (COD) removal, colour reduction and total solids. A COD removal efficiency of 26% and a decolourisation efficiency of 44% were recorded for the wastewater treatment. The optimised conditions for COD reduction were found to be 20 g/l cultured bacteria and 25 °C. For colour reduction, the optimum conditions were a temperature of 30.35 °C and a bacterial formulation of 20 g/l. Biological treatment of baker’s yeast processing effluent is a suitable process for the removal of organic load and colour from wastewater, especially when the operating parameters are optimised.
Keywords: COD reduction, optimisation, response surface methodology, yeast processing wastewater
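As a rough illustration of how RSM fits a second-order response surface to CCD runs and locates an optimum, a minimal sketch is shown below. The observations, factor ranges and coefficients are placeholder values, not data from the study.

```python
import numpy as np

# Hypothetical CCD observations: temperature (°C), bacteria dose (g/l), COD removal (%).
T = np.array([20, 30, 20, 30, 25, 25, 25, 18, 32], dtype=float)
B = np.array([10, 10, 20, 20, 15, 8, 22, 15, 15], dtype=float)
y = np.array([18, 20, 26, 22, 24, 17, 23, 19, 21], dtype=float)

# Second-order model: y = b0 + b1*T + b2*B + b3*T^2 + b4*B^2 + b5*T*B
X = np.column_stack([np.ones_like(T), T, B, T**2, B**2, T * B])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Search for the optimum on a fine grid over the experimental region.
Tg, Bg = np.meshgrid(np.linspace(18, 32, 200), np.linspace(8, 22, 200))
pred = coef[0] + coef[1]*Tg + coef[2]*Bg + coef[3]*Tg**2 + coef[4]*Bg**2 + coef[5]*Tg*Bg
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"Predicted optimum: T = {Tg[i]:.1f} °C, dose = {Bg[i]:.1f} g/l, removal = {pred[i]:.1f} %")
```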
Procedia PDF Downloads 344
3601 EEG Signal Processing Methods to Differentiate Mental States
Authors: Sun H. Hwang, Young E. Lee, Yunhan Ga, Gilwon Yoon
Abstract:
EEG is a very complex signal with noise and other bio-potential interferences. EOG is the most distinct interfering signal when EEG signals are measured and analyzed. How raw EEG signals are processed is very important for obtaining useful information. In this study, EEG signal processing techniques such as EOG filtering and outlier removal were examined to minimize unwanted EOG signals and other noise. The two different mental states of resting and focusing were examined through EEG analysis. A focused state was induced by having subjects watch a red dot on a white screen. EEG data for 32 healthy subjects were measured. EEG data after 60-Hz notch filtering were processed by a commercially available EOG filter and by our proposed algorithm based on the removal of outliers. The ratio of beta wave to theta wave was used as a parameter for determining the degree of focusing. The results show that our algorithm was more appropriate than the existing EOG filtering.
Keywords: EEG, focus, mental state, outlier, signal processing
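A minimal sketch of the two processing steps mentioned above, 60-Hz notch filtering followed by a beta/theta power ratio as a focus index, is shown below. The sampling rate, filter Q factor and band limits are illustrative assumptions, not the study's exact settings.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt, welch

def focus_index(eeg: np.ndarray, fs: float = 256.0) -> float:
    """Beta/theta power ratio of a single EEG channel after 60-Hz notch filtering."""
    b, a = iirnotch(w0=60.0, Q=30.0, fs=fs)          # remove mains interference
    clean = filtfilt(b, a, eeg)
    f, psd = welch(clean, fs=fs, nperseg=int(2 * fs))
    theta = np.trapz(psd[(f >= 4) & (f < 8)], f[(f >= 4) & (f < 8)])
    beta = np.trapz(psd[(f >= 13) & (f < 30)], f[(f >= 13) & (f < 30)])
    return beta / theta

# Example with synthetic data; a higher value would indicate a more focused state.
rng = np.random.default_rng(0)
print(focus_index(rng.standard_normal(256 * 10)))
```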
Procedia PDF Downloads 283
3600 The Influence of Different Technologies on the Infiltration Properties and Soil Surface Crusting Processing in the North Bohemia Region
Authors: Miroslav Dumbrovsky, Lucie Larisova
Abstract:
The infiltration characteristic of the soil surface is one of the major factors that determine the potential soil degradation risk. The physical, chemical and biological characteristics of soil are changed by soil processing. The soil infiltration ability has an important role in soil and water conservation. The subject of this contribution is the evaluation of the influence of conventional tillage and reduced tillage technology on soil surface crusting and the infiltration properties of the soil in the North Bohemia region. Field experimental work in the area was carried out in the years 2013-2016 on a medium-heavy clayey Cambisol. The research was conducted on sloping, erosion-endangered blocks of compacted arable land. The areas were chosen each year so that one of the experimental areas was managed with conventional tillage technologies and the other with reduced tillage technologies. Intact soil samples were taken in Kopecký cylinders at three landscape positions, at depths of 10 cm (representing topsoil) and 30 cm (representing subsoil). The cumulative infiltration was measured using a mini-disc infiltrometer near the consumption points. The Zhang (1997) method, which provides an estimate of the unsaturated hydraulic conductivity K(h), was used to evaluate the mini-disc infiltrometer tests. The soil profile processed by conventional tillage showed a higher degree of compaction and soil crusting. The bulk density was 1.10–1.67 g·cm⁻³, compared with 0.80–1.29 g·cm⁻³ for the land processed with reduced tillage technology. Unsaturated hydraulic conductivity values were about one-third higher under reduced tillage.
Keywords: soil crusting processing, unsaturated hydraulic conductivity, cumulative infiltration, bulk density, porosity
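In the Zhang (1997) approach used with mini-disc infiltrometers, cumulative infiltration is fitted as I(t) = C1·t + C2·√t and the unsaturated hydraulic conductivity is estimated as K = C1/A, where A depends on the disc radius, applied suction and the soil's van Genuchten parameters. The sketch below is a minimal illustration; the readings and the A value are placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder mini-disc readings: time (s) and cumulative infiltration (cm).
t = np.array([30, 60, 120, 180, 240, 300, 360], dtype=float)
I = np.array([0.12, 0.21, 0.35, 0.47, 0.58, 0.68, 0.78])

def zhang(t, C1, C2):
    # I(t) = C1*t + C2*sqrt(t); C1 relates to conductivity, C2 to sorptivity.
    return C1 * t + C2 * np.sqrt(t)

(C1, C2), _ = curve_fit(zhang, t, I)
A = 2.4                    # assumed value from van Genuchten tables for this texture and suction
K = C1 / A                 # unsaturated hydraulic conductivity (cm/s) at the applied suction
print(f"C1={C1:.2e}, C2={C2:.2e}, K={K:.2e} cm/s")
```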
Procedia PDF Downloads 247
3599 On the Implementation of The Pulse Coupled Neural Network (PCNN) in the Vision of Cognitive Systems
Authors: Hala Zaghloul, Taymoor Nazmy
Abstract:
One of the great challenges of the 21st century is to build a robot that can perceive and act within its environment and communicate with people, while also exhibiting the cognitive capabilities that lead to performance like that of people. The Pulse Coupled Neural Network (PCNN) is a relatively new ANN model derived from a mammalian neural model, with great potential in the areas of image processing, target recognition, feature extraction, speech recognition, combinatorial optimization, and compressed encoding. The PCNN has unique features among neural network types, which make it a candidate for an important approach to perception in cognitive systems. This work shows and emphasizes the potential of the PCNN to perform different tasks related to image processing. The main drawback, or the obstacle that prevents the direct implementation of such a technique, is the need to find a way to control the PCNN parameters toward performing a specific task. This paper evaluates the performance of the standard PCNN model for processing images with different properties, selects the important parameters that give significant results, and discusses approaches for adapting the PCNN parameters to perform a specific task.
Keywords: cognitive system, image processing, segmentation, PCNN kernels
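For orientation, a minimal sketch of one common formulation of the standard PCNN iteration (feeding, linking, internal activity, pulse output and dynamic threshold) applied to a grayscale image is given below. The decay constants, linking strength and kernel are illustrative assumptions and are exactly the kind of parameters the paper discusses tuning.

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_segment(img: np.ndarray, iterations: int = 10) -> np.ndarray:
    """Standard PCNN applied to a normalized grayscale image; returns the final firing map."""
    S = img / img.max()
    F = np.zeros_like(S); L = np.zeros_like(S); U = np.zeros_like(S)
    Y = np.zeros_like(S); theta = np.ones_like(S)
    W = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
    aF, aL, aT = 0.1, 1.0, 0.3          # decay constants (assumed)
    VF, VL, VT = 0.5, 0.2, 20.0         # magnitude scaling (assumed)
    beta = 0.1                          # linking strength (assumed)
    for _ in range(iterations):
        F = np.exp(-aF) * F + VF * convolve(Y, W) + S   # feeding input
        L = np.exp(-aL) * L + VL * convolve(Y, W)       # linking input
        U = F * (1.0 + beta * L)                        # internal activity
        Y = (U > theta).astype(float)                   # pulse output
        theta = np.exp(-aT) * theta + VT * Y            # dynamic threshold
    return Y  # binary firing map usable as a segmentation mask

mask = pcnn_segment(np.random.rand(64, 64))
```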
Procedia PDF Downloads 280
3598 Studying the Value-Added Chain for the Fish Distribution Process at Quang Binh Fishing Port in Vietnam
Authors: Van Chung Nguyen
Abstract:
The purpose of this study is to examine the current status of the value chain for fish distribution at Quang Binh Fishing Port, with 360 research samples in which the research subjects are fishermen, traders, retailers, and businesses. The research applies the value chain theoretical framework of Kaplinsky and Morris to quantify and describe the market channels and actors participating in the value chain, and to analyze the value-added process of these actors according to market channel. The analysis results show that fishermen catch fish with high economic efficiency, but processing enterprises and, especially, retailers are the agents that obtain higher added value. Processing enterprises play a role that is not really clear due to outdated processing technology; in contrast, retailers have the highest added value. This shows that the added value of the fish supply chain at Quang Binh fishing port is still limited, leading to low output quality. Therefore, the selling price of fish on the market is still high relative to the abundant fish resources, leading to low consumption and limited exports owing to the quality of processing enterprises. This reduces demand and fishing capacity, and productivity is lower than its potential. To improve the fish value chain at fishing ports, it is necessary to focus on improving product quality, strengthening linkages between actors, building brands and product consumption markets, and, at the same time, improving the capacity of export processing enterprises.
Keywords: Quang Binh fishing port, value chain, market, distribution channels
Procedia PDF Downloads 73
3597 Recent Development on Application of Microwave Energy on Process Metallurgy
Authors: Mamdouh Omran, Timo Fabritius
Abstract:
A growing interest in microwave heating has emerged recently. Many researchers have begun to pay attention to microwave energy as an alternative technique for processing various primary and secondary raw materials. Compared to conventional methods, microwave processing offers several advantages, such as selective heating, rapid heating, and volumetric heating. The present study gives a summary of our recent work related to the use of microwave energy for the recovery of valuable metals from primary and secondary raw materials. The research mainly focuses on the application of microwaves for the recovery and recycling of metals from different metallurgical industry wastes (i.e., electric arc furnace (EAF) dust and blast furnace (BF)/basic oxygen furnace (BOF) sludge), and on the application of microwaves for upgrading and recovering valuable metals from primary raw materials (i.e., iron ore). The results indicated that microwave heating is a promising and effective technique for processing primary and secondary steelmaking wastes. After microwave treatment of iron ore for 60 s at 900 W, about a 28.30% increase in grindability was obtained. Wet high-intensity magnetic separation (WHIMS) indicated that the magnetic separation increased from 34% to 98% after microwave treatment for 90 s at 900 W. In the case of EAF dust, after microwave processing at 1100 W for 20 min, zinc removal increased from 64% to ~97%, depending on the mixture ratio and treatment time.
Keywords: dielectric properties, microwave heating, raw materials, secondary raw materials
Procedia PDF Downloads 95
3596 Study of Nano Clay Based on PET
Authors: F. Zouai, F. Z. Benabid, S. Bouhelal, D. Benachoura
Abstract:
PET/clay nanocomposites have been successfully prepared in one step by reactive melt extrusion. The PET was first mixed in the melt state with different amounts of functionalized clay. It was observed that the composition PET/4 wt% clay showed total exfoliation. This completely exfoliated composition, called nPET, was used to prepare new nPET nanocomposites in the same mixing batch. The nPET was compared to neat PET. The nanocomposites were characterized by different techniques: differential scanning calorimetry (DSC) and wide-angle X-ray scattering (WAXS). The micro- and nanostructure/property relationships were investigated. From the different WAXS patterns, it is seen that all samples are in the amorphous phase. In addition, nPET blends present lower Tc values and higher Tm values than the corresponding neat PET. The present study allowed good correlations to be established between the different measured properties.
Keywords: PET, montmorillonite, nanocomposites, exfoliation, reactive melt-mixing
Procedia PDF Downloads 403
3595 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained
Authors: Homa Ghave, Parmis Shahmaleki
Abstract:
This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches. For each batch, all of the jobs or a fraction of them can be outsourced. The jobs have stochastic processing times and lead times, have deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, chance-constrained programming is used to model the problem. First, the problem is formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives considered in the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem. In this paper, we utilize the concept of satisfaction functions to incorporate the manager’s preferences. The proposed approach is tested on instances where the random variables are normally distributed.
Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function
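To illustrate how a normally distributed chance constraint is converted into a deterministic one, the sketch below turns P(completion time ≤ due date) ≥ α into its equivalent form μ + z_α·σ ≤ d. The numbers are hypothetical, not instance data from the paper.

```python
from scipy.stats import norm

def chance_constraint_ok(mu: float, sigma: float, due: float, alpha: float = 0.95) -> bool:
    """Deterministic equivalent of P(C <= due) >= alpha when C ~ Normal(mu, sigma^2):
    the constraint holds iff mu + z_alpha * sigma <= due."""
    z = norm.ppf(alpha)
    return mu + z * sigma <= due

# Hypothetical job: mean completion time 10 h, standard deviation 1.5 h, due date 13 h.
print(chance_constraint_ok(mu=10.0, sigma=1.5, due=13.0, alpha=0.95))
```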
Procedia PDF Downloads 264
3594 External Retinal Prosthesis Image Processing System Used One-Cue Saliency Map Based on DSP
Authors: Yili Chen, Jixiang Fu, Zhihua Liu, Zhicheng Zhang, Rongmao Li, Nan Fu, Yaoqin Xie
Abstract:
A retinal prosthesis is designed to help the blind regain some sight. It is made up of an internal part and an external part. The external part consists of a camera, image processing, and an RF transmitter; the internal part consists of an RF receiver, an implant chip, and micro-electrodes. The image obtained from the camera should be processed with suitable strategies so that it corresponds to the stimulus delivered by the electrodes. Nowadays, the number of micro-electrodes is in the hundreds, and the mechanism by which the electrodes stimulate the optic nerve is not known; a simple working hypothesis is that each pixel in the image corresponds to one electrode. It is therefore a question of how to extract the important information from the captured image. Many strategies have been experimented with to extract the most important information as quickly as possible, since the system must run in real time. ROI is a useful algorithm to extract the region of interest. Our paper explains the details of the principles and functions of the ROI. Based on this, we simplified the ROI algorithm and used it in the external DSP image processing system of the retinal prosthesis. Results show that our image processing strategies are suitable for a real-time retinal prosthesis and can cut redundant information and help express the useful information in the low-size image.
Keywords: image processing, region of interest, saliency map, low-size image, useful information express, cut redundant information in image
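A minimal sketch of the pixel-to-electrode idea described above: pick the most salient region of the camera frame and downsample it to the electrode grid resolution. The saliency cue (local contrast), grid size and OpenCV usage are illustrative assumptions, not the authors' DSP implementation.

```python
import cv2
import numpy as np

def frame_to_electrode_map(frame_bgr: np.ndarray, grid=(10, 10), roi_size=128) -> np.ndarray:
    """Crop the most salient (highest local contrast) region and reduce it to the electrode grid."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # One-cue saliency: local contrast approximated by the absolute Laplacian, smoothed.
    saliency = cv2.GaussianBlur(np.abs(cv2.Laplacian(gray, cv2.CV_32F)), (31, 31), 0)
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    half = roi_size // 2
    y0, x0 = max(0, y - half), max(0, x - half)
    roi = gray[y0:y0 + roi_size, x0:x0 + roi_size]
    # Each cell of the resulting low-size image would drive one stimulating electrode.
    return cv2.resize(roi, grid, interpolation=cv2.INTER_AREA)

electrodes = frame_to_electrode_map(np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8))
```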
Procedia PDF Downloads 282
3593 Robust Image Design Based Steganographic System
Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi
Abstract:
This paper presents a steganography scheme to hide the transmitted information without exciting suspicion, and also illustrates how the level of secrecy can be increased by using cryptography techniques. The proposed system first encrypts the image file with a one-time pad key and then encrypts the message to be hidden, so that encryption is followed by image embedding. A new image file is then created from the original image using the four triangles operation, and the new image is processed by one of two image processing techniques. The two proposed processing techniques are thresholding and differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used one time only. The encrypted text is hidden in the places that are not used for image processing and key generation. The system has a high embedding rate (0.1875 character/pixel) for true color images (24-bit depth).
Keywords: encryption, thresholding, differential predictive coding, four triangles operation
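For context on the one-time pad step mentioned above, a minimal sketch of XOR-based one-time pad encryption of message bytes is shown below; it is a generic illustration, not the authors' key generator or embedding scheme.

```python
import os

def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length, used only once."""
    key = os.urandom(len(message))
    cipher = bytes(m ^ k for m, k in zip(message, key))
    return cipher, key

def otp_decrypt(cipher: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(cipher, key))

cipher, key = otp_encrypt(b"secret payload to embed")
assert otp_decrypt(cipher, key) == b"secret payload to embed"
```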
Procedia PDF Downloads 493
3592 Optimized Road Lane Detection Through a Combined Canny Edge Detection, Hough Transform, and Scaleable Region Masking Toward Autonomous Driving
Authors: Samane Sharifi Monfared, Lavdie Rada
Abstract:
Nowadays, autonomous vehicles are developing rapidly to facilitate human driving. One of the main issues is road lane detection, both for suitable guidance and for car accident prevention. This paper aims to improve and optimize road lane detection based on a combination of camera calibration, the Hough transform, and Canny edge detection. The video processing is implemented using the OpenCV library, with the novelty of a scalable region masking. The aim of the study is to introduce automatic road lane detection techniques with minimal manual intervention by the user.
Keywords: hough transform, canny edge detection, optimisation, scaleable masking, camera calibration, improving the quality of image, image processing, video processing
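A minimal OpenCV sketch of the pipeline described here follows: Canny edges, a region mask defined as fractions of the frame so it scales with resolution, and a probabilistic Hough transform. The thresholds and trapezoid proportions are illustrative assumptions, not the authors' tuned values.

```python
import cv2
import numpy as np

def detect_lanes(frame: np.ndarray) -> np.ndarray:
    h, w = frame.shape[:2]
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    # Scalable region mask: a trapezoid expressed as fractions of the frame size.
    roi = np.array([[(int(0.1 * w), h), (int(0.45 * w), int(0.6 * h)),
                     (int(0.55 * w), int(0.6 * h)), (int(0.9 * w), h)]], dtype=np.int32)
    mask = np.zeros_like(edges)
    cv2.fillPoly(mask, roi, 255)
    masked = cv2.bitwise_and(edges, mask)
    lines = cv2.HoughLinesP(masked, 1, np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=100)
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 3)
    return frame
```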
Procedia PDF Downloads 94
3591 Translation Directionality: An Eye Tracking Study
Authors: Elahe Kamari
Abstract:
Research on translation process has been conducted for more than 20 years, investigating various issues and using different research methodologies. Most recently, researchers have started to use eye tracking to study translation processes. They believed that the observable, measurable data that can be gained from eye tracking are indicators of unobservable cognitive processes happening in the translators’ mind during translation tasks. The aim of this study was to investigate directionality in translation processes through using eye tracking. The following hypotheses were tested: 1) processing the target text requires more cognitive effort than processing the source text, in both directions of translation; 2) L2 translation tasks on the whole require more cognitive effort than L1 tasks; 3) cognitive resources allocated to the processing of the source text is higher in L1 translation than in L2 translation; 4) cognitive resources allocated to the processing of the target text is higher in L2 translation than in L1 translation; and 5) in both directions non-professional translators invest more cognitive effort in translation tasks than do professional translators. The performance of a group of 30 male professional translators was compared with that of a group of 30 male non-professional translators. All the participants translated two comparable texts one into their L1 (Persian) and the other into their L2 (English). The eye tracker measured gaze time, average fixation duration, total task length and pupil dilation. These variables are assumed to measure the cognitive effort allocated to the translation task. The data derived from eye tracking only confirmed the first hypothesis. This hypothesis was confirmed by all the relevant indicators: gaze time, average fixation duration and pupil dilation. The second hypothesis that L2 translation tasks requires allocation of more cognitive resources than L1 translation tasks has not been confirmed by all four indicators. The third hypothesis that source text processing requires more cognitive resources in L1 translation than in L2 translation and the fourth hypothesis that target text processing requires more cognitive effort in L2 translation than L1 translation were not confirmed. It seems that source text processing in L2 translation can be just as demanding as in L1 translation. The final hypothesis that non-professional translators allocate more cognitive resources for the same translation tasks than do the professionals was partially confirmed. One of the indicators, average fixation duration, indicated higher cognitive effort-related values for professionals.Keywords: translation processes, eye tracking, cognitive resources, directionality
Procedia PDF Downloads 463
3590 A Trends Analysis of Yacht Simulator
Authors: Jae-Neung Lee, Keun-Chang Kwak
Abstract:
This paper describes an analysis of international trends in yacht simulators and also gives an overview of yachts. Examples of processing used in yacht simulators include image processing for counting the total number of vehicles, edge/target detection, detection and evasion algorithms, image processing using SIFT (scale-invariant feature transform) matching, and the application of median filtering and thresholding.
Keywords: yacht simulator, simulator, trends analysis, SIFT
Procedia PDF Downloads 432
3589 Video Processing of a Football Game: Detecting Features of a Football Match for Automated Calculation of Statistics
Authors: Rishabh Beri, Sahil Shah
Abstract:
We have applied a range of filters and processing steps in order to extract the various features of the football game, such as the field lines of the football pitch. Another important aspect was detecting the players on the field and tagging them according to their teams, distinguished by their jersey colours. The combined information extracted about the players and the field helped us create a virtual field that consists of the playing field and the players mapped to their locations in it.
Keywords: Detect, Football, Players, Virtual
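A minimal sketch of the team-tagging step by jersey colour, using HSV thresholding and contour counting, is shown below. The colour ranges and area threshold are placeholder values, not the authors' calibration.

```python
import cv2
import numpy as np

def count_team_players(frame_bgr: np.ndarray, lower_hsv, upper_hsv, min_area=200) -> int:
    """Count blobs whose colour falls inside one team's jersey range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return sum(1 for c in contours if cv2.contourArea(c) > min_area)

# Placeholder HSV range for a red-shirted team on a blank test frame.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(count_team_players(frame, lower_hsv=(0, 120, 70), upper_hsv=(10, 255, 255)))
```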
Procedia PDF Downloads 331
3588 An Image Processing Based Approach for Assessing Wheelchair Cushions
Authors: B. Farahani, R. Fadil, A. Aboonabi, B. Hoffmann, J. Loscheider, K. Tavakolian, S. Arzanpour
Abstract:
Wheelchair users spend long hours in a sitting position, and selecting the right cushion is highly critical in preventing pressure ulcers in that demographic. Pressure mapping systems (PMS) are typically used in clinical settings by therapists to identify the sitting profile and pressure points in the sitting area to select the cushion that fits the best for the users. A PMS is a flexible mat composed of arrays of distributed networks of flexible sensors. The output of the PMS systems is a color-coded image that shows the intensity of the pressure concentration. Therapists use the PMS images to compare different cushions fit for each user. This process is highly subjective and requires good visual memory for the best outcome. This paper aims to develop an image processing technique to analyze the images of PMS and provide an objective measure to assess the cushions based on their pressure distribution mappings. In this paper, we first reviewed the skeletal anatomy of the human sitting area and its relation to the PMS image. This knowledge is then used to identify the important features that must be considered in image processing. We then developed an algorithm based on those features to analyze the images and rank them according to their fit to the users' needs.Keywords: dynamic cushion, image processing, pressure mapping system, wheelchair
Procedia PDF Downloads 170
3587 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality
Authors: Ouafa Sakka, Henri Barki, Louise Cote
Abstract:
Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. On the other hand, as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team, and the richness and frequency of formal communications among the team members to discuss the project’s follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality levels were high, its weakest effect occurred when both of these were low.
Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development
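For readers unfamiliar with the Fisher test for correlation differences mentioned above, a minimal sketch is shown below: both correlations are Fisher z-transformed and compared with a normal test statistic. The sample values are hypothetical, not figures from the study.

```python
import numpy as np
from scipy.stats import norm

def fisher_r_diff(r1: float, n1: int, r2: float, n2: int):
    """Two-sided test for the difference between two independent correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)              # Fisher r-to-z transform
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2 * (1 - norm.cdf(abs(z)))
    return z, p

# Hypothetical: r = 0.55 in high-uncertainty projects (n = 45)
# versus r = 0.20 in low-uncertainty projects (n = 48).
print(fisher_r_diff(0.55, 45, 0.20, 48))
```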
Procedia PDF Downloads 294
3586 Intelligent Grading System of Apple Using Neural Network Arbitration
Authors: Ebenezer Obaloluwa Olaniyi
Abstract:
In this paper, an intelligent system has been designed to grade apples as either defective or healthy for production in food processing. The work is divided into two phases. In the first phase, image processing techniques were employed to extract the necessary features of the apple. These techniques include grayscale conversion and segmentation, where a threshold value is chosen to separate the foreground of the images from the background. Edge detection was also employed to bring out the features in the images. The extracted features were then fed into the neural network in the second phase. The second phase is a classification phase, where a neural network is employed to distinguish defective apples from healthy ones. In this phase, the network was trained with backpropagation and tested in feedforward mode. The recognition rate obtained shows that our system is more accurate and faster compared with previous work.
Keywords: image processing, neural network, apple, intelligent system
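A minimal sketch of the two-phase pipeline described above (Otsu thresholding and Canny edges for feature extraction, then a backpropagation-trained classifier) is given below. The feature choices, network size and synthetic training data are illustrative assumptions rather than the authors' exact design.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def apple_features(img_bgr: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    _, fg = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # segmentation
    edges = cv2.Canny(gray, 100, 200)                                          # edge detection
    dark_ratio = float(np.mean(gray[fg > 0] < 80)) if np.any(fg > 0) else 0.0  # possible bruise area
    edge_density = edges.mean() / 255.0
    return np.array([dark_ratio, edge_density, gray.mean() / 255.0])

# Synthetic stand-ins for labelled training images (0 = healthy, 1 = defective).
rng = np.random.default_rng(0)
imgs = [rng.integers(0, 255, (64, 64, 3), dtype=np.uint8) for _ in range(20)]
labels = rng.integers(0, 2, 20)
X = np.array([apple_features(im) for im in imgs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, labels)  # backpropagation training
print(clf.predict([apple_features(imgs[0])]))                                # feedforward classification
```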
Procedia PDF Downloads 398
3585 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet the demand of society. In this research work, we have developed a model that can be used to replace the human operator, whose output is low and whose decisions are slow and subject to individual differences when separating defective from healthy bananas. This model can perform the visual task of the human operator in deciding whether a banana is defective or healthy for food production. The work is divided into two phases. The first phase is image processing, where several techniques such as colour conversion, edge detection, thresholding and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second phase, the classification phase, where a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained within a limited processing time, which makes the system suitable for use in the food industry.
Keywords: banana, food processing, identification system, neural network
Procedia PDF Downloads 470
3584 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit
Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira
Abstract:
Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, even when good homogenization work has been carried out before feeding the processing plants, an operation with highly variable quality and performance is expected. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined, the types of raw material grouped by those parameters, and a reference set of operational settings presented for each group. Associating the physical and chemical parameters of a unit operation with a benchmark, or even an optimal reference of metallurgical recovery and product quality, results in reduced production costs, better use of the mineral resource, and greater stability in the subsequent processes of the production chain that uses the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with the use of machine learning algorithms for grouping the raw material (ore) and associating the groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Clustering methods based on Decision Trees and K-Means were employed, combined with algorithms based on benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to obtain the outputs of the created algorithm. The results were measured through the average time of adjustment and stabilization of the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains from the metallurgical recovery of the process were also measured. The results were promising, with a reduction in the adjustment and stabilization time when starting the processing of a new ore pile, as well as in the time to reach the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and life optimization of the mineral deposit.
Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing
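A minimal sketch of the clustering-plus-benchmark idea follows: standardize blend composition features, cluster the ore piles with K-Means, and take the best-recovery pile in each cluster as the reference operating point. The feature names and data are placeholders, not the plant's actual variables.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Placeholder blend composition data for ore piles.
rng = np.random.default_rng(42)
piles = pd.DataFrame({
    "Nb2O5_grade": rng.normal(2.5, 0.4, 60),
    "P2O5": rng.normal(4.0, 0.8, 60),
    "Fe2O3": rng.normal(30.0, 5.0, 60),
    "recovery": rng.normal(55.0, 6.0, 60),   # metallurgical recovery achieved (%)
})

features = ["Nb2O5_grade", "P2O5", "Fe2O3"]
X = StandardScaler().fit_transform(piles[features])
piles["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Benchmark: the pile with the best recovery in each cluster becomes the reference settings.
benchmark = piles.loc[piles.groupby("cluster")["recovery"].idxmax()]
print(benchmark)
```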
Procedia PDF Downloads 143