Search results for: sol-gel processing
2971 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
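The sliding-window feature extraction described in this abstract can be sketched in a few lines of Python. This is a minimal illustration, not the Training Builder tool itself: the window length, overlap, and the particular features (mean, variance, line length) are assumptions chosen for the example.

```python
import math

def sliding_window_features(signal, window, step):
    """Slice a 1-D signal into overlapping windows and compute simple
    per-window features: mean, variance, and line length."""
    features = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        # line length: total point-to-point variation, a feature
        # widely used in EEG seizure detection
        line_length = sum(abs(w[i + 1] - w[i]) for i in range(window - 1))
        features.append((mean, var, line_length))
    return features

# toy signal: a 2 Hz sine sampled at 100 Hz for one second
sig = [math.sin(2 * math.pi * 2 * t / 100) for t in range(100)]
feats = sliding_window_features(sig, window=50, step=25)
print(len(feats))  # 3 windows, starting at samples 0, 25, 50
```

In a full pipeline, each feature vector would be labeled (seizure / non-seizure) and fed to a multilayer perceptron classifier.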
Procedia PDF Downloads 188
2970 Application to Monitor the Citizens for Corona and Get Medical Aids or Assistance from Hospitals
Authors: Vathsala Kaluarachchi, Oshani Wimalarathna, Charith Vandebona, Gayani Chandrarathna, Lakmal Rupasinghe, Windhya Rankothge
Abstract:
It is the fundamental function of a monitoring system to allow users to collect and process data. A worldwide threat, the corona outbreak has wreaked havoc in Sri Lanka, and the situation has gotten out of hand. Since the epidemic began, the Sri Lankan government has been unable to establish a systematic system for monitoring corona patients and providing emergency care in the event of an outbreak. Most patients have been kept at home because of the high number of patients reported in the nation, but they do not yet have access to a functioning medical system. This has resulted in an increase in the number of patients left untreated because of a lack of medical care. The absence of competent medical monitoring is the biggest cause of mortality for many people nowadays, according to our survey. As a result, a smartphone app for analyzing the patient's state and determining whether they should be hospitalized will be developed. Using the data supplied, we aim to send an alarm letter or SMS to the hospital once the system recognizes them. Since we know what those patients need and when they need it, we will set up a desktop program at the hospital to monitor their progress. Deep learning, image processing and application development, natural language processing, and blockchain management are some of the components of the research solution. The purpose of this research paper is to introduce a mechanism to connect hospitals and patients even when they are physically apart. Furthermore, data security and user-friendliness are enhanced through blockchain and NLP.
Keywords: blockchain, deep learning, NLP, monitoring system
Procedia PDF Downloads 133
2969 Learner's Difficulties Acquiring English: The Case of Native Speakers of Rio de La Plata Spanish Towards Justifying the Need for Corpora
Authors: Maria Zinnia Bardas Hoffmann
Abstract:
Contrastive Analysis (CA) is the systematic comparison between two languages. It stems from the notion that errors are caused by interference of the L1 system in the acquisition process of an L2. CA represents a useful tool to understand the nature of learning and acquisition. This particular method also promises a path to understand the nature of underlying cognitive processes, even when other factors such as intrinsic motivation and teaching strategies were found to best explain students' problems in acquisition. The CA study is justified not only by the need to gain a deeper understanding of the nature of SLA, but also as an invaluable source of clues, at a cognitive level, about the general processes involved in rule formation and abstract thought. It is relevant for cross-disciplinary studies and the fields of Computational Thought, Natural Language Processing, Applied Linguistics, Cognitive Linguistics, and Math Theory. That being said, this paper also addresses its own set of constraints and limitations. Finally, this paper: (a) aims at identifying some of the difficulties students may find in their learning process due to the nature of their specific variety of L1, Rio de la Plata Spanish (RPS), and (b) represents an attempt to discuss the necessity for specific models to approach CA.
Keywords: second language acquisition, applied linguistics, contrastive analysis, applied contrastive analysis, English language department, meta-linguistic rules, cross-linguistics studies, computational thought, natural language processing
Procedia PDF Downloads 150
2968 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms
Authors: Hai L. Tran
Abstract:
When covering events and issues, the news media often employ both personal accounts and facts and figures. However, the process of using numbers and narratives in the newsroom mostly operates through trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system requires the processing of logical evidence through effortful analytical cognitions, which are affect-free. Meanwhile, the experiential system is intuitive, rapid, automatic, and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations, one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts impact audience response differently, and a combination of data and narratives is more effective than either form of evidence alone.
In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates pertinent modes of processing, and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects and the conditions under which the use of anecdotal evidence and/or statistical evidence enhances or undermines information processing. In addition to theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.
Keywords: data, narrative, number, anecdote, storytelling, news
Procedia PDF Downloads 79
2967 A Palmprint Identification System Based Multi-Layer Perceptron
Authors: David P. Tantua, Abdulkader Helwan
Abstract:
Biometrics has recently been used in human identification systems, drawing on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, these types of systems have so far been based on some image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using the backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and clearing of unwanted components of the image. The second phase is to feed those processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images. Therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system has a great accuracy of 100%, and it can be implemented in real-life applications.
Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator
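The first step of the preprocessing pipeline above, median filtering, can be sketched as follows. The 3x3 neighbourhood and the toy 5x5 image are illustrative assumptions; a real system would operate on full palmprint images (e.g. from the CASIA database) and typically use a library implementation.

```python
def median_filter(img):
    """Apply a 3x3 median filter to a 2-D grayscale image (list of lists),
    leaving the one-pixel border unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbourhood = [img[yy][xx]
                             for yy in (y - 1, y, y + 1)
                             for xx in (x - 1, x, x + 1)]
            neighbourhood.sort()
            out[y][x] = neighbourhood[4]  # median of 9 values
    return out

# a flat image with one salt-noise pixel: the filter removes the spike
noisy = [[10] * 5 for _ in range(5)]
noisy[2][2] = 255
clean = median_filter(noisy)
print(clean[2][2])  # 10: the isolated bright pixel is suppressed
```

Median filtering is chosen over a mean filter here because it removes impulse noise without blurring the palm lines that the later Canny edge-detection step must preserve.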
Procedia PDF Downloads 371
2966 Thiosulfate Leaching of the Auriferous Ore from Castromil Deposit: A Case Study
Authors: Rui Sousa, Aurora Futuro, António Fiúza
Abstract:
The exploitation of gold ore deposits is highly dependent on efficient mineral processing methods, although current perspectives based on life-cycle assessment introduce difficulties that were unforeseen in the very recent past. Cyanidation is the most widely applied gold processing method, but the potential environmental problems derived from the use of cyanide as a leaching reagent have led to a demand for alternative methods. Ammoniacal thiosulfate leaching is one of the most important alternatives to cyanidation. In this article, some experimental studies were carried out in order to assess the feasibility of thiosulfate as a leaching agent for the ore from the unexploited Portuguese gold mine of Castromil. It became clear that the process depends on the concentrations of ammonia, thiosulfate, and copper. Based on this fact, a few leaching tests were performed in order to determine the best reagent prescription, and also the effects of different combinations of these concentrations. Higher thiosulfate concentrations cause a decrease in gold dissolution. Lower concentrations of ammonia require higher thiosulfate concentrations, and higher ammonia concentrations require lower thiosulfate concentrations. The addition of copper increases the gold dissolution ratio. Subsequently, some alternative operating conditions were tested, such as variations in temperature and in the solid/liquid ratio, as well as the application of a pre-treatment before the leaching stage. Finally, thiosulfate leaching was compared to cyanidation. Thiosulfate leaching proved to be an important alternative, although a pre-treatment is required to increase the yield of gold dissolution.
Keywords: gold, leaching, pre-treatment, thiosulfate
Procedia PDF Downloads 310
2965 Enhancing Project Management Performance in Prefabricated Building Construction under Uncertainty: A Comprehensive Approach
Authors: Niyongabo Elyse
Abstract:
Prefabricated building construction is a pioneering approach that combines design, production, and assembly to attain energy efficiency, environmental sustainability, and economic feasibility. Despite continuous development of the industry in China, the low technical maturity of standardized design, factory production, and construction assembly introduces uncertainties affecting prefabricated component production and on-site assembly processes. This research focuses on enhancing project management performance under uncertainty to help enterprises navigate these challenges and optimize project resources. The study introduces a perspective on how uncertain factors influence the implementation of prefabricated building construction projects. It proposes a theoretical model considering project process management ability, adaptability to uncertain environments, and the collaboration ability of project participants. The impact of uncertain factors is demonstrated through case studies and quantitative analysis, revealing constraints on implementation time, cost, quality, and safety. To address uncertainties in prefabricated component production scheduling, a fuzzy model is presented, expressing processing times as interval values. The model utilizes a cooperative co-evolution algorithm (CCEA) to optimize scheduling, demonstrated through a real case study showcasing reduced project duration and minimized effects of processing time disturbances. Additionally, the research addresses on-site assembly construction scheduling, considering the relationship between task processing times and assigned resources. A multi-objective model with fuzzy activity durations is proposed, employing a hybrid cooperative co-evolution algorithm (HCCEA) to optimize project scheduling. Results from real case studies indicate improved project performance in terms of duration, cost, and resilience to processing time delays and resource changes.
The study also introduces a multistage dynamic process control model, utilizing IoT technology for real-time monitoring during component production and construction assembly. This approach dynamically adjusts schedules when constraints arise, leading to enhanced project management performance, as demonstrated in a real prefabricated housing project. Key contributions include a fuzzy prefabricated-component production scheduling model, a multi-objective multi-mode resource-constrained construction project scheduling model with fuzzy activity durations, a multi-stage dynamic process control model, and a cooperative co-evolution algorithm. The integrated mathematical model addresses the complexity of prefabricated building construction project management, providing a theoretical foundation for practical decision-making in the field.
Keywords: prefabricated construction, project management performance, uncertainty, fuzzy scheduling
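The interval representation of uncertain processing times used by the fuzzy scheduling model can be illustrated with simple interval arithmetic. This is only a sketch of the data representation, not the CCEA optimizer itself; the stage durations are hypothetical values invented for the example.

```python
def interval_sum(intervals):
    """Total duration bounds for stages executed in series, with each
    uncertain processing time given as a (lo, hi) interval."""
    return (sum(lo for lo, _ in intervals), sum(hi for _, hi in intervals))

def interval_max(a, b):
    """Duration bounds for two stages executed in parallel: the process
    finishes when the slower one does."""
    return (max(a[0], b[0]), max(a[1], b[1]))

# hypothetical component-production stages with uncertain durations (hours)
stages = [(2, 3), (4, 6), (1, 2)]
print(interval_sum(stages))  # (7, 11): the makespan lies in this interval
```

A scheduler operating on such intervals can then rank candidate schedules by, for example, the worst-case (upper-bound) makespan.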
Procedia PDF Downloads 50
2964 Geospatial Land Suitability Modeling for Biofuel Crop Using AHP
Authors: Naruemon Phongaksorn
Abstract:
Biofuel consumption has increased significantly over the past decade, resulting in increasing demand on agricultural land for biofuel feedstocks. However, biofuel feedstocks already suffer from low productivity owing to inappropriate agricultural practices that do not consider the suitability of crop land. This research evaluates land suitability using the GIS-integrated Analytic Hierarchy Process (AHP) for a biofuel crop, cassava, in Chachoengsao province, Thailand. The AHP method has been widely accepted for land-use planning. The objective of this study is to compare the AHP method with the most limiting group of land characteristics method (the classical approach). The reliability of the land evaluation was tested against crop performance assessed by a field investigation in 2015. In addition to the socio-economic land suitability, the expected availability of raw materials for biofuel production to meet the local biofuel demand is also estimated. The results showed that the AHP could classify and map the physical land suitability with 10% higher overall accuracy than the classical approach. Chachoengsao province showed high and moderate socio-economic land suitability for cassava. Conditions in Chachoengsao province were also favorable for cassava plantation, as the expected raw material supply matched the ethanol plant capacity of the province. GIS-integrated AHP for biofuel crop land suitability evaluation appears to be a practical way of sustainably meeting biofuel production demand.
Keywords: Analytic Hierarchy Process (AHP), cassava, geographic information systems, land suitability
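The core AHP calculation, deriving criterion weights from a pairwise-comparison matrix and checking consistency, can be sketched as follows. The 3x3 matrix and the criteria it compares are hypothetical, not the study's actual land characteristics; the column-normalization method shown is a common approximation of the principal-eigenvector weights.

```python
def ahp_priorities(matrix):
    """Derive priority weights from a pairwise-comparison matrix via
    column normalization, and compute the consistency ratio CR = CI / RI."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # normalize each column, then average across each row to get weights
    weights = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
               for i in range(n)]
    # lambda_max: each row's weighted sum divided by its weight, averaged
    lam = sum(sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index (small n)
    return weights, ci / ri

# hypothetical comparison of three land criteria (soil > rainfall > slope)
m = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
w, cr = ahp_priorities(m)
print([round(x, 2) for x in w], round(cr, 3))  # weights sum to 1; CR < 0.1
```

A CR below 0.1 is the usual threshold for accepting the expert judgments as consistent; otherwise the pairwise comparisons are revisited.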
Procedia PDF Downloads 201
2963 Flashsonar or Echolocation Education: Expanding the Function of Hearing and Changing the Meaning of Blindness
Authors: Thomas Tajo, Daniel Kish
Abstract:
Sight is primarily associated with the function of gathering and processing near and extended spatial information which is largely used to support self-determined interaction with the environment through self-directed movement and navigation. By contrast, hearing is primarily associated with the function of gathering and processing sequential information which may typically be used to support self-determined communication through the self-directed use of music and language. Blindness or the lack of vision is traditionally characterized by a lack of capacity to access spatial information which, in turn, is presumed to result in a lack of capacity for self-determined interaction with the environment due to limitations in self-directed movement and navigation. However, through a specific protocol of FlashSonar education developed by World Access for the Blind, the function of hearing can be expanded in blind people to carry out some of the functions normally associated with sight, that is to access and process near and extended spatial information to construct three-dimensional acoustic images of the environment. This perceptual education protocol results in a significant restoration in blind people of self-determined environmental interaction, movement, and navigational capacities normally attributed to vision - a new way to see. Thus, by expanding the function of hearing to process spatial information to restore self-determined movement, we are not only changing the meaning of blindness, and what it means to be blind, but we are also recasting the meaning of vision and what it is to see.
Keywords: echolocation, changing, sensory, function
Procedia PDF Downloads 154
2962 Exploring the Impact of Eye Movement Desensitization and Reprocessing (EMDR) and Mindfulness for Processing Trauma and Facilitating Healing During Ayahuasca Ceremonies
Authors: J. Hash, J. Converse, L. Gibson
Abstract:
Plant medicines are of growing interest for addressing mental health concerns. Ayahuasca, a traditional plant-based medicine, has established itself as a powerful way of processing trauma and precipitating healing and mood stabilization. Eye Movement Desensitization and Reprocessing (EMDR) is another treatment modality that aids in the rapid processing and resolution of trauma. We investigated group EMDR therapy, G-TEP, as a preparatory practice before Ayahuasca ceremonies to determine whether the combination of these modalities supports participants in their journeys of letting go of past experiences negatively impacting mental health, thereby accentuating the healing of the plant medicine. We surveyed 96 participants (51 experimental G-TEP, 45 control grounding prior to their ceremony; age M=38.6, SD=9.1; F=57, M=34; white=39, Hispanic/Latinx=23, multiracial=11, Asian/Pacific Islander=10, other=7) in a pre-post, mixed-methods design. Participants were surveyed for demographic characteristics, symptoms of PTSD and cPTSD (International Trauma Questionnaire, ITQ), depression (Beck Depression Inventory, BDI), and stress (Perceived Stress Scale, PSS) before the ceremony and at the end of the ceremony weekend. Open-ended questions also inquired about their expectations of the ceremony and results at the end. No baseline differences existed between the control and experimental participants. Overall, participants reported a decrease in meeting the threshold for PTSD symptoms (p<0.01); surprisingly, the control group reported meeting significantly fewer thresholds for symptoms of affective dysregulation, χ²(1)=6.776, p<.01, negative self-concept, χ²(1)=7.122, p<.01, and disturbance in relationships, χ²(1)=9.804, p<.01, on subscales of the ITQ as compared to the experimental group. All participants also experienced a significant decrease in scores on the BDI, t(94)=8.995, p<.001, and PSS, t(91)=6.892, p<.001.
Similar to patterns of PTSD symptoms, the control group reported significantly lower scores on the BDI, t(65.115)=-2.587, p<.01, and a trend toward lower PSS, t(90)=-1.775, p=.079 (this was significant with a one-sided test at p<.05), compared to the experimental group following the ceremony. Qualitative interviews among participants revealed a potential explanation for these relatively higher levels of depression and stress in the experimental group following the ceremony. Many participants reported needing more time to process their experience to gain an understanding of the effects of the Ayahuasca medicine. Others reported a sense of hopefulness and understanding of the sources of their trauma and the necessary steps to heal moving forward. This suggests increased introspection and openness to processing trauma, therefore making them more receptive to their emotions. The integration process of an Ayahuasca ceremony is a week- to months-long process that was not accessible in this stage of research, yet it is an integral process to understanding the full effects of the Ayahuasca medicine following the closure of a ceremony. Our future research aims to assess participants weeks into their integration process to determine the effectiveness of EMDR, and if the higher levels of depression and stress indicate the initial reaction to greater awareness of trauma and receptivity to healing.
Keywords: ayahuasca, EMDR, PTSD, mental health
Procedia PDF Downloads 65
2961 Texturing of Tool Insert Using Femtosecond Laser
Authors: Ashfaq Khan, Aftab Khan, Mushtaq Khan, Sarem Sattar, Mohammad A Sheikh, Lin Li
Abstract:
Chip removal processes are among the key processes of the manufacturing industry, where chip removal is conducted by tool inserts of exceptionally hard materials. Tungsten carbide has been extensively used as a tool insert for machining processes involving chip removal. These hard materials are generally fabricated by a single-step sintering process, as further modification after fabrication cannot easily be done in these materials. Advances in tool surface modification have revealed that advantages such as improved tribological properties and extended tool life can be harnessed from the same tool by texturing the tool rake surface. Moreover, it has been observed that the shape and location of the texture also influence the behavior. Although texturing offers plentiful advantages, the challenge lies in the generation of textures on the tool surface. An extremely hard material such as diamond is required to process tungsten carbide. The laser is a unique processing tool that has no physical contact with the material and thus does not wear. In this research, the potential of utilizing a laser for texturing tungsten carbide to develop custom features is studied. A parametric study of texturing of tungsten carbide with a femtosecond laser is conducted to investigate the process parameters and establish the feasible processing window. The effects of fluence, scan speed, and number of repetitions are examined in detail. Moreover, the mechanism for the generation of features is also reviewed.
Keywords: laser, texturing, femtosecond, tungsten carbide
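A parametric study like the one described starts from the pulse fluence. A minimal sketch of the standard peak-fluence formula for a Gaussian beam is given below; the pulse energy and spot size are illustrative values, not the parameters used in this work.

```python
import math

def peak_fluence(pulse_energy_j, spot_diameter_m):
    """Peak fluence (J/cm^2) of a Gaussian beam: F = 2E / (pi * w0^2),
    where w0 is the 1/e^2 beam radius at the focus."""
    w0_cm = (spot_diameter_m / 2.0) * 100.0  # beam radius in cm
    return 2.0 * pulse_energy_j / (math.pi * w0_cm ** 2)

# illustrative values: a 10 uJ pulse focused to a 30 um spot
print(round(peak_fluence(10e-6, 30e-6), 2))  # 2.83 J/cm^2
```

Sweeping pulse energy (and hence fluence) against scan speed and repetition count is what maps out the feasible processing window between the ablation threshold and excessive melting.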
Procedia PDF Downloads 658
2960 Production of Buttermilk as a Bio-Active Functional Food by Utilizing Dairy Waste
Authors: Hafsa Tahir, Sanaullah Iqbal
Abstract:
Galactooligosaccharide (GOS) is a type of prebiotic which is mainly found in human milk. GOS stimulates the growth of beneficial bacteria in the human intestine. The aim of the present study was to develop a value-added product by producing the prebiotic GOS in buttermilk through transgalactosylation. Buttermilk is considered an industrial waste that is discarded after the production of butter and cream. It contains protein, minerals, vitamins, and a small amount of fat. Raw milk was pasteurized at 100 °C for butter production, and the transgalactosylation process was then induced in the resulting buttermilk to produce prebiotic GOS. Results showed that enzyme (obtained from a bacterial strain of Escherichia coli carrying a gene from Lactobacillus reuteri L103) concentrations between 400-600 µl/5 ml can produce GOS in 30 minutes. Chemical analysis and sensory evaluation of plain and GOS-containing buttermilk showed no remarkable difference in their composition. Furthermore, the shelf-life study showed no significant (P>0.05) difference between glass and pouch packaging of buttermilk. Buttermilk in pouch packaging maintained its stability for 6 days without the addition of preservatives. It is therefore recommended that GOS-enriched buttermilk, generally considered a processing waste in dairy manufacturing, can be turned into a cost-effective, nutritional, functional food product. This will not only enhance the production efficiency of butter processing but will also create a new market opportunity for dairy manufacturers all over the world.
Keywords: buttermilk, galactooligosaccharide, shelf life, transgalactosylation
Procedia PDF Downloads 292
2959 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms
Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson
Abstract:
This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total-field magnetometer arrays. Our research has focused on the development of a vertically integrated suite of platforms, all utilizing common data acquisition, data processing, and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed from either a hydrodynamic bottom-following wing towed from a surface vessel or from a towed floating platform for shallow-water settings. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system, providing immediate access to data and meta-data for remote processing, analysis, and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment.
Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.
Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection
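The dipole modeling mentioned above rests on the standard point-dipole field formula, sketched below. The target moment and sensor distance are illustrative values, not data from the surveys described.

```python
import math

def dipole_field_nt(moment_am2, r_m, theta_rad):
    """Total-field magnitude (nT) of a static magnetic dipole at distance r:
    |B| = (mu0 / 4 pi) * (m / r^3) * sqrt(1 + 3 cos^2 theta),
    with theta the angle from the dipole axis."""
    mu0_over_4pi = 1e-7  # T*m/A
    b_tesla = (mu0_over_4pi * moment_am2 / r_m ** 3
               * math.sqrt(1.0 + 3.0 * math.cos(theta_rad) ** 2))
    return b_tesla * 1e9  # convert tesla to nanotesla

# a 1 A*m^2 target (roughly a small ferrous object) sensed 2 m away, on-axis
print(round(dipole_field_nt(1.0, 2.0, 0.0), 1))  # 25.0 nT
```

The steep 1/r^3 falloff is why stable deployment in close proximity to expected targets, emphasized in the abstract, is critical: doubling the sensor standoff cuts the anomaly by a factor of eight.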
Procedia PDF Downloads 464
2958 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser
Authors: Guanqiao Wang, Hongyang Yu
Abstract:
There is a lot of repetitive work in the traditional construction industry. Replacing manual tasks with robots in these repetitive tasks can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are very high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for wall plastering, so the requirements for construction positioning accuracy are higher, and traditional navigation and positioning methods have large errors, which can cause the robot to move without knowing its exact position; the wall then cannot be plastered, or the plastering error is large. A new positioning method is proposed, which is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning result. In actual work, filtering, edge detection, the Hough transform, and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the standard value, and the robot is moved or rotated to complete the positioning work. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method.
Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing
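The final step, locating the laser line in the image and converting its displacement into a motion command, can be sketched as follows. This is a simplified stand-in for the filter/edge-detection/Hough pipeline described in the abstract: here the line is simply taken as the brightest column in each row of a synthetic image, and the reference column is a hypothetical calibration value.

```python
def laser_line_offset(image, reference_col):
    """Estimate the horizontal position of a vertical laser line as the
    mean brightest column across rows, and return the correction offset
    (in pixels) relative to a calibrated reference position."""
    positions = []
    for row in image:
        brightest = max(range(len(row)), key=lambda c: row[c])
        positions.append(brightest)
    line_col = sum(positions) / len(positions)
    return line_col - reference_col

# synthetic 4x6 image with a bright vertical laser line in column 4
img = [[0, 0, 0, 0, 9, 0] for _ in range(4)]
print(laser_line_offset(img, reference_col=2))  # 2.0 px: move to align
```

In the real system, the pixel offset would be converted to millimetres via camera calibration and fed to the robot's motion controller until the residual falls below the tolerance.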
Procedia PDF Downloads 148
2957 Vitrification and Devitrification of Chromium Containing Tannery Ash
Authors: Savvas Varitis, Panagiotis Kavouras, George Kaimakamis, Eleni Pavlidou, George Vourlias, Konstantinos Chrysafis, Philomela Komninou, Theodoros Karakostas
Abstract:
The tannery industry produces high quantities of chromium-containing waste, which also has a high organic content. Processing of this waste is important, since the organic content is above the disposal limits and the trivalent chromium it contains could potentially be oxidized to hexavalent chromium in the environment. This work aims to fabricate new vitreous and glass-ceramic materials that could incorporate the tannery waste in a stabilized form, either for safe disposal or for the production of useful materials. Tannery waste was incinerated at 500 °C in anoxic conditions so that most of the organic content would be removed and the chromium would remain trivalent. The glass-forming agents SiO2, Na2O, and CaO were mixed with the resulting ash in different proportions with decreasing ash content. Considering the low solubility of Cr in silicate melts, the mixtures were melted at 1400 °C and/or 1500 °C for 2 h and then cast on a refractory steel plate. The resulting vitreous products were characterized by X-Ray Diffraction (XRD), Differential Thermal Analysis (DTA), and Scanning and Transmission Electron Microscopy (SEM and TEM). XRD reveals the existence of Cr2O3 (eskolaite) crystallites embedded in a glassy amorphous matrix. Such crystallites are not formed below a certain proportion of the waste in the ash-vitrified material. Reduction of the ash proportion increases the chromium content in the silicate matrix. From these glassy products, glass-ceramics were produced via different regimes of thermal treatment.
Keywords: chromium containing tannery ash, glass ceramic materials, thermal processing, vitrification
Procedia PDF Downloads 3672956 The Effect of High-Pressure Processing on the Inactivation of Saccharomyces cerevisiae in Different Concentration of Manuka Honey and Its Relation with ° Brix
Authors: Noor Akhmazillah Fauzi, Mohammed Mehdi Farid, Filipa V. Silva
Abstract:
The aim of this paper is to investigate whether the concentration of Manuka honey (as a model food) has a major influence on the inactivation of Saccharomyces cerevisiae (as the test microorganism) after high-pressure processing (HPP). Honey samples with different sugar concentrations (20, 30, 40, 50, 60 and 70 °Brix) were prepared aseptically using sterilized distilled water. No dilution of honey was made for the 80 °Brix sample. For the 0 °Brix sample (control), sterilized distilled water was used. A thermal treatment at 55 °C for 10 min (conventionally applied in industrial honey pasteurisation) was carried out for comparison. S. cerevisiae cell numbers in the honey samples were established before and after each HPP and thermal treatment. The number of surviving cells was determined, after proper dilution of the untreated and treated samples, by the viable plate count method. S. cerevisiae cells in different honey concentrations (0 to 80 °Brix) subjected to 600 MPa (at ambient temperature) showed increasing resistance to inactivation with °Brix. A significant correlation (p < 0.05) between cell reduction and °Brix was found. Cell reduction in high-pressure-treated samples varied linearly with °Brix (R2 > 0.9), confirming that the baroprotective effect of the food is due to its sugar content. This study has practical implications for establishing efficient process designs for the commercial manufacturing of high-sugar food products and for the potential use of HPP for such products.Keywords: high pressure processing, honey, Saccharomyces cerevisiae, °Brix
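The reported linear dependence of cell reduction on °Brix (R² > 0.9) amounts to an ordinary least-squares fit. A sketch with invented survivor data; only the direction of the trend mirrors the abstract.

```python
# Sketch: least-squares line through (°Brix, log-cycle reduction) points.
# The data below are hypothetical, for illustration only.
def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

brix = [0, 20, 40, 60, 80]
log_reduction = [6.1, 5.2, 4.0, 3.1, 2.0]   # invented log10 reductions
slope, intercept = linear_fit(brix, log_reduction)
print(slope < 0)   # protection grows with sugar content -> True
```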
Procedia PDF Downloads 3532955 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions, and sometimes density distributions, which have to be properly considered, posing challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) offers a convenient way to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because there are billions of particles involved. In this work, a CFD-DEM model based on the concept of a coarse-grained (CG) model is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton’s laws of motion. Here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the local-averaged Navier-Stokes equations, facilitated with the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures, and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones.Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
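The DEM side advances each coarse-grained particle by Newton's second law. A one-dimensional sketch, with a simple linear drag term standing in for the particle-fluid coupling; all parameter values are invented, not taken from the paper's model.

```python
# Sketch: one explicit-Euler step for a single coarse-grained (CG) particle,
# which stands in for an assembly of real particles with the same properties.
def step(pos, vel, mass, drag_coeff, fluid_vel, dt=1e-4, g=-9.81):
    """Advance position/velocity (1-D) under gravity plus linear fluid drag."""
    force = mass * g + drag_coeff * (fluid_vel - vel)  # Newton's second law
    vel = vel + (force / mass) * dt
    return pos + vel * dt, vel

# a particle released from rest in quiescent fluid settles under gravity
pos, vel = 0.0, 0.0
for _ in range(1000):
    pos, vel = step(pos, vel, mass=1e-3, drag_coeff=0.05, fluid_vel=0.0)
print(vel < 0.0)  # True
```

In the full model, `fluid_vel` would come from the CFD solution of the local-averaged Navier-Stokes equations, and particle-particle and particle-wall contact forces would be added to the force balance.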
Procedia PDF Downloads 4072954 Exploring Language Attrition Through Processing: The Case of Mising Language in Assam
Authors: Chumki Payun, Bidisha Som
Abstract:
The Mising language, spoken by the Mising community in Assam, belongs to the Tibeto-Burman family of languages. It is one of the smaller languages of the region and is facing endangerment due to the dominance of larger languages such as Assamese. The language is spoken in close in-group settings and is gradually losing ground to the dominant languages, partly because of an education setup in which schools use only the dominant languages. While a number of factors underlie the contemporary status of the language, and these can be studied using sociolinguistic tools, the current work aims to contribute to the understanding of language attrition through language processing, in order to establish whether the effect of second-language dominance goes beyond mere ‘usage’ patterns and has an impact on cognitive strategies. When bilingualism spreads widely in a society and results in a language shift, speakers often perform better in their second language (L2) than in their first language (L1) across a variety of task settings, in both comprehension and production tasks. This phenomenon was investigated in Mising-Assamese bilinguals, using a picture naming task, in the two districts of Jorhat and Tinsukia in Assam, where the relative dominance of the L2 differs slightly. This explorative study aimed to investigate whether L2 dominance is visible in participants' performance, and whether the pattern differs between the two places, thus pointing to the degree of language loss in each case. The findings have implications for native-language education, as education in one's mother tongue can help reverse the effects of language attrition, helping preserve the traditional knowledge system. The hypothesis was that, due to the dominance of the L2, subjects' performance in the task would be better in Assamese than in Mising. 
The experiment: Mising-Assamese bilingual participants (age range 21-31; N = 20 from each district) performed a picture naming task in which they were shown pictures of familiar objects and asked to name them in four scenarios: (a) only in Mising; (b) only in Assamese; (c) a cued mixed block, in which an auditory cue determines the language in which to name the object; and (d) a non-cued mixed block, in which participants are given no language cue but are instructed to name the pictures in whichever language they feel most comfortable. The experiment was designed and executed in E-Prime 3.0; responses were collected with a Chronos response box and recorded with a voice recorder. Preliminary analysis reveals the dominance of the L2 over the L1. The paper will present a comparison of the response latencies, an error analysis, and the switch cost in L1 and L2, and explain these from the perspective of language attrition.Keywords: bilingualism, language attrition, language processing, Mising language
Procedia PDF Downloads 222953 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-resolution video/image to a high-resolution one. The objective of a good interpolation method is to upscale an image in such a way that it provides good edge preservation at very low complexity, so that real-time processing of video frames is possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts caused by errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence they are far from real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods, which use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges.Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
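The slope-from-gradient-orientation step can be sketched with 3x3 Prewitt kernels on a grayscale patch. A minimal sketch with an invented patch; a real implementation would slide this over the whole image and trace slopes from the resulting angles.

```python
import math

# Sketch: gradient orientation from 3x3 Prewitt kernels, from which the
# interpolation direction (slope) is derived. Patch values are illustrative.
PX = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]    # Prewitt horizontal kernel
PY = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]    # Prewitt vertical kernel

def gradient_orientation(patch):
    """Angle (radians) of the intensity gradient of a 3x3 patch."""
    gx = sum(PX[i][j] * patch[i][j] for i in range(3) for j in range(3))
    gy = sum(PY[i][j] * patch[i][j] for i in range(3) for j in range(3))
    return math.atan2(gy, gx)

# a vertical step edge: dark left columns, bright right column
patch = [[10, 10, 200], [10, 10, 200], [10, 10, 200]]
print(round(math.degrees(gradient_orientation(patch)), 1))  # 0.0
```

The edge itself runs perpendicular to this gradient direction, which is what the slope tracing exploits.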
Procedia PDF Downloads 2602952 Trust: The Enabler of Knowledge-Sharing Culture in an Informal Setting
Authors: Emmanuel Ukpe, S. M. F. D. Syed Mustapha
Abstract:
Trust in an organization has been perceived as one of the key factors behind knowledge sharing, particularly in an unstructured work environment. In an informal working environment, instilling trust among individuals is a challenge, and even more so in a virtual environment. This study contributes a framework for building trust in an unstructured organization for performing knowledge sharing in a virtual environment. An artifact called KAPE (Knowledge Acquisition, Processing, and Exchange) was developed for knowledge sharing in informal organizations, with the framework incorporated into it. It was applied to cassava farmers to facilitate knowledge sharing through a web-based platform. A survey was conducted; data were collected from 382 farmers in 21 farm communities. Multiple regression, Cronbach's alpha reliability test, Tukey's Honestly Significant Difference (HSD) analysis, one-way Analysis of Variance (ANOVA), and all trust acceptance measures (TAM) were used to test the hypotheses and to determine noteworthy relationships. The results show a significant difference in knowledge sharing between farmers who score high on the trust acceptance factors in the model (M = 3.66, SD = .93) and those who score low on them (M = 2.08, SD = .28), (t(48) = 5.69, p = .00). Furthermore, when applying Cognitive Expectancy Theory, farmers with cognitive consonance show a higher level of trust in and satisfaction with knowledge and information from KAPE, compared with those experiencing cognitive dissonance. These results imply that the adopted trust model KAPE positively improved knowledge-sharing activities in an informal environment amongst rural farmers.Keywords: trust, knowledge, sharing, knowledge acquisition, processing and exchange, KAPE
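The reported group comparison is a two-sample t test. A sketch of the pooled-variance statistic on invented scores, not the study's raw data:

```python
import math

# Sketch: two-sample (pooled-variance) t statistic for comparing the
# high-trust and low-trust groups. The scores below are invented.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

high = [3.5, 3.7, 3.8, 3.6, 3.7]   # hypothetical high-trust scores
low = [2.0, 2.1, 2.2, 2.0, 2.1]    # hypothetical low-trust scores
print(t_statistic(high, low) > 2)  # clearly separated groups -> True
```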
Procedia PDF Downloads 1202951 Modern Agriculture and Industrialization Nexus in the Nigerian Context
Authors: Ese Urhie, Olabisi Popoola, Obindah Gershon, Olabanji Ewetan
Abstract:
Modern agriculture involves the use of improved tools and equipment (instead of crude, ineffective tools), such as tractors, hand-operated planters, hand-operated fertilizer drills and combine harvesters, which increase agricultural productivity. Farmers in Nigeria still have huge potential to enhance their productivity. The study argues that the increase in agricultural output due to the higher productivity brought about by modern agriculture will promote forward linkages and opportunities in the processing sub-sector: both the manufacturing of machines and the processing of raw materials. Depending on the existing incentives, foreign investment could be attracted to augment local investment in the sector. The availability of raw materials in large quantities, at competitive prices, will attract investment in other industries. In addition, potential for backward linkages will also be created. In a nutshell, adopting the unbalanced growth theory in favour of the agricultural sector could engender industrialization in a country with untapped potential. The paper highlights the numerous potentials of modern agriculture that are yet to be tapped in Nigeria and also provides a theoretical analysis of how realizing them could promote industrialization in the country. The study adopts Lewis's structural-change model and Hirschman's theory of unbalanced growth in the design of the analytical framework. The framework will be useful in empirical studies that will guide policy formulation.Keywords: modern agriculture, industrialization, structural change model, unbalanced growth
Procedia PDF Downloads 3032950 Hot Deformation Behavior and Recrystallization of Inconel 718 Superalloy under Double Cone Compression
Authors: Wang Jianguo, Ding Xiao, Liu Dong, Wang Haiping, Yang Yanhui, Hu Yang
Abstract:
The hot deformation behavior of Inconel 718 alloy was studied by uniaxial compression tests at deformation temperatures of 940-1040 °C and strain rates of 0.001-10 s⁻¹. The double cone compression (DCC) tests develop strains ranging from 30% to 79%, including all intermediate strain values, at different temperatures (960-1040 °C). The DCC tests were simulated with finite element software, which showed the strain and strain-rate distributions. The results show that the peak stress level of the alloy decreased with increasing deformation temperature and decreasing strain rate, which could be characterized by a Zener-Hollomon parameter in a hyperbolic-sine equation. A characterization method for the hot processing window, comprising the recrystallization volume fraction and the average grain size, is proposed for double cone compression tests of uniform coarse-grain, mixed-grain and uniform fine-grain double-cone specimens on hydraulic and screw presses. The results show that uniform microstructures can be obtained by low temperature with large deformation followed by high temperature with small deformation on the hydraulic press, and by low temperature, medium deformation and multiple passes on the screw press. The two methods were applied to an industrial forging process, and forgings with uniform microstructures were obtained successfully.Keywords: inconel 718 superalloy, hot processing windows, double cone compression, uniform microstructure
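The Zener-Hollomon parameter mentioned above combines strain rate and temperature as Z = strain_rate * exp(Q/(R*T)). A sketch with an assumed activation energy; the value below is a placeholder, not the one fitted for Inconel 718 in this work.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def zener_hollomon(strain_rate, temp_k, q_j_per_mol=450e3):
    """Z = strain_rate * exp(Q / (R*T)); Q here is an assumed placeholder."""
    return strain_rate * math.exp(q_j_per_mol / (R * temp_k))

# Z rises as temperature falls or strain rate rises, consistent with the
# observed increase in peak stress
print(zener_hollomon(0.01, 1313.15) < zener_hollomon(0.01, 1213.15))  # True
```

In the hyperbolic-sine formulation, the peak stress is then obtained from Z via A*(sinh(alpha*sigma))^n = Z, with A, alpha and n fitted from the compression data.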
Procedia PDF Downloads 2192949 Synthesis and Characterization of Carboxymethyl Cellulose-Chitosan Based Composite Hydrogels for Biomedical and Non-Biomedical Applications
Abstract:
Hydrogels have attracted much academic and industrial attention due to their unique properties and their potential biomedical and non-biomedical applications. Wider application has been limited by the synthesis of hydrogels from toxic materials and by complex, irreproducible processing techniques. In order to promote environmental sustainability, hydrogel efficiency, and wider application, this study focused on the synthesis of composite hydrogel matrices from an edible, non-toxic crosslinker, citric acid (CA), using a simple, low-energy processing method based on the natural polymers carboxymethyl cellulose (CMC) and chitosan (CSN). Composite hydrogels were developed by chemical crosslinking. The results demonstrated that CMC:2CSN:CA exhibited good performance properties and super-absorbency of 21× its original weight. This makes it promising for biomedical applications such as chronic wound healing and regeneration, next-generation skin substitutes, in situ bone regeneration and cell delivery. On the other hand, CMC:CSN:CA exhibited a durable, well-structured internal network with minimal swelling degrees and water absorbency, excellent gel fraction, and infra-red reflectance. These properties make it a suitable composite hydrogel matrix for a warming effect and for the controlled, efficient release of loaded materials. The CMC:2CSN:CA and CMC:CSN:CA composite hydrogels developed also exhibited excellent chemical, morphological, and thermal properties.Keywords: citric acid, fumaric acid, tartaric acid, zinc nitrate hexahydrate
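The absorbency and gel-fraction figures above reduce to simple gravimetric ratios. A sketch with invented sample weights, not measured values from the study:

```python
# Sketch: standard gravimetric definitions behind the quoted hydrogel
# figures. All weights below are illustrative.
def swelling_ratio(swollen_g, dry_g):
    """Swollen weight relative to dry weight (e.g. ~21x for CMC:2CSN:CA)."""
    return swollen_g / dry_g

def gel_fraction_pct(washed_dry_g, initial_dry_g):
    """Insoluble (crosslinked) fraction remaining after washing, in percent."""
    return 100.0 * washed_dry_g / initial_dry_g

print(swelling_ratio(10.5, 0.5))    # 21.0
print(gel_fraction_pct(0.45, 0.5))  # 90.0
```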
Procedia PDF Downloads 1512948 The Effects of Blanching, Boiling and Steaming on Ascorbic Acid Content, Total Phenolic Content, and Colour in Cauliflowers (Brassica oleracea var. Botrytis)
Authors: Huei Lin Lee, Wee Sim Choo
Abstract:
The effects of blanching, boiling and steaming on the ascorbic acid content, total phenolic content and colour of cauliflower (Brassica oleracea var. botrytis) were investigated. Blanching was found to be the best thermal process for cauliflower compared to boiling and steaming. Blanching and steaming retained more of the ascorbic acid content (AAC) than boiling. As for the total phenolic content (TPC), blanching retained a higher TPC in cauliflower than boiling or steaming; there were no significant differences between the TPC of boiled and steamed cauliflowers. As for the colour measurement, there were no significant differences in the colour of the cauliflower at different lead times (from processing to the point of consumption, at 30-minute intervals up to 3 hours), but there were slight variations in the L*, a*, and b* values among the thermally processed (blanched, boiled and steamed) cauliflowers. The cauliflowers in this study gave a desirable white colour (L* value in the range of 77-83) under all three thermal processes. There was no significant effect of lead time (30-minute intervals up to 3 hours) in raw or in any of the thermally processed (blanched, boiled and steamed) cauliflowers.Keywords: ascorbic acid, cauliflower, colour, phenolics
Procedia PDF Downloads 3142947 The Effect of Development of Two-Phase Flow Regimes on the Stability of Gas Lift Systems
Authors: Khalid. M. O. Elmabrok, M. L. Burby, G. G. Nasr
Abstract:
Flow instability during gas lift operation is caused by three major phenomena: density wave oscillation, casing heading pressure, and flow perturbation within the two-phase flow region. This paper focuses on the causes and effects of flow instability during gas lift operation and suggests ways to control it in order to maximise productivity. A laboratory-scale two-phase flow system was designed and built to study the effects of flow perturbation. The apparatus comprises a 2 m long, 66 mm ID transparent PVC pipe with an air injection point situated 0.1 m above the base of the pipe, the point at which stabilised bubbles were clearly visible after injection. Air is injected into the water-filled transparent pipe at different flow rates and pressures. The behavior of the different sizes of bubbles generated within the two-phase region was captured using a digital camera, and the images were analysed using an advanced image-processing package. It was observed that the average maximum bubble size increased from 29.72 to 47 mm with increasing height along the vertical pipe column. Increasing the air injection pressure from 0.5 to 3 bar increased the bubble sizes from 29.72 mm to 44.17 mm, with a decrease when the pressure reached 4 bar. It was observed that at a higher bubble velocity of 6.7 m/s, larger-diameter bubbles coalesce and burst due to high agitation and collision with each other. This collapse of the bubbles causes a pressure drop and reverse flow within the two-phase flow and is the main cause of the flow instability phenomena.Keywords: gas lift instability, bubbles forming, bubbles collapsing, image processing
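One way the captured bubble images yield sizes is by converting a detected bubble's projected area to a circle-equivalent diameter. A sketch under an assumed pixel calibration; the 0.1 mm/px factor is illustrative, not the rig's actual calibration.

```python
import math

# Sketch: circle-equivalent bubble diameter from a segmented bubble's
# projected area in pixels. Calibration factor is an assumed value.
def equivalent_diameter_mm(area_px, mm_per_px=0.1):
    d_px = 2.0 * math.sqrt(area_px / math.pi)   # diameter of equal-area circle
    return d_px * mm_per_px

# a bubble covering ~70000 px at 0.1 mm/px is roughly 30 mm across
print(round(equivalent_diameter_mm(70000), 1))
```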
Procedia PDF Downloads 4202946 Progress in Combining Image Captioning and Visual Question Answering Tasks
Authors: Prathiksha Kamath, Pratibha Jamkhandi, Prateek Ghanti, Priyanshu Gupta, M. Lakshmi Neelima
Abstract:
Combining image captioning and Visual Question Answering (VQA) tasks has emerged as a new and exciting research area. The image captioning task involves generating a textual description that summarizes the content of an image. VQA aims to answer a natural-language question about an image. Both tasks involve computer vision and natural language processing (NLP) and require a deep understanding of the content of the image, the semantic relationships within it, and the ability to generate a response in natural language. There has been remarkable growth in both tasks with the rapid advancement of deep learning. In this paper, we present a comprehensive review of recent progress in combining image captioning and visual question answering (VQA). We first discuss the two tasks individually and then the various ways in which they can be integrated. We also analyze the challenges associated with these tasks and ways to overcome them. Finally, we discuss the various datasets and evaluation metrics used in these tasks. This paper concludes with the need for generating captions based on context, and captions that can answer the questions most likely to be asked about the image, so as to aid the VQA task. Overall, this review highlights the significant progress made in combining image captioning and VQA, as well as the ongoing challenges and opportunities for further research in this exciting and rapidly evolving field, which has the potential to improve the performance of real-world applications such as autonomous vehicles, robotics, and image search.Keywords: image captioning, visual question answering, deep learning, natural language processing
Procedia PDF Downloads 732945 Referencing Anna: Findings From Eye-tracking During Dutch Pronoun Resolution
Authors: Robin Devillers, Chantal van Dijk
Abstract:
Children face ambiguities in everyday language use, and ambiguity in pronoun resolution can be particularly challenging for them, whereas adults can rapidly identify the antecedent of a mentioned pronoun. Two main factors underlie this process, namely the accessibility of the referent and the syntactic cues of the pronoun. Within 200 ms, adults have integrated the accessibility and the syntactic constraints, relieving cognitive effort by considering contextual cues. As children are still developing their cognitive capacity, they are not yet able to simultaneously assess and integrate accessibility, contextual cues and syntactic information. As such, they may fail to identify the correct referent and possibly fixate more on the competitor than adults do. In this study, Dutch while-clauses were used to investigate the interpretation of pronouns by children. The aim is to a) examine the extent to which 7- to 10-year-old children are able to utilise discourse and syntactic information during online and offline sentence processing, and b) analyse the contribution of individual factors, including age, working memory, condition and vocabulary. Adult and child participants are presented with filler items and while-clauses, the latter following a particular structure: ‘Anna and Sophie are sitting in the library. While Anna is reading a book, she is taking a sip of water.’ This sentence illustrates the ambiguous situation, as it is unclear whether ‘she’ refers to Anna or Sophie. In the unambiguous situation, either Anna or Sophie is substituted by a boy, such as ‘Peter’; the pronoun in the second clause then unambiguously refers to one of the characters due to the syntactic constraints of the pronoun. Children's and adults' responses were measured by means of a visual world paradigm, which presented two characters, of which one was the referent (the target) and the other the competitor. 
A sentence was presented and followed by a question, which required the participant to choose which character was the referent. This paradigm thus yields an online (fixations) and an offline (accuracy) score. The findings will be analysed using Generalised Additive Mixed Models, which allow for a thorough estimation of the individual variables. These findings will contribute to the scientific literature in several ways: firstly, while-clauses have not been studied much, and their processing has not yet been characterised; moreover, online pronoun resolution has not been investigated much in either children or adults, so this study will contribute to the literature on pronoun resolution in both groups; lastly, pronoun resolution has not yet been studied in Dutch, and as such, this study extends the cross-linguistic coverage of the field.Keywords: pronouns, online language processing, Dutch, eye-tracking, first language acquisition, language development
Procedia PDF Downloads 992944 Explaining the Steps of Designing and Calculating the Content Validity Ratio Index of the Screening Checklist of Preschool Students (5 to 7 Years Old) Exposed to Learning Difficulties
Authors: Sajed Yaghoubnezhad, Sedygheh Rezai
Abstract:
Background and Aim: Since students with learning disabilities in Iran are currently identified only after entering school, based on the discrepancy between IQ and academic achievement, the purpose of this study is to design, and calculate the content validity of, a screening checklist for preschool children (5 to 7 years old) at risk of learning difficulties. Methods: This is fundamental research and, in terms of data collection method, quantitative research with a descriptive approach. In order to design this checklist, after reviewing the research background and theoretical foundations, the cognitive abilities regarded as the basic variables of school learning (visual processing, auditory processing, phonological awareness, executive functions, visual-spatial working memory and fine motor skills) were identified. The basic items and worksheets of the screening checklist for preschool students (5 to 7 years old) with learning difficulties were compiled based on these abilities and were provided to specialists in order to calculate the content validity ratio (CVR) index. Results: The CVR index of the background information checklist is 0.9, and the CVR index of the performance checklist for preschool children (5 to 7 years) is 0.78. Overall, the CVR index of this checklist is 0.84. These results provide good evidence for the validity of the screening checklist for preschool children (5 to 7 years old) at risk of learning difficulties.Keywords: checklist, screening, preschoolers, learning difficulties
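The CVR values reported here follow Lawshe's formula, CVR = (n_e - N/2) / (N/2), where n_e of N panel experts rate an item essential. A minimal sketch with an illustrative panel size (the study's actual panel size is not stated in the abstract):

```python
# Sketch: Lawshe's content validity ratio for a single checklist item.
def cvr(n_essential, n_experts):
    """CVR = (n_e - N/2) / (N/2); ranges from -1 to +1."""
    half = n_experts / 2
    return (n_essential - half) / half

# e.g. 9 of 10 experts rating an item essential gives CVR = 0.8
print(cvr(9, 10))  # 0.8
```

A checklist-level index like the 0.84 reported above can then be taken as the mean of the per-item CVRs.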
Procedia PDF Downloads 1022943 The Implementation of an E-Government System in Developing Countries: A Case of Taita Taveta County, Kenya
Authors: Tabitha Mberi, Tirus Wanyoike, Joseph Sevilla
Abstract:
The use of Information and Communication Technology (ICT) in government is gradually becoming a major requirement for transforming service delivery to stakeholders by improving quality of service and efficiency. In Kenya, the devolution of government from local authorities to county governments has resulted in many counties adopting online revenue collection systems that can be easily accessed by their stakeholders. Strathmore Research and Consortium Centre (SRCC) implemented a revenue collection system in Taita Taveta, a county in coastal Kenya. It consisted of two integrated systems: an online system dubbed “CountyPro” for processing county services such as business permit applications, general billing, property rates payments and any other revenue streams from the county, and a Point of Sale (PoS) system used by county revenue collectors to charge market fees and vehicle parking fees. This study assesses the successes and challenges in the adoption of the integrated system. Qualitative and quantitative data collection methods were used, with the researcher using focus groups, interviews, and questionnaires to collect data from various users of the system. The analysis revealed that 87% of the county revenue officers situated in county offices describe the system as efficient and say it has made their work easier in terms of processing customer transactions.Keywords: e-government, counties, information technology, online system, point of sale
Procedia PDF Downloads 2472942 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation into data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, and user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
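As a toy illustration of the schema-based record extraction step, a regular expression can pull (issuer, amount) records out of free text. Production systems would use the learned models described above; the pattern and schema here are invented for illustration.

```python
import re

# Sketch: extract schema-shaped records {"issuer", "revenue_usd"} from
# unstructured text. Pattern and field names are illustrative assumptions.
PATTERN = re.compile(
    r"(?P<issuer>[A-Z][A-Za-z]+) reported revenue of "
    r"\$(?P<amount>[0-9.]+)\s?(?P<scale>[mb])illion"
)

def extract_records(text):
    """Return a list of machine-readable records found in free text."""
    records = []
    for m in PATTERN.finditer(text):
        scale = 1e6 if m.group("scale") == "m" else 1e9
        records.append({"issuer": m.group("issuer"),
                        "revenue_usd": float(m.group("amount")) * scale})
    return records

print(extract_records("Acme reported revenue of $1.2billion last year."))
```

The output of a step like this is exactly what "machine-readable format that can be stored in a database" means in practice: a list of typed records ready for insertion.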
Procedia PDF Downloads 111