Search results for: materials processing
10054 One Step Further: Pull-Process-Push Data Processing
Authors: Romeo Botes, Imelda Smit
Abstract:
In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. They make use of different protocols, including TCP, UDP, and HTTP/s, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format that needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data. The data is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs to accomplish this, but sometimes neglect the effect some of these techniques may have on database performance. One of the techniques generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place. Because of this, it decreases the overall performance of the database server and therefore the system’s performance. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU performance, storage, and processing time.
Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list
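To make the three-step technique described above concrete, the following is a minimal Python sketch assuming a SQLite database with a hypothetical raw_messages table; the table, column, and function names are illustrative, not the authors' implementation.

```python
import sqlite3

def decode_payload(payload):
    # Placeholder decoder: the real decoding logic depends on the device protocol (GPRS/GPS)
    return payload.strip().upper()

def pull_process_push(db_path):
    # Step 1: pull the raw rows into an in-memory list, then close the connection
    conn = sqlite3.connect(db_path)
    raw_rows = conn.execute(
        "SELECT id, payload FROM raw_messages WHERE processed = 0"
    ).fetchall()
    conn.close()

    # Step 2: process entirely in memory; the database is not held open or locked here
    decoded = [(decode_payload(payload), row_id) for row_id, payload in raw_rows]

    # Step 3: push the results back in one short batch transaction
    conn = sqlite3.connect(db_path)
    with conn:
        conn.executemany(
            "UPDATE raw_messages SET payload = ?, processed = 1 WHERE id = ?",
            decoded,
        )
    conn.close()
```

The point of the split is that the database is only touched during the short pull and push transactions, while the time-consuming decoding happens against the in-memory list.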
Procedia PDF Downloads 248
10053 Quantitative Analysis of Multiprocessor Architectures for Radar Signal Processing
Authors: Deepak Kumar, Debasish Deb, Reena Mamgain
Abstract:
Radar signal processing requires a high number-crunching capability. Most often this is achieved using a multiprocessor platform. Though a multiprocessor platform provides the capability to meet real-time computational challenges, its architecture, along with the mapping of the algorithm onto the architecture, plays a vital role in using the platform efficiently. Towards this, along with standard performance metrics, a few additional metrics are defined which help in evaluating the multiprocessor platform along with the algorithm mapping. A generic multiprocessor architecture cannot suit all processing requirements. Depending on the system requirements and the type of algorithms used, the most suitable architecture for the given problem is decided. In the paper, we study different architectures and quantify the different performance metrics, which enables comparison of different architectures on their merits. We also carried out case studies of different architectures and their efficiency depending on whether parallelism is exploited in the algorithm, the data, or both.
Keywords: radar signal processing, multiprocessor architecture, efficiency, load imbalance, buffer requirement, pipeline, parallel, hybrid, cluster of processors (COPs)
Procedia PDF Downloads 415
10052 The Influence of Concreteness on English Compound Noun Processing: Modulation of Constituent Transparency
Authors: Turgut Coskun
Abstract:
'Concreteness effect' refers to faster processing of concrete words, and 'compound facilitation' refers to faster responses to compounds. In this study, our main goal was to investigate the interaction between compound facilitation and the concreteness effect. The latter might modulate compound processing based on the constituents’ transparency patterns. To evaluate these, we created lists of compound and monomorphemic words, sub-categorized them into concrete and abstract words, and further sub-categorized them based on their transparency. The transparency conditions were opaque-opaque (OO), transparent-opaque (TO), and transparent-transparent (TT). We used RT data from the English Lexicon Project (ELP) for our comparisons. The results showed the importance of the concreteness factor (facilitation) in both compound and monomorphemic processing. Important for our present concern, separate concrete and abstract compound analyses revealed different patterns for OO, TO, and TT compounds. Concrete TT and TO conditions were processed faster than Concrete OO, Abstract OO, and Abstract TT compounds; however, they were not processed faster than Abstract TO compounds. These results may reflect different representation patterns for concrete and abstract compounds.
Keywords: abstract word, compound representation, concrete word, constituent transparency, processing speed
Procedia PDF Downloads 207
10051 Arousal, Encoding, and Intrusive Memories
Authors: Hannah Gutmann, Rick Richardson, Richard Bryant
Abstract:
Intrusive memories following a traumatic event are not uncommon. However, in some individuals, these memories become maladaptive and lead to prolonged stress reactions. A seminal model of PTSD proposes that aberrant processing during trauma may lead to prolonged stress reactions and intrusive memories. This model explains that elevated arousal at the time of the trauma promotes data-driven processing, leading to fragmented and intrusive memories. This study investigated the role of elevated arousal in the development of intrusive memories. We measured salivary markers of arousal and investigated what impact this had on data-driven processing, memory fragmentation, and, subsequently, the development of intrusive memories. We assessed 100 healthy participants to understand their processing style, arousal, and experience of intrusive memories. Participants were randomised to a control or experimental condition, the latter of which was designed to increase their arousal. Based on current theory, participants in the experimental condition were expected to engage in more data-driven processing and experience more intrusive memories than participants in the control condition. This research aims to shed light on the mechanisms underlying the development of intrusive memories to illustrate ways in which therapeutic approaches for PTSD may be augmented for greater efficacy.
Keywords: stress, cortisol, SAA, PTSD, intrusive memories
Procedia PDF Downloads 202
10050 Aircraft Components, Manufacturing and Design: Opportunities, Bottlenecks, and Challenges
Authors: Ionel Botef
Abstract:
Aerospace products operate in very aggressive environments characterized by high temperature, high pressure, large stresses on individual components, the presence of an oxidizing and corroding atmosphere, as well as internally created or externally ingested particulate materials that induce erosion and impact damage. Consequently, during operation, the materials of individual components degrade. In addition, the impact of maintenance costs for both civil and military aircraft was estimated to be at least two to three times greater than initial purchase values, and this trend is expected to increase. As a result, for viable product realisation and maintenance, a spectrum of issues regarding novel processing technologies, innovation of new materials, performance, costs, and environmental impact must constantly be addressed. One of these technologies, namely the cold-gas dynamic-spray process, has enabled a broad range of coatings and applications, including many that have not been previously possible or commercially practical, hence its potential for new aerospace applications. Therefore, the purpose of this paper is to summarise the state of the art of this technology alongside its theoretical and experimental studies, and to explore how the cold-gas dynamic-spray process could be integrated within a framework that could finally lead to more efficient aircraft maintenance. Based on the paper's qualitative findings, supported by authorities, evidence, and logic, it is argued that the cold-gas dynamic-spray manufacturing process should not be viewed in isolation, but as a component of a broad framework that finally leads to more efficient aerospace operations.
Keywords: aerospace, aging aircraft, cold spray, materials
Procedia PDF Downloads 124
10049 Sustainable Membranes Based on 2D Materials for H₂ Separation and Purification
Authors: Juan A. G. Carrio, Prasad Talluri, Sergio G. Echeverrigaray, Antonio H. Castro Neto
Abstract:
Hydrogen, as a fuel and environmentally friendly energy carrier, is part of the transition towards low-carbon systems. The extensive deployment of hydrogen production, purification and transport infrastructures still represents significant challenges. Independent of the production process, the hydrogen is generally mixed with light hydrocarbons and other undesirable gases that need to be removed to obtain H₂ with the required purity for end applications. In this context, membranes are one of the simplest, most attractive, sustainable, and performant technologies enabling hydrogen separation and purification. They demonstrate high separation efficiencies and low energy consumption levels in operation, which is a significant leap compared to current energy-intensive options. The unique characteristics of 2D laminates have given rise to a diversity of research on their potential applications in separation systems. Specifically, it is already known in the scientific literature that graphene oxide-based membranes present the highest reported selectivity of H₂ over other gases. This work explores the potential of a new type of 2D materials-based membrane in separating H₂ from CO₂ and CH₄. We have developed nanostructured composites based on 2D materials that have been applied in the fabrication of membranes to maximise H₂ selectivity and permeability for different gas mixtures by adjusting the membranes' characteristics. Our proprietary technology does not depend on specific porous substrates, which allows its integration in diverse separation modules with different geometries and configurations, looking to address the technical performance required for industrial applications and economic viability. The tuning and precise control of the processing parameters allowed us to keep the thicknesses of the membranes below 100 nanometres to provide high permeabilities. Our results for the selectivity of the new nanostructured 2D materials-based membranes are in the range of the performance reported in the available literature on 2D materials (such as graphene oxide) applied to hydrogen purification, which validates their use as one of the most promising next-generation hydrogen separation and purification solutions.
Keywords: membranes, 2D materials, hydrogen purification, nanocomposites
Procedia PDF Downloads 142
10048 Detecting and Disabling Digital Cameras Using D3CIP Algorithm Based on Image Processing
Authors: S. Vignesh, K. S. Rangasamy
Abstract:
The paper deals with a device capable of detecting and disabling digital cameras. The system locates the camera and then neutralizes it. Every digital camera has an image sensor known as a CCD, which is retro-reflective and sends light back directly to its original source at the same angle. The device shines infrared LED light, which is invisible to the human eye, at a distance of about 20 feet. It then collects video of these reflections with a camcorder. The video of the reflections is then transferred to a computer connected to the device, where it is sent through image processing algorithms that pick out the infrared light bouncing back. Once the camera is detected, the device projects an invisible infrared laser into the camera's lens, thereby overexposing the photo and rendering it useless. Low levels of infrared laser neutralize digital cameras but are neither a health danger to humans nor a cause of physical damage to cameras. We also discuss the simplified design of the above device that can be used in theatres to prevent piracy. The domains covered here are optics and image processing.
Keywords: CCD, optics, image processing, D3CIP
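The D3CIP algorithm itself is not reproduced in the abstract; purely as a rough illustration of the reflection-detection step, the following Python/OpenCV sketch thresholds bright retro-reflections in a camcorder frame (the threshold and blob-size values are assumptions, not the authors' parameters).

```python
import cv2

def find_retroreflections(frame, brightness_threshold=240, min_area=4):
    # Keep only near-saturated bright spots, which is how retro-reflected
    # IR from a CCD tends to appear in the captured frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_threshold, 255, cv2.THRESH_BINARY)

    # Each sufficiently large bright blob is a candidate camera-lens position
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for contour in contours:
        if cv2.contourArea(contour) >= min_area:
            x, y, w, h = cv2.boundingRect(contour)
            candidates.append((x + w // 2, y + h // 2))
    return candidates
```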
Procedia PDF Downloads 360
10047 Processing of Flexible Dielectric Nanocomposites Using Nanocellulose and Recycled Alum Sludge for Wearable Technology Applications
Authors: D. Sun, L. Saw, A. Onyianta, D. O’Rourke, Z. Lu, C. See, C. Wilson, C. Popescu, M. Dorris
Abstract:
With the rapid development of wearable technology (e.g., smartwatches, activity trackers and health monitoring devices), flexible dielectric materials with environmentally friendly, low-cost and energy-efficient characteristics are in increasing demand. In this work, a flexible dielectric nanocomposite was processed by incorporating two components, cellulose nanofibrils and alum sludge, in a polymer matrix. The two components were used as the reinforcement phase as well as for enhancing the dielectric properties; they were obtained from waste materials that would otherwise be disposed of in landfills. Alum sludge is a by-product of the water treatment process in which aluminum sulfate is prevalently used as the primary coagulant. According to data from a project partner, Scottish Water, approximately 10,000 tons of alum sludge are generated every year in Scotland as waste from water treatment works and sent to landfill. The industry has been facing escalating financial and environmental pressure to develop more sustainable strategies to deal with alum sludge wastes. In the available literature, some work on reusing alum sludge has been reported (e.g., aluminum recovery or agriculture and land reclamation). However, little work can be found on applying it to processing energy materials (e.g., dielectrics) for enhanced energy density and efficiency. The alum sludge was collected directly from a water treatment plant of Scottish Water and was heat-treated and refined before being used in preparing composites. Cellulose nanofibrils were derived from water hyacinth, an invasive aquatic weed that causes significant ecological issues in tropical regions. The harvested water hyacinth was dried and processed using a cost-effective method, including a chemical extraction followed by a homogenization process, in order to extract cellulose nanofibrils. The biodegradable elastomer polydimethylsiloxane (PDMS) was used as the polymer matrix, and the nanocomposites were processed by casting the raw materials in Petri dishes. The processed composites were characterized using various methods, including scanning electron microscopy (SEM), rheological analysis, thermogravimetric analysis and X-ray diffraction analysis. The SEM results showed that cellulose nanofibrils of approximately 20 nm in diameter and 100 nm in length were obtained and that the alum sludge particles were approximately 200 µm in diameter. The TGA/DSC analysis showed that a weight loss of up to 48% occurs in the raw alum sludge and that its crystallization starts at approximately 800°C. This observation coincides with the XRD result. Other experiments also showed that the composites exhibit good overall mechanical and dielectric performance. This work shows that reusing such waste materials to prepare flexible, lightweight and miniature dielectric materials for wearable technology applications is a sustainable practice.
Keywords: cellulose, biodegradable, sustainable, alum sludge, nanocomposite, wearable technology, dielectric
Procedia PDF Downloads 87
10046 Security in Resource Constraints: Network Energy Efficient Encryption
Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy
Abstract:
Wireless nodes in a sensor network gather and process critical information that is designed to be processed and communicated. Information flowing through such a network is critical for decision making and data processing, and the integrity of this data is one of the most critical factors in wireless security, which must be preserved without compromising the processing and transmission capability of the network. This paper presents a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, utilizing the battery resources available at each sensor node.
Keywords: hybrid protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC
Procedia PDF Downloads 148
10045 Image Processing and Calculation of NGRDI Embedded System in Raspberry
Authors: Efren Lopez Jimenez, Maria Isabel Cajero, J. Irving-Vasqueza
Abstract:
The use and processing of digital images have opened up new opportunities for solving problems of various kinds, such as the calculation of different vegetation indices that, among other things, differentiate healthy vegetation from humid vegetation. However, obtaining the images from which these indices are calculated is still the subject of active research. In the present work, we propose to obtain these images using a low-cost embedded system (Raspberry Pi) and to process them using the open-source library OpenCV in order to obtain the Normalized Red-Green Difference Index (NGRDI).
Keywords: Raspberry Pi, vegetation index, Normalized Red-Green Difference Index (NGRDI), OpenCV
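For reference, NGRDI is commonly defined per pixel as (G - R) / (G + R) from the green and red channels; the snippet below is a minimal OpenCV sketch of that calculation and is not the authors' embedded implementation.

```python
import cv2
import numpy as np

def compute_ngrdi(image_path):
    # Load the image (OpenCV stores channels as B, G, R) and work in float
    bgr = cv2.imread(image_path).astype(np.float32)
    blue, green, red = cv2.split(bgr)

    # NGRDI = (G - R) / (G + R); guard against division by zero
    denominator = green + red
    denominator[denominator == 0] = 1e-6
    return (green - red) / denominator
```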
Procedia PDF Downloads 294
10044 Wood Decay Fungal Strains Useful for Bio-Composite Material Production
Authors: C. Girometta, S. Babbini, R. M. Baiguera, D. S. Branciforti, M. Cartabia, D. Dondi, M. Pellegrini, A. M. Picco, E. Savino
Abstract:
Interest in wood decay fungi (WDF) has been increasing in recent years thanks to the potential of this kind of fungi; research on new WDF strains has increased as well, pointing out the key role of culture collections. One of the most recent biotechnological applications of WDF is the development of novel materials from natural or recycled resources. Based on different combinations of fungal species, substrate, and processing treatment involved (e.g. heat pressing), it is possible to achieve a wide variety of materials with different features useful for many industrial applications: from packaging to thermal and acoustic insulation. In comparison with conventional ones, these materials represent a 100% natural and compostable alternative involving low amounts of energy in the production process. The purpose of the present work was to isolate and select WDF strains able to colonize and degrade different plant wastes, thus producing a fungal biomass that can be shaped into bio-composite materials. Strains were selected from the mycological culture collection of Pavia University (MicUNIPV, over 300 strains of WDF). The selected strains have been investigated with regard to their ability to colonize and degrade plant residues from the major local cultivations (e.g. poplar, alfalfa, maize, rice, and wheat) and to produce the fungal biomass. The degradation of the substrate was assessed by thermogravimetric analysis (TGA) and Fourier transform infrared spectroscopy (FTIR). Chemical characterization confirmed that TGA and FTIR are complementary techniques able to provide qualitative and quantitative information on the compositional and structural variation that occurs during the transformation from the substrate to the bio-composite material. This pilot study provides a fundamental step towards tuning further applications of fungus-residue composite biomaterials.
Keywords: bio-composite material, lignocellulosic residues, sustainable materials, wood decay fungi
Procedia PDF Downloads 144
10043 Mathematical Analysis of Matrix and Filler Formulation in Composite Materials
Authors: Olusegun A. Afolabi, Ndivhuwo Ndou
Abstract:
Composite materials are an important area that has gained global visibility in many research fields in recent years. A composite material is a combination of separate materials with different properties that forms a single material with properties different from those of the parent materials. Material composition and combination are therefore important aspects of composite materials. The focus of this study is to provide insight into an easy way of calculating the compositions and formulations of the constituent materials that make up any composite material. The compositions of the matrix and filler used for fabricating composite materials are taken into consideration. From the fabricated composite, data can be collected and analyzed based on tests and characterizations such as tensile, flexural, compression, impact and hardness tests. Also, the densities of the matrix and the filler with regard to their constituent materials are discussed.
Keywords: composite material, density, filler, matrix, percentage weight, volume fraction
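The abstract does not reproduce the formulas themselves; as an illustration of the standard relations its keywords point to (percentage weight, volume fraction, density), the Python sketch below converts a filler weight fraction into a volume fraction and applies the rule of mixtures for composite density; the numerical values are purely illustrative.

```python
def filler_volume_fraction(w_f, filler_density, matrix_density):
    # Convert a filler weight fraction into a volume fraction using the
    # constituent densities: V_f = (w_f / rho_f) / (w_f / rho_f + w_m / rho_m)
    w_m = 1.0 - w_f
    return (w_f / filler_density) / (w_f / filler_density + w_m / matrix_density)

def composite_density(v_f, filler_density, matrix_density):
    # Rule of mixtures: rho_c = V_f * rho_f + (1 - V_f) * rho_m
    return v_f * filler_density + (1.0 - v_f) * matrix_density

# Example: 30 wt% filler (2.6 g/cm^3) in a matrix of 1.2 g/cm^3 (illustrative values)
vf = filler_volume_fraction(0.30, 2.6, 1.2)
print(f"Filler volume fraction: {vf:.3f}")
print(f"Composite density: {composite_density(vf, 2.6, 1.2):.3f} g/cm^3")
```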
Procedia PDF Downloads 76
10042 Synthesis of Solid Polymeric Materials by Maghnite-H⁺ as a Green Catalyst
Authors: Draoua Zohra, Harrane Amine
Abstract:
Solid polymeric materials have been successfully prepared by the copolymerization of ε-caprolactone (CL) and poly(ethylene glycol) (PEG) employing Maghnite-H⁺ at 80°C. Maghnite-H⁺ is a non-toxic solid catalyst. The presence of PEG chains leads to a break in the growth of the PCL chains and consequently to the tri-block copolymer PCL-PEG-PCL. The objective of this study was to synthesize and characterize solid polymeric materials. The highly hydrophilic nature of polyethylene glycol has sparked our interest in developing a solid polymeric material based on ε-caprolactone and poly(ethylene glycol). PCL and PEG are biocompatible materials. Their ring-opening copolymerization using Maghnite-H⁺ leads to the solid polymeric materials. The morphology and structure of the solid polymeric materials were characterized by ¹H and ¹³C NMR spectra and gel permeation chromatography (GPC). This paper develops the application of Maghnite-H⁺ as an efficient catalyst in an easy-to-handle procedure to obtain solid polymeric materials. A cationic mechanism for the copolymerization reaction is proposed.
Keywords: block copolymers, maghnite, montmorillonite, poly(e-caprolactone)
Procedia PDF Downloads 172
10041 Language Processing of Seniors with Alzheimer’s Disease: From the Perspective of Temporal Parameters
Authors: Lai Yi-Hsiu
Abstract:
The present paper aims to examine the language processing of Chinese-speaking seniors with Alzheimer’s disease (AD) from the perspective of temporal cues. Twenty healthy adults, 17 healthy seniors, and 13 seniors with AD in Taiwan participated in this study, telling stories based on two sets of pictures. Nine temporal cues were extracted and analyzed. Oral productions in Mandarin Chinese were compared and discussed to examine to what extent and in what ways the three groups of participants differed significantly. Results indicated that age effects were significant for filled pauses. Dementia effects were significant for the mean duration of pauses, empty pauses, filled pauses, lexical pauses, and the normalized mean duration of filled pauses and lexical pauses. The findings reported in the current paper help characterize the nature of language processing in seniors with or without AD, and contribute to understanding the interaction between the AD neural mechanism and temporal parameters.
Keywords: language processing, Alzheimer’s disease, Mandarin Chinese, temporal cues
Procedia PDF Downloads 452
10040 High Speed Image Rotation Algorithm
Authors: Hee-Choul Kwon, Hyungjin Cho, Heeyong Kwon
Abstract:
Image rotation is one of the main pre-processing steps in image processing and image pattern recognition. It is conventionally implemented with rotation matrix multiplication. However, this requires many floating-point arithmetic operations and trigonometric function calculations, so it takes a long execution time. We propose a new high-speed image rotation algorithm without these two major time-consuming operations. We compare the proposed algorithm with the conventional rotation algorithm on images of various sizes. Experimental results show that the proposed algorithm is superior to the conventional rotation algorithms.
Keywords: high speed rotation operation, image processing, image rotation, pattern recognition, transformation matrix
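The proposed algorithm itself is not given in the abstract; for context, the following Python sketch shows a typical conventional baseline it is compared against: inverse mapping of every output pixel through the rotation matrix with nearest-neighbour sampling (the details are assumptions about a standard baseline, not the authors' code).

```python
import math
import numpy as np

def rotate_nearest(image, angle_degrees):
    # Conventional baseline: inverse-map each output pixel through the rotation
    # matrix, which costs trigonometric evaluations plus multiplications per pixel
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    cos_a = math.cos(math.radians(angle_degrees))
    sin_a = math.sin(math.radians(angle_degrees))

    output = np.zeros_like(image)
    for y_out in range(h):
        for x_out in range(w):
            # Rotate the output coordinate back into the source image
            x_src = cos_a * (x_out - cx) + sin_a * (y_out - cy) + cx
            y_src = -sin_a * (x_out - cx) + cos_a * (y_out - cy) + cy
            xs, ys = int(round(x_src)), int(round(y_src))
            if 0 <= xs < w and 0 <= ys < h:
                output[y_out, x_out] = image[ys, xs]
    return output
```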
Procedia PDF Downloads 509
10039 Pre-Processing of Ultrasonography Image Quality Improvement in Cases of Cervical Cancer Using Image Enhancement
Authors: Retno Supriyanti, Teguh Budiono, Yogi Ramadhani, Haris B. Widodo, Arwita Mulyawati
Abstract:
Cervical cancer is the leading cause of mortality among cancer-related diseases. In its diagnosis, doctors usually perform several tests to determine the presence of cervical cancer in a patient. However, these checks require supporting equipment to obtain more detailed results; one such tool is ultrasonography. In developing countries, however, most of the existing ultrasonography equipment has a low resolution. The goal of this research is to detect abnormalities in low-resolution ultrasound images, especially for cervical cancer cases. In this paper, we emphasize the use of image enhancement as a pre-processing step to improve image quality. The results show that the pre-processing stage is promising for supporting further analysis.
Keywords: cervical cancer, mortality, low-resolution, image enhancement
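The abstract does not name the specific enhancement operator used; purely as an illustration of one common pre-processing choice for low-contrast ultrasound frames, the sketch below applies CLAHE with OpenCV (an assumption, not the method reported by the authors).

```python
import cv2

def enhance_ultrasound(image_path):
    # Read the ultrasound frame as grayscale and apply CLAHE, a common
    # contrast-enhancement choice for low-resolution medical images
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)
```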
Procedia PDF Downloads 642
10038 Speech Motor Processing and Animal Sound Communication
Authors: Ana Cleide Vieira Gomes Guimbal de Aquino
Abstract:
Sound communication is present in most vertebrates, from fish, mainly in species that live in murky waters, to some species of reptiles, anuran amphibians, birds, and mammals, including primates. There are, in fact, relevant similarities between human language and animal sound communication, and among these similarities are the vocalizations called calls. The first specific call in human babies is crying, which has a characteristic prosodic contour and is motivated most of the time by the need for food; it affects the offspring-caregiver interaction, communicating needs and food requests and helping to guarantee the survival of the species. The present work aims to articulate speech processing in the motor context with aspects of the project entitled 'Emotional states and vocalization: a comparative study of the prosodic contours of crying in human and non-human animals'. First, concepts of speech motor processing and general aspects of speech evolution will be presented in order to relate these two approaches to animal sound communication.
Keywords: speech motor processing, animal communication, animal behaviour, language acquisition
Procedia PDF Downloads 94
10037 Scheduling Jobs with Stochastic Processing Times or Due Dates on a Server to Minimize the Number of Tardy Jobs
Authors: H. M. Soroush
Abstract:
The problem of scheduling products and services for on-time delivery is of paramount importance in today’s competitive environments. It arises in many manufacturing and service organizations where it is desirable to complete jobs (products or services) with different weights (penalties) on or before their due dates. In such environments, schedulers frequently have to decide whether to schedule a job based on its processing time, due date, and the penalty for tardy delivery in order to improve system performance. For example, it is common to measure the weighted number of late jobs or the percentage of on-time shipments to evaluate the performance of a semiconductor production facility or an automobile assembly line. In this paper, we address the problem of scheduling a set of jobs on a server where the processing times or due dates of jobs are random variables and fixed weights (penalties) are imposed on the jobs’ late deliveries. The goal is to find the schedule that minimizes the expected weighted number of tardy jobs. The problem is NP-hard to solve; however, we explore three scenarios of the problem wherein: (i) both processing times and due dates are stochastic; (ii) processing times are stochastic and due dates are deterministic; and (iii) processing times are deterministic and due dates are stochastic. We prove that special cases of these scenarios are solvable optimally in polynomial time, and introduce efficient heuristic methods for the general cases. Our computational results show that the heuristics perform well in yielding either optimal or near-optimal sequences. The results also demonstrate that the stochasticity of processing times or due dates can affect scheduling decisions. Moreover, the proposed problem is general in the sense that its special cases reduce to some new and some classical stochastic single-machine models.
Keywords: number of late jobs, scheduling, single server, stochastic
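For orientation, the deterministic, unit-weight special case of this objective (minimizing the plain number of tardy jobs on a single machine) is the classical problem solved exactly by the Moore-Hodgson algorithm; the Python sketch below illustrates that special case only and is not the stochastic heuristics proposed in the paper. The job data are made up.

```python
import heapq

def moore_hodgson(jobs):
    # jobs: list of (processing_time, due_date) tuples.
    # Deterministic, unit-weight special case: maximize on-time jobs,
    # i.e. minimize the number of tardy jobs (Moore-Hodgson algorithm).
    on_time = []      # max-heap (stored as negated values) of scheduled processing times
    completion = 0
    for p, d in sorted(jobs, key=lambda job: job[1]):   # earliest-due-date order
        heapq.heappush(on_time, -p)
        completion += p
        if completion > d:
            # The current partial schedule is late: drop its longest job,
            # which becomes one of the tardy jobs
            completion += heapq.heappop(on_time)
    return len(on_time)   # number of jobs completed on time

# Illustrative instance: four jobs as (processing_time, due_date)
print(moore_hodgson([(4, 6), (3, 7), (2, 8), (5, 9)]))  # prints 2
```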
Procedia PDF Downloads 507
10036 Development of Thermal Insulation Materials Based on Silicate Using Non-Traditional Binders and Fillers
Authors: J. Hroudova, J. Zach, L. Vodova
Abstract:
When insulating and rehabilitating structures, it is important to use quality building materials with high utility value. One potentially interesting and promising group of construction materials in this area is advanced, silicate-based thermal insulation plasters. With the present trend towards reducing the energy consumption of building structures and reducing CO2 emissions, capillary-active materials need to be developed that are characterized by low density and low thermal conductivity while maintaining good mechanical properties. The paper describes the results of research activities aimed at the development of thermal insulation and rehabilitation materials ongoing at the Technical University in Brno, Faculty of Civil Engineering. The achieved results of this development will be the basis for subsequent experimental analysis of the influence of thermal and moisture loads on these materials.
Keywords: insulation materials, rehabilitation materials, lightweight aggregate, fly ash, slag, hemp fibers, glass fibers, metakaolin
Procedia PDF Downloads 240
10035 Production of Cellulose Nanowhiskers from Red Algae Waste and Its Application in Polymer Composite Development
Authors: Z. Kassab, A. Aboulkas, A. Barakat, M. El Achaby
Abstract:
Red algae are abundantly available around the world, and their exploitation for the production of agar has become an important industry in recent years. However, this industrial processing of red algae generates a large quantity of solid fibrous waste, which constitutes a serious environmental problem. For this reason, the exploitation of this solid waste would help to i) produce new value-added materials and ii) improve waste disposal. In fact, this solid waste can be fully utilized for the production of cellulose microfibers and nanocrystals because it contains a large amount of cellulose. For this purpose, the red algae waste was chemically treated via alkali, bleaching and acid hydrolysis treatments under controlled conditions in order to obtain pure cellulose microfibers and cellulose nanocrystals. The raw product and the as-extracted cellulosic materials were successively characterized using several analysis techniques, including elemental analysis, X-ray diffraction, thermogravimetric analysis, infrared spectroscopy and transmission electron microscopy. As an application, the as-extracted cellulose nanocrystals were used as nanofillers for the production of polymer-based composite films with improved thermal and tensile properties. In these composite materials, the adhesion properties and the large number of functional groups present on the CNC surface and on the macromolecular chains of the polymer matrix are exploited to improve the interfacial interactions between the two phases, improving the final properties. Consequently, these high-performance composite materials can be expected to have potential in packaging applications.
Keywords: cellulose nanowhiskers, food packaging, polymer composites, red algae waste
Procedia PDF Downloads 231
10034 AI Predictive Modeling of Excited State Dynamics in OPV Materials
Authors: Pranav Gunhal, Krish Jhurani
Abstract:
This study tackles the significant computational challenge of predicting excited state dynamics in organic photovoltaic (OPV) materials, a pivotal factor in the performance of solar energy solutions. Time-dependent density functional theory (TDDFT), though effective, is computationally prohibitive for larger and more complex molecules. As a solution, the research explores the application of transformer neural networks, a type of artificial intelligence (AI) model known for its superior performance in natural language processing, to predict excited state dynamics in OPV materials. The methodology involves a two-fold process. First, the transformer model is trained on an extensive dataset comprising over 10,000 TDDFT calculations of excited state dynamics from a diverse set of OPV materials. Each training example includes a molecular structure and the corresponding TDDFT-calculated excited state lifetimes and key electronic transitions. Second, the trained model is tested on a separate set of molecules, and its predictions are rigorously compared to independent TDDFT calculations. The results indicate a remarkable degree of predictive accuracy. Specifically, for a test set of 1,000 OPV materials, the transformer model predicted excited state lifetimes with a mean absolute error of 0.15 picoseconds, a negligible deviation from TDDFT-calculated values. The model also correctly identified key electronic transitions contributing to the excited state dynamics in 92% of the test cases, signifying a substantial concordance with the results obtained via conventional quantum chemistry calculations. The practical integration of the transformer model with existing quantum chemistry software was also realized, demonstrating its potential as a powerful tool in the arsenal of materials scientists and chemists. The implementation of this AI model is estimated to reduce the computational cost of predicting excited state dynamics by two orders of magnitude compared to conventional TDDFT calculations. The successful utilization of transformer neural networks to accurately predict excited state dynamics provides an efficient computational pathway for the accelerated discovery and design of new OPV materials, potentially catalyzing advancements in the realm of sustainable energy solutions.
Keywords: transformer neural networks, organic photovoltaic materials, excited state dynamics, time-dependent density functional theory, predictive modeling
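The paper's architecture, featurization, and training data are not given in the abstract; purely as a schematic of the kind of model described, the following PyTorch sketch regresses a single excited-state lifetime from a padded sequence of per-atom feature vectors. All layer sizes, feature dimensions, and names are illustrative assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class LifetimeTransformer(nn.Module):
    # Minimal transformer-encoder regressor: padded per-atom feature vectors in,
    # a single predicted excited-state lifetime out
    def __init__(self, n_features=32, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, atom_features, padding_mask):
        x = self.embed(atom_features)                        # (batch, atoms, d_model)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        x = x.masked_fill(padding_mask.unsqueeze(-1), 0.0)   # zero out padded atoms
        pooled = x.sum(dim=1) / (~padding_mask).sum(dim=1, keepdim=True)
        return self.head(pooled).squeeze(-1)                 # lifetime, e.g. in picoseconds

# Illustrative forward pass on random data (2 molecules, up to 20 atoms each)
model = LifetimeTransformer()
feats = torch.randn(2, 20, 32)
mask = torch.zeros(2, 20, dtype=torch.bool)   # False = real atom, True = padding
print(model(feats, mask).shape)               # torch.Size([2])
```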
Procedia PDF Downloads 125
10033 Computational Material Modeling for Mechanical Properties Prediction of Nanoscale Carbon Based Cementitious Materials
Authors: Maryam Kiani, Abdul Basit Kiani
Abstract:
At larger scales, the performance of cementitious materials is impacted by processes occurring at the nanometer scale. These materials have intricate hierarchical structures with random features that span from the nanometer to the millimeter scale. By understanding and manipulating the nanoscale processes that influence the overall behavior and characteristics of these materials, scientists and engineers can create more durable and sustainable infrastructure and construction materials. The present work employs simulations as the computational modeling methodology to predict mechanical properties of carbon/silica-based cementitious materials at the molecular/nanoscale level. The studies focused on understanding the enhancement of the mechanical properties of cementitious materials by carbon and silica nanoparticles via Material Studio materials modeling.
Keywords: nanomaterials, SiO₂, carbon black, mechanical properties
Procedia PDF Downloads 150
10032 Optimization of Processing Parameters of Acrylonitrile–Butadiene–Styrene Sheets Integrated by Taguchi Method
Authors: Fatemeh Sadat Miri, Morteza Ehsani, Seyed Farshid Hosseini
Abstract:
The present research is concerned with the optimization of the extrusion parameters of ABS sheets by the Taguchi experimental design method. In this design, the effects of three parameters, namely the percentage of recycled ABS, the processing temperature and the degassing time, on the mechanical properties, hardness, HDT, and color matching of ABS sheets were investigated. The variables of this research are the dosage of recycled ABS, the processing temperature, and the degassing time. According to the experimental test data, the highest tensile strength and HDT belong to the sample with 5% recycled ABS, a processing temperature of 230°C, and a degassing time of 3 hours. Additionally, the minimum MFI and color matching belong to this sample, too. The present results are in good agreement with the Taguchi method. Based on the outcomes of the Taguchi design method, degassing time has the greatest effect on the mechanical properties of ABS sheets.
Keywords: ABS, process optimization, Taguchi, mechanical properties
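The raw responses and the exact orthogonal array are not reported in the abstract; as a small illustration of the standard Taguchi calculation for responses where larger is better (such as tensile strength or HDT), the Python sketch below computes the signal-to-noise ratio for one trial, using made-up replicate values.

```python
import math

def sn_larger_is_better(replicates):
    # Taguchi larger-is-better signal-to-noise ratio:
    # S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
    n = len(replicates)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in replicates) / n)

# Illustrative tensile-strength replicates (MPa) for one trial of the design
print(round(sn_larger_is_better([41.2, 40.8, 41.5]), 2))
```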
Procedia PDF Downloads 77
10031 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing
Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello
Abstract:
In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal, derived from a database in DICOM format from Clinica de la Costa, Barranquilla. Described in this paper are the steps and methods used, which include the acquisition and formation of the test samples, processing, and finally the calculation of the parameters needed to obtain the ejection fraction. Two image segmentation methods were compared following a methodological framework that is similar only in the initial stages of processing (filtering and image enhancement) and differs at the end, when the segmentation algorithms are applied (Active Contour and Region Growing). The results were compared with the measurements obtained by two different medical specialists in cardiology, who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the computer using the echocardiography equipment and applying a simple equation to calculate the desired value. The results showed that if the quality of the video samples is good (i.e., after the pre-processing there is evidence of an improvement in contrast), the values provided by the tool are substantially close to those reported by the physicians; also, the correlation between physicians does not vary significantly.
Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction
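For reference, once the end-diastolic volume (EDV) and end-systolic volume (ESV) have been estimated from the segmented frames, the ejection fraction follows from the standard relation EF = (EDV - ESV) / EDV * 100; a minimal sketch with illustrative volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    # EF (%) = (EDV - ESV) / EDV * 100, from end-diastolic and end-systolic volumes
    return (edv_ml - esv_ml) / edv_ml * 100.0

# Illustrative volumes: EDV = 120 ml, ESV = 50 ml -> EF of about 58.3 %
print(round(ejection_fraction(120.0, 50.0), 1))
```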
Procedia PDF Downloads 430
10030 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Even when there is no theoretical weakness in a cryptographic algorithm, side channel analysis can find out secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular analyses; it computes statistical correlations between hypothesized secret keys and the measured power consumption. It usually requires processing huge amounts of data and takes a long time; it may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
Keywords: DPA, distributed computing, parallelized processing, side channel analysis
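A minimal sketch of the core correlation step and its parallelization, assuming NumPy arrays of plaintext bytes and power traces and a Hamming-weight leakage model; this illustrates the general technique (often called correlation power analysis), not the authors' implementation.

```python
import numpy as np
from multiprocessing import Pool

HAMMING_WEIGHT = np.array([bin(v).count("1") for v in range(256)])

def correlate_key_guess(args):
    # Correlate the hypothetical leakage (Hamming weight of a guessed intermediate)
    # with every sample point of the measured traces; return the best correlation
    key_guess, plaintexts, traces = args
    hypothesis = HAMMING_WEIGHT[plaintexts ^ key_guess].astype(np.float64)
    h = hypothesis - hypothesis.mean()
    t = traces - traces.mean(axis=0)
    corr = (h @ t) / (np.linalg.norm(h) * np.linalg.norm(t, axis=0) + 1e-12)
    return key_guess, np.max(np.abs(corr))

def rank_key_guesses(plaintexts, traces, workers=4):
    # Spread the 256 key-byte guesses across worker processes and return the best one
    with Pool(workers) as pool:
        scores = pool.map(
            correlate_key_guess,
            [(k, plaintexts, traces) for k in range(256)],
        )
    return max(scores, key=lambda item: item[1])
```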
Procedia PDF Downloads 432
10029 Influence of Processing Parameters on the Reliability of Sieving as a Particle Size Distribution Measurement
Authors: Eseldin Keleb
Abstract:
In the pharmaceutical industry, particle size distribution is an important parameter for the characterization of pharmaceutical powders. Powder flowability, reactivity and compatibility, which have a decisive impact on the final product, are determined by particle size and size distribution. Therefore, the aim of this study was to evaluate the influence of processing parameters on particle size distribution measurements. Different size fractions of α-lactose monohydrate and 5% polyvinylpyrrolidone were prepared by wet granulation and used for the preparation of samples. The influence of sieve load (50, 100, 150, 200, 250, 300, and 350 g), processing time (5, 10, and 15 min), sample size ratios (high percentage of small and large particles), type of disturbance (vibration and shaking) and process reproducibility was investigated. The results obtained showed that a sieve load of 50 g produced the best separation; a further increase in sample weight resulted in incomplete separation even after extending the processing time to 15 min. Sieving using vibration was faster and more efficient than shaking. Meanwhile, between-day reproducibility showed that the particle size distribution measurements are reproducible. However, for samples containing 70% fines or 70% large particles, which were processed at the optimized parameters, incomplete separation was always observed. These results indicate that sieving reliability is highly influenced by the particle size distribution of the sample, and care must be taken for samples with skewed particle size distributions.
Keywords: sieving, reliability, particle size distribution, processing parameters
Procedia PDF Downloads 618
10028 Instructional Material Development in ODL: Achievements, Prospects, and Challenges
Authors: Felix Gbenoba, Opeyemi Dahunsi
Abstract:
Customised, self-instructional materials are at the heart of instructional delivery in Open and Distance Learning (ODL). The success of any ODL institution depends on the availability of learning materials in both quality and quantity. An ODL study material is expected to imitate what the teacher does in the face-to-face learning environment. This paper evaluates these expectations based on existing data and evidence. It concludes that, so far, the reality has not matched the expectation in terms of the pedagogic aspects of instructional delivery, especially in West Africa. This does not mean that instructional materials development has not produced significant positive results in improving the overall learning (and teaching) experience in these institutions; rather, it helps to identify the new challenges. The obstacles and problems of instructional materials development that could have affected open educational resource initiatives are well established. The first section of this paper recalls some of the proposed values of instructional materials. The second section compares achievements so far and suggests that instructional materials development should be considered at an early stage to realise the aspirations of instructional delivery. The third section highlights the challenges of instructional materials development in the future.
Keywords: face-to-face learning, instructional delivery, open and distance education, self-instructional materials
Procedia PDF Downloads 378
10027 In vitro Method to Evaluate the Effect of Steam-Flaking on the Quality of Common Cereal Grains
Authors: Wanbao Chen, Qianqian Yao, Zhenming Zhou
Abstract:
Whole grains with intact pericarp are largely resistant to digestion by ruminants because entire kernels are not conducive to bacterial attachment. Processing methods, however, make the starch more accessible to microbes and increase the rate and extent of starch degradation in the rumen. To estimate the feasibility of applying steam-flaking as a processing technique for grains fed to ruminants, cereal grains (maize, wheat, barley and sorghum) were processed by steam-flaking (steam temperature 105°C, heating time 45 min). Chemical analysis, in vitro gas production, volatile fatty acid concentrations, and energetic values were used to evaluate the effects of steam-flaking. In vitro cultivation was conducted for 48 h with rumen fluid collected from steers fed a total mixed ration consisting of 40% hay and 60% concentrates. The results showed that steam-flaking processing had a significant effect on the contents of neutral detergent fiber and acid detergent fiber (P < 0.01). The degree of starch gelatinization was also greatly improved in steam-flaked grains, as steam-flaking disintegrates the crystal structure of cereal starch, which may subsequently facilitate the absorption of moisture and swelling. Theoretical maximum gas production after steam-flaking processing showed no great difference. However, compared with intact grains, total gas production at 48 h and the rate of gas production were significantly (P < 0.01) increased in all types of grain. Furthermore, there was no effect of steam-flaking processing on total volatile fatty acids, but a decrease in the ratio between acetate and propionate was observed in the current in vitro fermentation. The present study also found that steam-flaking processing increased (P < 0.05) the organic matter digestibility and energy concentration of the grains. The collective findings of the present study suggest that steam-flaking processing of grains could improve their rumen fermentation and energy utilization by ruminants. In conclusion, the utilization of steam-flaking would be a practical way to improve the quality of common cereal grains.
Keywords: cereal grains, gas production, in vitro rumen fermentation, steam-flaking processing
Procedia PDF Downloads 277
10026 Obtaining of Nanocrystalline Ferrites and Other Complex Oxides by Sol-Gel Method with Participation of Auto-Combustion
Authors: V. S. Bushkova
Abstract:
It is well known that in recent years magnetic materials have received increased attention due to their properties. For this reason, a significant number of patents published during the last decade are oriented towards the synthesis and study of such materials. The aim of this work is to create and study nanocrystalline ferrite materials with a spinel structure, using sol-gel technology with the participation of auto-combustion. This method is promising in that it is a cheap, low-temperature technique that allows fine control over the product’s chemical composition.
Keywords: magnetic materials, ferrites, sol-gel technology, nanocrystalline powders
Procedia PDF Downloads 415
10025 Direct Synthesis of Composite Materials Type MCM-41/ZSM-5 by Hydrothermal at Atmospheric Pressure in Sealed Pyrex Tubes
Authors: Zoubida Lounis, Naouel Boumesla, Abd El Kader Bengueddach
Abstract:
The main objective of this study is to synthesize, by direct synthesis at atmospheric pressure, composite materials having the MFI structure and MCM-41 by using two structuring agents. In the first part of this work, we are interested in studying the synthesis parameters: in addition to temperature, the crystallization time and pH. In the second part, we vary the ratio of the concentrations of the two structuring agents, C9 [C9H19(CH3)3NBr] and C16 [C16H33(CH3)3NBr], and determine the region of formation of the two materials (microporous and mesoporous at the same time); for this purpose, we performed a battery of experiments ranging from 0 to 100% for both structuring agents. To enhance the economic appeal of this study, the experiments were carried out using a very cheap and simple process: Pyrex tubes were used instead of reactors, and the syntheses were carried out at atmospheric pressure and moderate temperature. The final products (composite materials) were obtained with high purity and quality.
Keywords: composite materials, synthesis, catalysts, mesoporous materials, microporous materials
Procedia PDF Downloads 393