Search results for: rice processing
1568 Physicochemical, Heavy Metals Analysis of Some Multi-Floral Algerian Honeys
Authors: Assia Amri, Naima Layachi, Ali Ladjama
Abstract:
The characterization of some Algerian honeys was carried out on the basis of their physico-chemical properties: moisture, hydroxymethylfurfural (HMF), diastase activity, pH, free, total and lactonic acidity, electrical conductivity, minerals and proline content. The studied samples were found to be low in moisture, and therefore safe from fermentation, low in HMF and high in diastase activity. The diastase activity and the HMF content are widely recognized parameters indicating the freshness of honey. Phenolic compounds present in honey are classified into two groups: simple phenols and polyphenols. The simple phenols in honey are various phenolic acids, while the polyphenols are mainly flavonoids. The aim of our work was to determine the antioxidant properties of various Algerian honey samples: the total phenol content, the total flavonoid content, as well as the antiradical activity of the honey. The quality of honey samples differs on account of various factors such as season, packaging and processing conditions, floral source, geographical origin and storage period. Precautions should therefore be taken to ensure standardization and rationalization of beekeeping techniques, manufacturing procedures and storage processes to improve honey quality.
Keywords: honey, physico-chemical characterization, phenolic compounds, HMF, diastase activity
Procedia PDF Downloads 423
1567 Production of Energetic Nanomaterials by Spray Flash Evaporation
Authors: Martin Klaumünzer, Jakob Hübner, Denis Spitzer
Abstract:
Within this paper, the latest results on the processing of energetic nanomaterials by means of the Spray Flash Evaporation technique are presented. This technology constitutes a highly effective and continuous way to prepare fascinating materials on the nano- and micro-scale. In the process, a solution is set under high pressure and sprayed into an evacuated atomization chamber. Subsequent ultrafast evaporation of the solvent leads to an aerosol stream, which is separated by cyclones or filters. No drying gas is required, so the present technique should not be confused with spray drying. The resulting nanothermites, insensitive explosives, propellants and compositions are foreseen to replace toxic (according to REACH) and very sensitive matter in military and civil applications. Diverse examples are given in detail: nano-RDX (n-cyclotrimethylenetrinitramine) and nano-aluminum based systems, mixtures (n-RDX/n-TNT - trinitrotoluene) and even cocrystalline matter like n-CL-20/HMX (hexanitrohexaazaisowurtzitane/cyclotetramethylenetetranitramine). These nanomaterials tend to show reduced sensitivity without losing effectiveness and performance. An analytical study for material characterization was performed using Atomic Force Microscopy, X-Ray Diffraction and combined techniques, as well as spectroscopic methods. Sensitivity tests regarding electrostatic discharge, impact and friction are also provided.
Keywords: continuous synthesis, energetic material, nanoscale, nanoexplosive, nanothermite
Procedia PDF Downloads 264
1566 Permanent Deformation Resistance of Asphalt Mixtures with Red Mud as a Filler
Authors: Liseane Padilha Thives, Mayara S. S. Lima, João Victor Staub De Melo, Glicério Trichês
Abstract:
Red mud is a waste resulting from the processing of bauxite into alumina, the raw material for aluminum production. The large quantity of red mud generated and inadequately disposed of in the environment has motivated researchers to develop methods for reinserting this waste into the productive cycle. This work aims to evaluate the resistance to permanent deformation of dense asphalt mixtures with red mud as a filler. The red mud was characterized by X-ray diffraction, fluorescence, specific mass, laser granulometry, pH and scanning electron microscopy tests. To analyze the influence of the quantity of red mud on the mechanical performance of the asphalt mixtures, a total filler content of 7% was established. Asphalt mixtures with 3%, 5% and 7% red mud were produced, and a conventional mixture with 7% stone powder filler was used as a reference. The asphalt mixtures were evaluated for resistance to permanent deformation in the French Rutting Tester (FRT) traffic simulator. The mixture with 5% red mud presented the greatest resistance to permanent deformation, with a rutting depth at 30,000 cycles of 3.50%. The asphalt mixtures with red mud presented better performance overall, with reductions in rutting of 12.63% to 42.62% relative to the reference mixture. This study confirmed the viability of reinserting red mud into the production chain and its possible use in the construction industry. Red mud as a filler in asphalt mixtures is a reuse option for this waste that mitigates its disposal problems, as well as being an environmentally friendly alternative.
Keywords: asphalt mixtures, permanent deformation, red mud, pavements
Procedia PDF Downloads 289
1565 Purification, Biochemical Characterization and Application of an Extracellular Alkaline Keratinase Produced by Aspergillus sp. DHE7
Authors: Dina Helmy El-Ghonemy, Thanaa Hamed Ali
Abstract:
The aim of this study was to purify and characterize a keratinolytic enzyme produced by Aspergillus sp. DHE7 cultured in a basal medium containing chicken feathers as substrate. The enzyme was purified through ammonium sulfate precipitation at 60% saturation, followed by gel filtration chromatography on Sephadex G-100, with a 16.4-fold purification and a recovery yield of 52.2%. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis revealed that the purified enzyme is monomeric, with an apparent molecular mass of 30 kDa. The purified keratinase of Aspergillus sp. DHE7 exhibited activity over a broad range of pH (7-9) and temperature (40-60°C), with optimal activity at pH 8 and 50°C. The keratinolytic activity was inhibited by protease inhibitors such as phenylmethylsulfonyl fluoride and ethylenediaminetetraacetate, while no reduction of activity was detected upon the addition of dimethyl sulfoxide (DMSO). The bivalent cations Ca²⁺ and Mn²⁺ greatly enhanced the keratinase activity, by 125.7% and 194.8% respectively, when used at a 1 mM final concentration. On the other hand, Cu²⁺ and Hg²⁺ inhibited the enzyme activity, which might indicate that vicinal sulfhydryl groups are essential for productive catalysis. Furthermore, the purified keratinase showed significant stability and compatibility with the tested commercial detergents at 37°C. These results suggest that the purified keratinase from Aspergillus sp. DHE7 may have potential use in the detergent industry and should be of interest in the processing of poultry feather waste.
Keywords: Aspergillus sp. DHE7, biochemical characterization, keratinase, purification, waste management
Procedia PDF Downloads 124
1564 Simulation on Influence of Environmental Conditions on Part Distortion in Fused Deposition Modelling
Authors: Anto Antony Samy, Atefeh Golbang, Edward Archer, Alistair McIlhagger
Abstract:
Fused deposition modelling (FDM) is one of the additive manufacturing techniques that has become highly attractive in the industrial and academic sectors. However, parts fabricated through FDM are highly susceptible to geometrical defects such as warpage, shrinkage and delamination, which can severely affect their function. Among the thermoplastic polymer feedstocks for FDM, semi-crystalline polymers are particularly prone to part distortion due to polymer crystallization. In this study, the influence of FDM processing conditions such as chamber temperature and print bed temperature on the induced thermal residual stress and the resulting warpage is investigated using a 3D transient thermal model for a semi-crystalline polymer. The thermo-mechanical properties and viscoelasticity of the polymer, as well as the crystallization physics, which accounts for the crystallinity of the polymer, are coupled with the evolving temperature gradient of the printed model. The results show that increasing the chamber temperature from 25°C to 75°C led to a 1.5% decrease in residual stress, while decreasing the bed temperature from 100°C to 60°C resulted in a 33% increase in residual stress and a significant rise of 138% in warpage. The simulated warpage data is validated by comparison with the warpage values of samples measured by 3D scanning.
Keywords: finite element analysis, fused deposition modelling, residual stress, warpage
Procedia PDF Downloads 187
1563 A Graph-Based Retrieval Model for Passage Search
Authors: Junjie Zhong, Kai Hong, Lei Wang
Abstract:
Passage Retrieval (PR) plays an important role in many Natural Language Processing (NLP) tasks. Traditional efficient retrieval models relying on exact term-matching, such as TF-IDF or BM25, have in recent years been surpassed by pre-trained language models that match by semantics. Though they gain effectiveness, deep language models often incur large memory and time costs. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes the Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Different from existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded as dense vectors, so that PR can be implemented using these embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models on several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model
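The final retrieval step described in the abstract, dense passage embeddings combined with a fast vector-similarity search, can be sketched generically. This is an illustrative example only, not the authors' GraphPR implementation; the toy embeddings are invented for demonstration.

```python
import numpy as np

def build_index(passage_embeddings):
    # L2-normalize each passage vector so an inner product equals cosine similarity
    norms = np.linalg.norm(passage_embeddings, axis=1, keepdims=True)
    return passage_embeddings / norms

def search(index, query_embedding, k=3):
    q = query_embedding / np.linalg.norm(query_embedding)
    scores = index @ q              # cosine similarity to every passage at once
    top = np.argsort(-scores)[:k]   # indices of the k highest-scoring passages
    return top, scores[top]

# Toy index of four 3-dimensional passage embeddings
emb = np.array([[1.0, 0.0, 0.0],
                [0.9, 0.1, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
idx = build_index(emb)
top, scores = search(idx, np.array([1.0, 0.05, 0.0]), k=2)
```

In practice the matrix-vector product would be replaced by an approximate nearest-neighbor index, but the normalize-then-dot pattern is the core of the vector-similarity step.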
Procedia PDF Downloads 150
1562 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria
Authors: Stephen Olusanmi Akintayo
Abstract:
Access to safe drinking water remains a major challenge in Nigeria, and where water is available, its quality is often in doubt. An alternative to inadequate clean drinking water has been found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as "sachet water". "Sachet water" is widespread in Nigeria, as its selling price is within the reach of members of the lower socio-economic class and setting up a production unit does not require huge capital input. The bacteriological quality of selected "sachet water" stored at room temperature over a period of 56 days was determined to evaluate the safety of the sachet drinking water. A test for the detection of coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate counts (HPC) were done at 14-day intervals, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. The count decreased between days 1 and 28, while no growth was observed between days 42 and 56. The decrease in HPC suggests the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.
Keywords: coliform, heterotrophic plate count, sachet water, Staphylococcus aureus, Staphylococcus epidermidis
Procedia PDF Downloads 341
1561 Effect of Cuminum Cyminum L. Essential Oil on Staphylococcus Aureus during the Manufacture, Ripening and Storage of White Brined Cheese
Authors: Ali Misaghi, Afshin Akhondzadeh Basti, Ehsan Sadeghi
Abstract:
Staphylococcus aureus is a pathogen of major concern for clinical infection and food-borne illness. Humans and most domesticated animals harbor S. aureus, so staphylococci may be expected in food products of animal origin or in those handled directly by humans, unless heat processing is applied to destroy them. Cuminum cyminum L. has been the subject of several recent studies, in addition to its well-documented traditional usage for the treatment of toothache, dyspepsia, diarrhea, epilepsy and jaundice. The air-dried seed of the plant was completely immersed in water and subjected to hydrodistillation for 3 h using a Clevenger-type apparatus. In this study, the effect of Cuminum cyminum L. essential oil (EO) on the growth of Staphylococcus aureus in white brined cheese was evaluated. The experiment included different levels of EO (0, 7.5, 15 and 30 mL/100 mL milk) to assess their effects on the S. aureus count during the manufacture, ripening and storage of Iranian white brined cheese for up to 75 days. Significant (P < 0.05) inhibitory effects of the EO (even at its lowest concentration) on this organism were observed. The significant (P < 0.05) inhibitory effect of the EO shown in this study may broaden the scope of EO use in the food industry.
Keywords: Cuminum cyminum L. essential oil, Staphylococcus aureus, white brined cheese
Procedia PDF Downloads 389
1560 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance as well as the resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the contents of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are requested to respond by performing CPU-intensive tasks under increasing concurrent load.
Keywords: Apache, Go, Nginx, Node.js, web server benchmarking
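The CPU-intensive workload in such tests, computing the n-th Fibonacci term, is typically implemented naively so that higher terms cost exponentially more work. A minimal sketch of that workload (shown here in Python for illustration; the benchmarked servers would implement it in their own languages) is:

```python
def fib(n):
    # Naive recursive Fibonacci: deliberately CPU-bound, since the call
    # count grows exponentially with n, making the 30th term far costlier
    # to compute than the 10th or 20th.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

results = [fib(10), fib(20), fib(30)]
```

This is why the 10th, 20th and 30th terms produce sharply increasing per-request CPU load in the benchmark.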
Procedia PDF Downloads 97
1559 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN
Authors: Muhammad Atif, Cang Yan
Abstract:
The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Recently, methods based on convolutional neural networks (CNNs) have gained prominence, as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multiscale knowledge transfer model. Our method leverages the power of three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder, and the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn from and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix to Pix GAN framework as the generator, aiming to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. Experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
Keywords: low light image enhancement, deep learning, convolutional neural network, image processing
Procedia PDF Downloads 80
1558 Laser Based Microfabrication of a Microheater Chip for Cell Culture
Authors: Daniel Nieto, Ramiro Couceiro
Abstract:
Microfluidic chips have demonstrated significant application potential in microbiological processing and chemical reactions, with the goal of developing monolithic, compact, chip-sized multifunctional systems. Heat generation and thermal control are critical in some biochemical processes. This paper presents a laser direct-write technique for the rapid prototyping and manufacturing of microheater chips and its applicability to perfusion cell culture outside a cell incubator. The aim of the microheater is to take over the role of conventional incubators for cell culture, facilitating microscopic observation or other online monitoring activities during culture and providing portability of the cell culture operation. Microheaters (5 mm × 5 mm) were successfully fabricated on soda-lime glass substrates covered with an aluminum layer 120 nm thick. Experimental results show that the microheaters exhibit good temperature rise and decay characteristics, with localized heating at targeted spatial domains. These microheaters were suitable for a maximum long-term operating temperature of 120°C and were validated for long-term operation at 37°C for 24 hours. The results demonstrated that the physiology of the SW480 colon adenocarcinoma cell line cultured on the developed microheater chip was consistent with that in an incubator.
Keywords: laser microfabrication, microheater, bioengineering, cell culture
Procedia PDF Downloads 297
1557 Wear Assessment of SS316l-Al2O3 Composites for Heavy Wear Applications
Authors: Catherine Kuforiji, Michel Nganbe
Abstract:
The abrasive wear of composite materials is a major challenge in highly demanding wear applications. This study therefore focuses on fabricating, testing and assessing the properties of 50 wt% SS316L stainless steel - 50 wt% Al2O3 particle composites. Composite samples were fabricated by the powder metallurgy route, and the effects of the powder metallurgy processing parameters and hard particle reinforcement were studied. The microstructure, density, hardness and toughness were characterized, and the wear behaviour was studied using pin-on-disc testing under dry sliding conditions. The highest hardness of 1085.2 HV, the highest relative density of 94.7% of theoretical and the lowest wear rate of 0.00397 mm³/m were obtained at a milling speed of 720 rpm, a compaction pressure of 794.4 MPa and sintering at 1400 °C in an argon atmosphere. Compared to commercial SS316 and fabricated SS316L, the composites had 7.4 times and 11 times lower wear rates, respectively. Commercial 90WC-10Co, however, showed a 2.2 times lower wear rate than the fabricated SS316L-Al2O3 composites, primarily due to the higher ceramic content of 90 wt% in the reference WC-Co. Eliminating the relatively high porosity of about 5 vol% using processes such as hot isostatic pressing (HIP) and hot pressing can be expected to lead to further substantial improvements in the composites' wear resistance.
Keywords: SS316L, Al2O3, powder metallurgy, wear characterization
Procedia PDF Downloads 304
1556 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve the existing methods of flight data recording and the design considerations for a futuristic flight data recorder (FDR), to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are now available, coupled with faster computer processing techniques, the failsafe recording technique developed in this paper is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
1555 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention and detection. However, such data usually suffer from highly unbalanced class distributions. In this paper, we target binary imbalanced datasets, in which the positive samples constitute only a minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole; the other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger datasets, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shorter running times compared with the brute-force method.
Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
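Setting the meta-heuristic tuning aside, the core SMOTE step can be sketched in a few lines: each synthetic minority sample is a random interpolation between an existing minority sample and one of its k nearest minority neighbors. This is a minimal illustrative sketch, not the authors' implementation; the two key SMOTE parameters being tuned would typically be the oversampling amount (`n_new` here) and the neighbor count `k`.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=2, seed=0):
    # Minimal SMOTE sketch: each synthetic point is a random interpolation
    # between a minority sample and one of its k nearest minority neighbors.
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(d)[1:k + 1]   # k nearest, skipping the point itself
        j = rng.choice(neighbors)
        lam = rng.random()                   # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Three minority samples in 2-D; generate four synthetic ones
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
X_new = smote_oversample(X_min, n_new=4)
```

Because every synthetic point lies on a segment between two minority samples, the oversampled class stays inside the region spanned by the originals.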
Procedia PDF Downloads 441
1554 Factors Influencing the Logistics Services Providers' Performance: A Literature Overview
Authors: A. Aguezzoul
Abstract:
The selection and performance evaluation of Logistics Services Providers (LSPs) is a strategic decision that affects the overall performance of any company as well as its supply chain. It is a complex process that takes into account various conflicting quantitative and qualitative factors, as well as the outsourced logistics activities. This article focuses on the evolution of the weights associated with these factors over recent years, in order to better understand the change in the importance that logistics professionals place on these criteria when choosing their LSPs. To this end, an analysis of 17 main studies published during the 2014-2017 period was carried out, and the results are compared with those of a previous literature review on this subject. Our analysis allowed us to deduce the following observations: 1) the LSP selection is a multi-criteria process; 2) the majority of studies are empirical, conducted particularly in Asian countries; 3) the importance of the criteria has undergone significant changes following the emergence of information technologies that have favored close collaboration and partnership between LSPs and their customers, even on a worldwide scale; 4) the cost criterion is relatively less important than in the past; and finally 5) with the development of sustainable supply chains, the factors associated with the logistics activities of returns and waste processing (reverse logistics) are becoming increasingly important in this multi-criteria process of selecting LSPs and evaluating their performance.
Keywords: logistics outsourcing, logistics providers, multi-criteria decision making, performance
Procedia PDF Downloads 154
1553 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar
Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna
Abstract:
This paper presents an analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), RMA mitigates space-variant defocusing and geometric distortion effects, since it does not assume that the illuminating wave-fronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range de-skew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the imaging geometry on the resolution of the final image. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated.
Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation
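The matched filtering stage rests on the same principle as the classical 1-D matched filter: correlate the received signal with a time-reversed, conjugated copy of the transmitted template, and the correlation peak marks the template's delay. The following is a simplified 1-D sketch of that principle, not the 2-D SAR processing chain itself:

```python
import numpy as np

def matched_filter(signal, template):
    # Convolving with the time-reversed conjugate template is equivalent to
    # cross-correlation; the output peaks where the template aligns.
    kernel = np.conj(template[::-1])
    return np.convolve(signal, kernel, mode="valid")

# Embed a short template at sample offset 5 inside a longer signal
template = np.array([1.0, -1.0, 1.0, 1.0])
signal = np.concatenate([np.zeros(5), template, np.zeros(3)])

out = matched_filter(signal, template)
delay = int(np.argmax(out))   # index where the template starts
```

In SAR, the same operation is carried out in two dimensions against the reference signal of a point scatterer, compressing the echo energy before the Fourier transform and interpolation stages.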
Procedia PDF Downloads 241
1552 Nagabhasma Preparation and Its Effect on Kidneys: A Histopathological Study
Authors: Lydia Andrade, Kumar M. R. Bhat
Abstract:
Heavy metals, especially lead, are considered multi-organ toxicants. Nevertheless, such heavy metals are used in the preparation of traditional medicines. Nagabhasma is one of these traditional medicines, and lead is the metal used in its preparation. Lead is converted into a health-beneficial organometallic compound when subjected to various traditional methods of purification. This study is therefore designed to evaluate the effect of lead processed through the various stages of traditional Nagabhasma preparation on the histological structure of the kidneys. Using human-equivalent doses, the products of the various stages of Nagabhasma preparation were fed orally to rats for 30 days and 60 days (short term and long term). The treated and untreated rats were then sacrificed, and their kidneys were collected and processed for histopathological study. The results show severe changes in the histological structure of the kidneys. The animals treated with lead acetate showed changes in the epithelial cells lining Bowman's capsule. The proximal and distal convoluted tubules were dilated, leading to atrophy of their epithelial cells. The amount of inflammatory infiltrate was greater in this group, and a few groups also showed pockets of inter-tubular hemorrhage. These changes, however, were minimized as the preparation progressed from stage 1 to stage 4. It is therefore necessary to stringently monitor the processing of lead acetate during the preparation of Nagabhasma.
Keywords: heavy metals, kidneys, lead acetate, Nagabhasma
Procedia PDF Downloads 146
1551 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors, in order to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing, followed by human and outer-shape detection techniques. Next, we extract valuable information in the form of cues, using two distinct features: fuzzy local binary patterns and sequence representation. We then apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and use a random forest for classification. We tested our model on two benchmark datasets, AAMAZ and the KTH multi-view football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches, achieving recognition rates of 91% and 89.6% on the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
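The fuzzy local binary pattern feature mentioned above builds on the classical LBP descriptor. As a minimal sketch (the crisp, non-fuzzy 3×3 form; the fuzzy variant replaces the hard threshold with membership degrees), the code for one pixel neighborhood can be computed as:

```python
import numpy as np

def lbp_code(patch):
    # Basic 3x3 local binary pattern: threshold the 8 neighbors against the
    # center pixel and pack the resulting bits (clockwise from top-left).
    center = patch[1, 1]
    neighbors = [patch[0, 0], patch[0, 1], patch[0, 2],
                 patch[1, 2], patch[2, 2], patch[2, 1],
                 patch[2, 0], patch[1, 0]]
    code = 0
    for bit, value in enumerate(neighbors):
        if value >= center:
            code |= 1 << bit
    return code

# Toy 3x3 grayscale patch
patch = np.array([[9, 1, 7],
                  [2, 5, 6],
                  [3, 8, 4]])
code = lbp_code(patch)
```

A histogram of these codes over an image region yields a texture descriptor that is robust to monotonic brightness changes, which is why LBP variants suit the lighting variation noted in the abstract.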
Procedia PDF Downloads 36
1550 Automatic Generating CNC-Code for Milling Machine
Authors: Chalakorn Chitsaart, Suchada Rianmora, Mann Rattana-Areeyagon, Wutichai Namjaiprasert
Abstract:
G-code is the main factor in computer numerical control (CNC) machines for controlling the tool paths and generating the profile of the object's features. To obtain high accuracy of the surface finish, non-stop operation of the CNC machine is required. Recently, in new product design, strategies have been introduced that favor changes with low impact on the business and low resource consumption: the cost and time of designing minor changes can be reduced, since the traditional geometric details of the existing models are applied. To support this strategy as an alternative channel for machining operations, this research proposes the automatic generation of codes for CNC milling operations. This technique can assist the manufacturer in easily changing the size and geometric shape of the product during the operation, reducing the time spent setting up and processing the machine. The algorithm, implemented on the MATLAB platform, is developed by analyzing and evaluating the geometric information of the part, and codes are created rapidly to control the operations of the machine. Compared to the codes obtained from CAM, the developed algorithm can quickly generate and simulate the cutting profile of the part.
Keywords: geometric shapes, milling operation, minor changes, CNC machine, G-code, cutting parameters
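As an illustration of parametric G-code emission (a toy sketch only, not the authors' MATLAB algorithm), a rectangular profile cut can be generated from its dimensions, so that changing the size of the feature regenerates the program instantly:

```python
def rectangle_gcode(width, height, depth, feed=200):
    # Emit a minimal G-code block that mills a rectangular profile:
    # rapid to the origin, plunge to depth, four linear cuts, then retract.
    lines = [
        "G21 ; units in mm",
        "G90 ; absolute coordinates",
        "G0 X0 Y0 Z5",
        f"G1 Z{-depth} F{feed}",
        f"G1 X{width} Y0 F{feed}",
        f"G1 X{width} Y{height}",
        f"G1 X0 Y{height}",
        "G1 X0 Y0",
        "G0 Z5 ; retract",
    ]
    return "\n".join(lines)

# A 40 mm x 20 mm rectangle cut 1.5 mm deep
program = rectangle_gcode(40, 20, 1.5)
```

G0/G1 (rapid/linear moves), G21 (millimeter units) and G90 (absolute positioning) are standard G-code words; real tool paths would add tool-radius compensation and multiple depth passes.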
Procedia PDF Downloads 349
1549 Evaluation of the Incorporation of Modified Starch in Puff Pastry Dough by Mixolab Rheological Analysis
Authors: Alejandra Castillo-Arias, Carlos A. Fuenmayor, Carlos M. Zuluaga-Domínguez
Abstract:
The connection between health and nutrition has driven the food industry to explore healthier and more sustainable alternatives. Key strategies to enhance nutritional quality and extend shelf life include reducing saturated fats and incorporating natural ingredients. One area of focus is the use of modified starch in baked goods, which has attracted significant interest in food science and industry due to its functional benefits. Modified starches are commonly used for their gelling, thickening, and water-retention properties. Derived from sources like waxy corn, potatoes, tapioca, or rice, these polysaccharides improve thermal stability and resistance to dough. The use of modified starch enhances the texture and structure of baked goods, which is crucial for consumer acceptance. In this study, it was evaluated the effects of modified starch inclusion on dough used for puff pastry elaboration, measured with Mixolab analysis. This technique assesses flour quality by examining its behavior under varying conditions, providing a comprehensive profile of its baking properties. The analysis included measurements of water absorption capacity, dough development time, dough stability, softening, final consistency, and starch gelatinization. Each of these parameters offers insights into how the flour will perform during baking and the quality of the final product. The performance of wheat flour with varying levels of modified starch inclusion (10%, 20%, 30%, and 40%) was evaluated through Mixolab analysis, with a control sample consisting of 100% wheat flour. Water absorption, gluten content, and retrogradation indices were analyzed to understand how modified starch affects dough properties. The results showed that the inclusion of modified starch increased the absorption index, especially at levels above 30%, indicating a dough with better handling qualities and potentially improved texture in the final baked product. 
However, the reduction in wheat flour resulted in a lower kneading index, affecting dough strength. Conversely, incorporating more than 20% modified starch reduced the retrogradation index, indicating improved stability and resistance to crystallization after cooling. Additionally, the modified starch improved the gluten index, contributing to better dough elasticity and stability, providing good structural support and resistance to deformation during mixing and baking. As expected, the control sample exhibited a higher amylase index, due to the presence of enzymes in wheat flour. However, this is of low concern in puff pastry dough, as amylase activity is more relevant in fermented doughs, which is not the case here. Overall, the use of modified starch in puff pastry enhanced product quality by improving texture, structure, and shelf life, particularly when used at levels between 30% and 40%. This research underscores the potential of modified starches to address health concerns associated with traditional starches and to contribute to the development of higher-quality, consumer-friendly baked products. Furthermore, the findings suggest that modified starches could play a pivotal role in future innovations within the baking industry, particularly in products aiming to balance healthfulness with sensory appeal. By incorporating modified starch into their formulations, bakeries can meet the growing demand for healthier, more sustainable products while maintaining the indulgent qualities that consumers expect from baked goods.
Keywords: baking quality, dough properties, modified starch, puff pastry
Procedia PDF Downloads 22
1548 Information Theoretic Approach for Beamforming in Wireless Communications
Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif
Abstract:
Beamforming is a signal processing technique extensively utilized in wireless communications and radars for desired-signal intensification and interference-signal minimization through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function, which is based on a-priori information of the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI is presented as an index to identify the trade-off between information gain, SINR, illumination time, and spatial selectivity in an energy-constrained optimization problem. The employed method has lower computational complexity, which is demonstrated through comparative analysis with conventional methods in use. MI-based beamforming offers enhanced signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
Keywords: beamforming, interference, mutual information, wireless communications
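The weight-vector computation described in this abstract can be sketched with a standard minimum-variance (MVDR-style) beamformer that exploits a-priori knowledge of the interferer direction. This is a conventional baseline rather than the authors' MI-based objective, and the array size, angles, and noise power below are purely illustrative:

```python
import numpy as np

def steering_vector(n_elements, theta_deg, spacing=0.5):
    """Array response of a uniform linear array (half-wavelength spacing)."""
    n = np.arange(n_elements)
    return np.exp(2j * np.pi * spacing * n * np.sin(np.radians(theta_deg)))

def mvdr_weights(n_elements, theta_signal, theta_interf, noise_power=0.01):
    """Weights that pass the desired direction undistorted and null the interferer."""
    a_s = steering_vector(n_elements, theta_signal)
    a_i = steering_vector(n_elements, theta_interf)
    # Interference-plus-noise covariance, built from a-priori interferer knowledge
    R = np.outer(a_i, a_i.conj()) + noise_power * np.eye(n_elements)
    w = np.linalg.solve(R, a_s)
    return w / (a_s.conj() @ w)  # enforce the distortionless constraint w^H a_s = 1

def sinr_db(w, theta_signal, theta_interf, noise_power=0.01):
    """Output SINR of a weight vector against one interferer plus white noise."""
    a_s = steering_vector(len(w), theta_signal)
    a_i = steering_vector(len(w), theta_interf)
    p_sig = np.abs(w.conj() @ a_s) ** 2
    p_int = np.abs(w.conj() @ a_i) ** 2 + noise_power * np.linalg.norm(w) ** 2
    return 10 * np.log10(p_sig / p_int)

# 8-element array, desired signal at 10 degrees, interferer at 40 degrees
w = mvdr_weights(8, theta_signal=10.0, theta_interf=40.0)
```

Comparing `sinr_db(w, ...)` against uniform (conventional) weights steered to the same direction shows the SINR improvement that spatial nulling buys.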
Procedia PDF Downloads 280
1547 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue origin of these DNA fragments from the plasma can result in more accurate and fast disease diagnosis and precise treatment protocols. Open chromatin regions are important epigenetic features of DNA that reflect cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to a huge amount of cost and time. To overcome these limitations, the idea of predicting OCRs from WGS is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples in the ATAC-DB database. The percentage of overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS, which showed accordance of around 52.04% and ~78% with all genes and the housekeeping genes, respectively. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, which led to a tool named OCRDetector with some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
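The depth-based pipeline (count normalization, DFT smoothing, clustering of bins into OCR+/OCR-) can be sketched on synthetic coverage data. A simple 1-D two-means split stands in for the paper's graph-cut correlation-clustering step, and the track length, window positions, and smoothing fraction are all illustrative assumptions:

```python
import numpy as np

def smooth_depth_fft(depth, keep_frac=0.05):
    """Denoise a per-bin depth track by keeping only low-frequency DFT terms."""
    spec = np.fft.rfft(depth - depth.mean())
    cutoff = max(1, int(len(spec) * keep_frac))
    spec[cutoff:] = 0.0
    return np.fft.irfft(spec, n=len(depth)) + depth.mean()

def cluster_ocr(depth):
    """Label each bin OCR+ (nucleosome-depleted, low coverage) or OCR-.

    A 1-D two-means split on the smoothed signal stands in for the
    graph-cut / correlation-clustering step of the actual method."""
    z = (depth - depth.mean()) / depth.std()   # count normalization
    s = smooth_depth_fft(z)                    # DFT-based smoothing
    lo, hi = s.min(), s.max()
    for _ in range(20):                        # iterate two cluster means
        mid = (lo + hi) / 2
        lo = s[s <= mid].mean()
        hi = s[s > mid].mean()
    return s <= (lo + hi) / 2                  # True = OCR+ (depleted coverage)

# Synthetic track: flat coverage with two depleted (open) windows plus noise
rng = np.random.default_rng(0)
depth = np.full(1000, 30.0)
depth[200:260] -= 15
depth[700:780] -= 15
depth += rng.normal(0, 2, 1000)
labels = cluster_ocr(depth)
```

On this toy track the two depleted windows come back labeled OCR+ while the flat background stays OCR-.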
Procedia PDF Downloads 150
1546 Non-Contact Measurement of Soil Deformation in a Cyclic Triaxial Test
Authors: Erica Elice Uy, Toshihiro Noda, Kentaro Nakai, Jonathan Dungca
Abstract:
Deformation in a conventional cyclic triaxial test is normally measured using a point-wise measuring device. In this study, a non-contact measurement technique was applied to monitor and measure the occurrence of non-homogeneous behavior of the soil under cyclic loading. Non-contact measurement is executed through image processing. Two-dimensional measurements were performed using the Lucas and Kanade optical flow algorithm, implemented in LabVIEW. In this technique, the non-homogeneous deformation was monitored using a mirrorless camera. A mirrorless camera was used because it is economical and has the capacity to take pictures at a fast rate. The camera was first calibrated to remove the distortion brought about by the lens and the testing environment. Calibration was divided into two phases. The first phase was the calibration of the camera parameters and the distortion caused by the lens. The second phase eliminated the distortion brought about by the triaxial plexiglass; a correction factor was established from this phase. A series of consolidated undrained cyclic triaxial tests was performed using a coarse soil. The results from the non-contact measurement technique were compared to the deformation measured by the linear variable displacement transducer. It was observed that deformation was higher at the area where failure occurs.
Keywords: cyclic loading, non-contact measurement, non-homogeneous, optical flow
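The Lucas and Kanade step can be illustrated with a single-window least-squares solve on a synthetic image pair. This is a minimal NumPy sketch, not the authors' LabVIEW implementation; the test pattern and the one-pixel shift are assumptions made for the example:

```python
import numpy as np

def lucas_kanade_window(img0, img1):
    """Single-window Lucas-Kanade: solve the 2x2 normal equations
    [sum Ix^2, sum IxIy; sum IxIy, sum Iy^2] v = -[sum IxIt, sum IyIt]."""
    Ix = np.gradient(img0, axis=1)   # spatial gradients of the first frame
    Iy = np.gradient(img0, axis=0)
    It = img1 - img0                 # temporal gradient between frames
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)     # (vx, vy) in pixels per frame

# Synthetic textured pattern shifted right by 1 pixel between frames
y, x = np.mgrid[0:64, 0:64]
img0 = np.sin(0.3 * x) * np.cos(0.25 * y)
img1 = np.sin(0.3 * (x - 1.0)) * np.cos(0.25 * y)
vx, vy = lucas_kanade_window(img0, img1)
```

The recovered flow is close to (1, 0) pixels, matching the imposed shift; in the actual test, such displacement fields would be tracked over the specimen surface during cyclic loading.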
Procedia PDF Downloads 301
1545 Stabilizing Additively Manufactured Superalloys at High Temperatures
Authors: Keivan Davami, Michael Munther, Lloyd Hackel
Abstract:
The control of properties and material behavior by implementing thermal-mechanical processes is based on mechanical deformation and annealing according to a precise schedule that will produce a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) that controls this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhance the material’s resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover by mechanisms such as climbing and recombining rapidly at high temperatures.
Furthermore, precipitates coarsen and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive and the microstructure is stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, dislocations formed as a result of LP and precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. This research could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex interactions between dislocations, solute atoms, and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the actual mechanisms involved in the novel cyclic LP/annealing process are elucidated through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help with the validation of a novel laser processing technique for high-temperature applications. This will greatly expand the applications of laser peening technology, originally devised only for temperatures lower than half of the melting temperature.
Keywords: laser shock peening, mechanical properties, indentation, high temperature stability
Procedia PDF Downloads 149
1544 The Evolution of the Human Brain from the Hind Brain to the Fore Brain: Dialectics from the African Perspective in Understanding Stunted Development in Science and Technology
Authors: Philemon Wokoma Iyagba, Obey Onenee Christie
Abstract:
From the hindbrain, which is responsible for motor activities, to the forebrain, responsible for processing information related to complex cognitive activities, the human brain has continued to evolve over the years. This evolution has been progressive, leading to advancements in science and technology. However, the development of science and technology in Africa, where ancient civilization arguably began, has been retrogressive. Dialectics was applied by dissecting different opinions on the reasons behind the stunted development of science and technology in Africa. The researchers propose that the inability to sustain the technological advancements made by early Africans is due to the poor replicability of the African knowledge-based system, poor or absent documentation of adopted procedures, and the approval-seeking mentality that cheaply paved the way for westernization, which in turn led to the adulteration of the African way of life and education without making room for incorporating her identity, the proper alignment of her rich cultural heritage in education, and her enormous achievements before and during the middle age. This article discusses conceptual issues, with its positions based on established facts; the discussion draws on relevant literature, and recommendations are made accordingly.
Keywords: forebrain, hindbrain, dialectics from African perspective, development in science and technology
Procedia PDF Downloads 77
1543 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods
Authors: Bandar Alahmadi, Lethia Jackson
Abstract:
Machine learning models are used today in many real-life applications. The safety and security of such models are important so that their results are as accurate as possible. One challenge of machine learning model security is the adversarial examples attack. Adversarial examples are designed by the attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack to rotate the detected region around its axes, embedding the trace of the image in the image background. Finally, the attacked region is placed in its original position, from where it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier; the algorithm is efficient, and the classifier confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm worked successfully.
Keywords: adversarial examples, attack, computer vision, image processing
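The region-modification steps (rotate the detected region around its axes, place it back in its original position, smooth foreground into background) might be sketched as follows. Region detection itself and the classifier are omitted; the box coordinates, the 180-degree rotation, and the crude mean-based feathering are illustrative choices, not the paper's exact pipeline:

```python
import numpy as np

def rotate_region_attack(image, box, k=2):
    """Rotate a detected region in place (k * 90 degrees) and feather the seam.

    `box` = (top, left, height, width) is assumed to come from a separate
    region detector (e.g. a cascade detector), which is not shown here."""
    top, left, h, w = box
    out = image.astype(float).copy()
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = np.rot90(region, k)  # rotation attack
    # Crude feathering: average thin strips along the seam so the modified
    # foreground blends with the surrounding background
    for t, l, hh, ww in [(top - 1, left - 1, 2, w + 2),
                         (top + h - 1, left - 1, 2, w + 2),
                         (top - 1, left - 1, h + 2, 2),
                         (top - 1, left + w - 1, h + 2, 2)]:
        strip = out[t:t + hh, l:l + ww]
        out[t:t + hh, l:l + ww] = strip.mean()
    return out

# Toy grayscale "image" with a detected 4x4 region at (3, 3)
img = np.arange(100, dtype=float).reshape(10, 10)
adv = rotate_region_attack(img, box=(3, 3, 4, 4))
```

The interior of the region ends up rotated while its border is blended, leaving the overall image visually similar but locally perturbed.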
Procedia PDF Downloads 339
1542 The Global-Local Dimension in Cognitive Control after Left Lateral Prefrontal Cortex Damage: Evidence from the Non-Verbal Domain
Authors: Eleni Peristeri, Georgia Fotiadou, Ianthi-Maria Tsimpli
Abstract:
The local-global dimension has been studied extensively in healthy controls, and preference for globally processed stimuli has been validated in both the visual and auditory modalities. Critically, the local-global dimension has an inherent interference-resolution component, a type of cognitive control, and left-prefrontal-cortex-damaged (LPFC) individuals have exhibited an inability to override habitual response behaviors in item recognition tasks that involve representational interference. Eight patients with damage in the left PFC (age range: 32;5 to 69;0; mean age: 54;6 yrs) and twenty age- and education-matched language-unimpaired adults (mean age: 56;7 yrs) participated in the study. Distinct performance patterns were found between the language-unimpaired and the LPFC-damaged groups, which mainly stemmed from the latter’s difficulty with inhibiting global stimuli in incongruent trials. Overall, the local-global attentional dimension affects LPFC-damaged individuals with non-fluent aphasia in non-language domains, implicating distinct types of inhibitory processes depending on the level of processing.
Keywords: left lateral prefrontal cortex damage (LPFC), local-global non-language attention, representational interference, non-fluent aphasia
Procedia PDF Downloads 469
1541 Denoising of Motor Unit Action Potential Based on Tunable Band-Pass Filter
Authors: Khalida S. Rijab, Mohammed E. Safi, Ayad A. Ibrahim
Abstract:
When electrical electrodes are mounted on the skin surface of the muscle, a signal is detected when a skeletal muscle undergoes contraction; the signal is known as the surface electromyographic signal (EMG). This signal has a noise-like interference pattern resulting from the temporal and spatial summation of the action potentials (AP) of all active motor units (MU) near the detection electrode. By appropriate processing (decomposition), the surface EMG signal may be used to give an estimate of the motor unit action potential. In this work, a denoising technique is applied to the MUAP signals extracted from the spatial filter (IB2). A set of signals from a non-invasive two-dimensional grid of 16 electrodes was recorded from different types of subjects, muscles, and sexes. These signals acquire noise during recording and detection. A digital fourth-order band-pass Butterworth filter is used for denoising; a suitable choice of cutoff frequencies is investigated, with the aim of obtaining a suitable band-pass frequency range. Results show that an improvement of 1-3 dB in the signal to noise ratio (SNR) was achieved, relative to the raw spatial filter output signals, for all cases under investigation. Furthermore, the research also aimed at estimating and reconstructing the mean shape of the MUAP.
Keywords: EMG, motor unit, digital filter, denoising
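The band-pass denoising and SNR comparison can be illustrated on a synthetic MUAP-like burst. Here an FFT bin mask stands in for the fourth-order Butterworth filter of the paper, and the signal model, sampling rate, and cutoff frequencies are assumptions made for the example:

```python
import numpy as np

def bandpass_fft(x, fs, f_lo, f_hi):
    """Zero-phase band-pass by zeroing FFT bins outside [f_lo, f_hi]
    (a stand-in for the tuned 4th-order Butterworth filter)."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(x))

def snr_db(clean, observed):
    """SNR of an observed signal against a known clean reference."""
    noise = observed - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

# Synthetic MUAP-like burst (energy near 80 Hz) buried in broadband noise
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
clean = np.exp(-((t - 0.5) / 0.02) ** 2) * np.sin(2 * np.pi * 80 * t)
rng = np.random.default_rng(1)
noisy = clean + rng.normal(0, 0.2, t.size)
denoised = bandpass_fft(noisy, fs, 20.0, 150.0)
gain = snr_db(clean, denoised) - snr_db(clean, noisy)
```

Because the burst's energy sits inside the 20-150 Hz pass band while the noise is broadband, the filter removes most out-of-band noise power and `gain` comes out as a several-dB SNR improvement, analogous to the 1-3 dB reported on real recordings.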
Procedia PDF Downloads 401
1540 Inflammatory Alleviation on Microglia Cells by an Apoptotic Mimicry
Authors: Yi-Feng Kao, Huey-Jine Chai, Chin-I Chang, Yi-Chen Chen, June-Ru Chen
Abstract:
Microglia are macrophages that reside in the brain, and overactive microglia may result in brain neuron damage or inflammation. In this study, phospholipids were extracted from squid skin and manufactured into a liposome (SQ liposome) to mimic an apoptotic body. We then evaluated the anti-inflammatory effects of SQ liposome on a mouse microglial cell line (BV-2) under lipopolysaccharide (LPS) induction. First, the major phospholipid constituents in the squid skin extract were 46.2% phosphatidylcholine, 18.4% phosphatidylethanolamine, 7.7% phosphatidylserine, 3.5% phosphatidylinositol, 4.9% lysophosphatidylcholine, and 19.3% other phospholipids, by HPLC-UV analysis. The contents of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) in the squid skin extract were 11.8 and 28.7%, respectively. Microscopic images showed that microglia cells can engulf apoptotic cells or SQ liposomes. In cell-based studies, there was no cytotoxicity to BV-2 when the concentration of SQ liposome was less than 2.5 mg/mL. The LPS-induced pro-inflammatory cytokines, including tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6), were significantly suppressed (P < 0.05) by pretreatment with 0.03-2.5 mg/mL SQ liposome. Conversely, secretion of the anti-inflammatory cytokines transforming growth factor-beta (TGF-β) and interleukin-10 (IL-10) was enhanced (P < 0.05). The results suggest that SQ liposome possesses anti-inflammatory properties in BV-2 and may be a good strategy against neuroinflammatory disease.
Keywords: apoptotic mimicry, neuroinflammation, microglia, squid processing by-products
Procedia PDF Downloads 483
1539 Anthropometric Data Variation within Gari-Frying Population
Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe
Abstract:
The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements within the population for which data is collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the Southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo, and Ekiti. Twenty-seven body dimensions were measured on 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value, and percentiles (2nd, 5th, 25th, 50th, 75th, 95th, and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between the workers’ age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing), and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m, and 0.67 m, respectively. Results also show a high correlation with other populations and a statistically significant difference in variability within the population in all the body dimensions measured. With a mean age of 42.36 years, results show that age would be a poor indicator for estimating the anthropometry of this population.
Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design
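The percentile summary and one-sample t statistic described above can be sketched in a few lines. The stature sample below is simulated around the reported mean height (1.57 m), not the study's actual data, and the SPSS p-value lookup is omitted (only the t statistic is computed):

```python
import numpy as np

def percentile_summary(values, pcts=(2, 5, 25, 50, 75, 95, 98)):
    """Mean, SD, and the design percentiles used in the study for one dimension."""
    v = np.asarray(values, dtype=float)
    return {"mean": v.mean(), "sd": v.std(ddof=1),
            **{f"P{p}": np.percentile(v, p) for p in pcts}}

def one_sample_t(values, mu0):
    """t statistic for H0: population mean == mu0 (the SPSS one-sample t-test)."""
    v = np.asarray(values, dtype=float)
    return (v.mean() - mu0) / (v.std(ddof=1) / np.sqrt(v.size))

# Hypothetical stature sample (m) standing in for the 120 processors measured
rng = np.random.default_rng(7)
stature = rng.normal(1.57, 0.06, 120)
summary = percentile_summary(stature)
t_stat = one_sample_t(stature, mu0=1.57)
```

Designers would typically size clearances to the 95th or 98th percentile and reaches to the 5th percentile of such a summary, which is why the study reports the full percentile table rather than the mean alone.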
Procedia PDF Downloads 253