Search results for: thin film processing
1638 Assessment of the Potential of Fuel-derived Rice Husk Ash as Pozzolanic Material
Authors: Jesha Faye T. Librea, Leslie Joy L. Diaz
Abstract:
Fuel-derived rice husk ash (fRHA) is a waste material from industries employing rice husk as a biomass fuel and, on the downside, causes disposal and environmental problems. To mitigate this, fRHA was evaluated for use in other applications, such as a pozzolanic material for the construction industry. In this study, the potential of fRHA as a pozzolanic supplementary cementitious material was assessed by determining its chemical and physical properties according to ASTM C618, evaluating its fineness according to ASTM C430, and determining its pozzolanic activity using the Luxan method. The material was found to have a high amorphous silica content of around 95.82% with traces of alkaline and carbon impurities. The retained carbon residue is 7.18%, which is within the limit of the specifications for natural pozzolans indicated in ASTM C618. The fineness of the fRHA, with 88.88% retained on a 45-micron sieve, however, exceeded the limit of 34%. This large particle size distribution was found to affect the pozzolanic activity of the fRHA, as shown in the Luxan test, where the fRHA was identified as a non-pozzolan due to its low pozzolanic activity index of 0.262. Thus, further processing of the fRHA is needed for it to pass the required ASTM fineness, achieve a higher pozzolanic activity index, and fully qualify as a pozzolanic material.
Keywords: rice husk ash, pozzolanic, fuel-derived ash, supplementary cementitious material
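The Luxan classification used above rates pozzolanicity from the drop in electrical conductivity of a saturated Ca(OH)₂ solution after the material is added. The thresholds below (in mS/cm) follow the commonly cited Luxan criteria; they are quoted from memory, not from this abstract, so verify them against the original method before relying on them.

```python
# Luxan classification sketch: thresholds are the commonly cited
# values (< 0.4 non-pozzolanic, 0.4-1.2 variable, > 1.2 good),
# taken as an assumption, not from this paper.

def luxan_class(conductivity_drop_ms_cm):
    if conductivity_drop_ms_cm < 0.4:
        return "non-pozzolanic"
    if conductivity_drop_ms_cm <= 1.2:
        return "variable pozzolanicity"
    return "good pozzolanicity"

# The fRHA in this study showed an index of 0.262, hence non-pozzolan.
print(luxan_class(0.262))  # → non-pozzolanic
```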
Procedia PDF Downloads 66
1637 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification
Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh
Abstract:
Learning from very large datasets is a significant problem for most present-day data mining and machine learning algorithms. MicroRNA (miRNA) data are among the important large genomic and non-coding datasets representing genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers. The number of features is high relative to the number of samples, and the data suffer from class imbalance. A feature selection method is used to select the features best able to distinguish between classes and to eliminate obscure features. Afterward, a Convolutional Neural Network (CNN) classifier is utilized for the classification of cancer types, employing a Genetic Algorithm to find optimized hyper-parameters for the CNN. To speed up classification with the CNN, a Graphics Processing Unit (GPU) is recommended for performing the mathematical computations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
Keywords: cancer classification, feature selection, deep learning, genetic algorithm
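The abstract does not specify which feature selection method is used, but a minimal sketch of the filter-style idea — rank features by how well they separate classes, keep the top k — can be written with a Fisher-like score. The data, class names, and scoring choice below are illustrative assumptions, not the authors' method.

```python
# Illustrative filter-style feature selection: rank features by a
# Fisher-like score (between-class mean separation over within-class
# spread) and keep the top k most discriminative features.

def fisher_score(values_a, values_b):
    """Separation score for one feature across two classes."""
    mean = lambda xs: sum(xs) / len(xs)
    var = lambda xs, m: sum((x - m) ** 2 for x in xs) / len(xs)
    ma, mb = mean(values_a), mean(values_b)
    va, vb = var(values_a, ma), var(values_b, mb)
    return (ma - mb) ** 2 / (va + vb + 1e-12)

def select_top_k(samples_a, samples_b, k):
    """samples_*: feature vectors per class; returns indices of top-k features."""
    n_features = len(samples_a[0])
    scores = []
    for j in range(n_features):
        col_a = [s[j] for s in samples_a]
        col_b = [s[j] for s in samples_b]
        scores.append((fisher_score(col_a, col_b), j))
    scores.sort(reverse=True)
    return [j for _, j in scores[:k]]

# Feature 0 separates the classes; feature 1 is pure noise.
class_a = [[5.0, 0.1], [5.2, 0.9], [4.9, 0.5]]
class_b = [[1.0, 0.4], [1.1, 0.8], [0.9, 0.2]]
print(select_top_k(class_a, class_b, 1))  # → [0]
```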
Procedia PDF Downloads 111
1636 Colour Quick Response Code with High Damage Resistance Capability
Authors: Minh Nguyen
Abstract:
Today, QR (Quick Response) Codes are prevalent, and mobile/smart devices can efficiently read and understand them. We therefore see them in many areas, such as storing web pages/websites, business phone numbers, app download redirects, business locations, and social media links. The popularity of the QR Code stems mainly from its many advantages: it can hold a good amount of information, is small, is easy to scan and read with a general RGB camera, and can still work with some damage to its surface. However, some issues remain. For instance, some areas must be kept untouched for successful decoding (e.g., the "Finder Patterns" and the "Quiet Zone"), the built-in error correction is not robust enough, and the code is not flexible enough for applications such as Augmented Reality (AR). We propose a new Colour Quick Response Code that has several advantages over the original: (1) there is no untouchable area, (2) it allows up to 40% of the entire code area to be damaged, (3) it is more suitable for Augmented Reality applications, and (4) it is backward-compatible and readable by available QR Code scanners such as Pyzbar. In our experience, the Colour Quick Response Code is significantly more tolerant of damage than the original QR Code. Our code is believed to be suitable in situations where standard 2D barcodes fail to work, such as on curved and shiny surfaces, for instance, medical blood test sample tubes and syringes.
Keywords: QR code, computer vision, image processing, 2D barcode
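The claimed 40% damage tolerance rests on redundancy in the code layout. As a toy illustration of the principle only (not the authors' actual coding scheme, which the abstract does not describe), a repetition code replicates each bit five times and decodes by majority vote, surviving corruption of up to two of the five copies per bit:

```python
# Repetition-code sketch of damage tolerance: each bit is stored
# five times; majority vote recovers it while fewer than half of
# the copies are corrupted.

import random

def encode(bits, copies=5):
    return [b for b in bits for _ in range(copies)]

def decode(coded, copies=5):
    out = []
    for i in range(0, len(coded), copies):
        chunk = coded[i:i + copies]
        out.append(1 if sum(chunk) > copies // 2 else 0)
    return out

message = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(message)

# Flip exactly 2 of each group of 5 copies (40% of all symbols),
# so every group still holds a majority of correct copies.
rng = random.Random(7)
for i in range(0, len(coded), 5):
    for j in rng.sample(range(5), 2):
        coded[i + j] ^= 1

print(decode(coded) == message)  # → True
```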
Procedia PDF Downloads 118
1635 Simulation and Experimental Research on Pocketing Operation for Toolpath Optimization in CNC Milling
Authors: Rakesh Prajapati, Purvik Patel, Avadhoot Rajurkar
Abstract:
Nowadays, manufacturing industries augment their production lines with modern machining centers backed by CAM software. Several attempts are being made to cut down the programming time for machining complex geometries. Special programs/software have been developed to generate the digital numerical data and to prepare NC programs using suitable post-processors for different machines. After the tools and the manufacturing process are selected, tool paths are applied and the NC program is generated. More and more complex mechanical parts that were earlier cast and assembled/manufactured by other processes are now being machined. The majority of these parts require numerous pocketing operations and find applications in dies and molds, turbomachinery, aircraft, nuclear, defense, etc. Pocketing operations involve the removal of a large quantity of material from the metal surface. In this work, a cast food-processing part was modeled and its clamping set up using Pro-E and MasterCAM® software. The pocketing operation was specifically chosen for toolpath optimization. After applying the pocketing toolpath with Multi Tool Selection and Reduce Air Time, the software simulation time was compared against the experimental machining time.
Keywords: toolpath, part program, optimization, pocket
Procedia PDF Downloads 288
1634 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation are presented, starting with data collection from its original source, the treatment and transformations applied to the data, the choice, evaluation and implementation of the Machine Learning model, and finally the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, contributing to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to request predictions in real time. An application is also presented where criminal predictions can be shown visually.
Keywords: crime prediction, machine learning, public safety, smart city
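One of the classifiers the paper compares, K-Nearest Neighbors, can be sketched in a few lines: predict a risk label for a (hour-of-day, district index) query from labeled historical incidents. The data, labels, and distance choice below are made up for illustration; a real model would use the paper's full feature set (and handle, e.g., hour wraparound).

```python
# Minimal K-Nearest Neighbors sketch for (hour, district) -> risk label.

import math

def knn_predict(train, query, k=3):
    """train: list of ((hour, district), label); returns majority label of k nearest."""
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

history = [
    ((23, 1), "high"), ((22, 1), "high"), ((1, 1), "high"),
    ((10, 4), "low"), ((11, 4), "low"), ((9, 5), "low"),
]
print(knn_predict(history, (22, 2)))  # → high
```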
Procedia PDF Downloads 112
1633 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning
Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park
Abstract:
The goal of this research is to estimate structural shape change using terrestrial laser scanning. This study proceeds with the development of a data reduction and shape change estimation algorithm for large-capacity scan data. The point cloud of the scan data is converted to voxels and sampled. A shape estimation technique is studied to detect changes in structural patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data serve as a relative representative value of shape information and are used as a model for detecting point cloud changes in the data structure. The shape estimation model aims to develop a technology that can detect not only normal changes but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement
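The voxel conversion and sampling step described above can be sketched by bucketing points into a cubic grid and keeping one representative (the centroid) per occupied voxel, which is the usual way scan data are reduced before change detection. The voxel size and points below are illustrative values, not from the paper.

```python
# Voxel downsampling sketch: bucket (x, y, z) points into a grid of
# cubes of side voxel_size and replace each bucket by its centroid.

from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """points: list of (x, y, z); returns one centroid per occupied voxel."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    centroids = []
    for pts in buckets.values():
        n = len(pts)
        centroids.append(tuple(sum(c) / n for c in zip(*pts)))
    return centroids

cloud = [(0.1, 0.1, 0.1), (0.2, 0.3, 0.1),   # same voxel
         (1.5, 0.2, 0.4)]                     # different voxel
print(len(voxel_downsample(cloud, voxel_size=1.0)))  # → 2
```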
Procedia PDF Downloads 235
1632 Sensor Monitoring of the Concentrations of Different Gases Present in Synthesis of Ammonia Based on Multi-Scale Entropy and Multivariate Statistics
Authors: S. Aouabdi, M. Taibi
Abstract:
The supervision of chemical processes is the subject of increased development because of increasing demands on reliability and safety. An important aspect of the safe operation of a chemical process is detecting process faults or other special events, and locating and removing the factors causing them, earlier than is possible with conventional limit and trend checks. With the aid of process models and estimation and decision methods, it is possible to monitor hundreds of variables in a single operating unit, and these variables may be recorded hundreds or thousands of times per day. In the absence of an appropriate processing method, only limited information can be extracted from these data. Hence, a tool is required that can project the high-dimensional process space into a low-dimensional space amenable to direct visualization, and that can also identify key variables and important features of the data. Our contribution is a new monitoring method based on multi-scale entropy (MSE) to characterize the behaviour of the concentrations of the different gases present in the synthesis, together with a PCA-based soft sensor applied to estimate these variables.
Keywords: ammonia synthesis, concentrations of different gases, soft sensor, multi-scale entropy, multivariate statistics
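Multi-scale entropy, as used above, is typically computed in two steps: coarse-grain the signal at each scale, then compute sample entropy on each coarse-grained series. The sketch below uses common default parameters (m = 2, tolerance r) that are assumptions, not values from the paper; a highly regular signal should score low at every scale.

```python
# Multi-scale entropy sketch: coarse-graining plus sample entropy,
# SampEn = -ln(A/B) where A and B count template matches of length
# m+1 and m under a Chebyshev tolerance r (excluding self-matches).

import math

def coarse_grain(signal, scale):
    n = len(signal) // scale
    return [sum(signal[i*scale:(i+1)*scale]) / scale for i in range(n)]

def sample_entropy(signal, m=2, r=0.2):
    def count_matches(length):
        templates = [signal[i:i+length] for i in range(len(signal) - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

def multiscale_entropy(signal, scales=(1, 2, 3)):
    return [sample_entropy(coarse_grain(signal, s)) for s in scales]

# A perfectly regular (alternating) signal has very low entropy.
regular = [0.0, 1.0] * 20
print(sample_entropy(regular) < 0.1)  # → True
```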
Procedia PDF Downloads 336
1631 Biological Activity of Bilberry Pomace
Authors: Gordana S. Ćetković, Vesna T. Tumbas Šaponjac, Sonja M. Djilas, Jasna M. Čanadanović-Brunet, Sladjana M. Stajčić, Jelena J. Vulić
Abstract:
Bilberry is one of the most important dietary sources of phenolic compounds, including anthocyanins, phenolic acids, flavonol glycosides and flavan-3-ols. These phytochemicals have different biological activities and may therefore improve our health. Anthocyanins are also of interest to the food industry as colourants. In the present study, bilberry pomace, a by-product of juice processing, was used as a potential source of bioactive compounds. The contents of total phenolic acids, flavonoids and anthocyanins in bilberry pomace were determined by HPLC/UV-Vis. The biological activities of bilberry pomace were evaluated by reducing power (RP) and α-glucosidase inhibitory potential (α-GIP), expressed as the RP0.5 value (the effective concentration of bilberry pomace extract at an absorption value of 0.5) and the IC50 value (the concentration of bilberry pomace extract necessary to inhibit 50% of α-glucosidase enzyme activity). The total phenolic acid content was 807.12 ± 25.16 mg/100 g pomace, flavonoids 54.36 ± 1.83 mg/100 g pomace and anthocyanins 3426.18 ± 112.09 mg/100 g pomace. The RP0.5 value of bilberry pomace was 0.38 ± 0.02 mg/ml, while the IC50 value was 1.82 ± 0.11 mg/ml. These results reveal the potential for valorization of bilberry juice production by-products for further industrial use as a rich source of bioactive compounds and natural colourants (mainly anthocyanins).
Keywords: bilberry pomace, phenolics, antioxidant activity, reducing power, α-glucosidase enzyme activity
Procedia PDF Downloads 599
1630 Wind Load Reduction Effect of Exterior Porous Skin on Facade Performance
Authors: Ying-Chang Yu, Yuan-Lung Lo
Abstract:
Building envelope design is nowadays one of the most popular design fields of the architectural profession. The main design trend of such systems is to highlight the designer's aesthetic intention in the outward appearance of the building project. Due to this trend, the building envelope contains more and more layers of components, such as double skin façades, photovoltaic panels, solar control systems, or even ornamental components. These exterior components are designed for various functional purposes. Most researchers focus on how these exterior elements should be structurally secured. However, not many researchers consider that these elements could help improve the performance of the façade system. When exterior elements are deployed at large scale, they create an additional layer outside the original façade system and act like a porous interface that interferes with the aerodynamics of the façade surface at micro-scale. Standard façade performance consists of water penetration, air infiltration rate, operation force, and component deflection ratio, and these key performances are mainly driven by the design wind load coded in local regulations. A design wind load is usually determined by the maximum wind pressure occurring on the surface due to the geometry or location of the building in extreme conditions. This research was designed to identify the air damping phenomenon of micro-turbulence caused by a porous exterior layer, which reduces the surface wind load and thereby improves façade system performance. A series of wind tunnel tests on a dynamic pressure sensor array covered by porous exterior skins of various scales was conducted to verify the wind pressure reduction effect. The testing specimens were designed to simulate a typical building with a two-meter extension offset from the building surface. Multiple porous exterior skins were prepared to replicate various surface opening ratios, which may cause different levels of damping.
This research adopted a Pitot static tube, thermal anemometers, and a hot film probe to collect surface dynamic pressure data behind the porous skin. Turbulence and distributed resistance are the two main aerodynamic factors that reduce the actual wind pressure. From initial observations, the surface wind pressure reading was effectively reduced behind the porous media. In such cases, an actual building envelope system may benefit from a porous skin through the reduction of surface wind pressure, which may consequently improve the performance of the envelope system.
Keywords: multi-layer facade, porous media, facade performance, turbulence and distributed resistance, wind tunnel test
Procedia PDF Downloads 220
1629 AI and the Future of Misinformation: Opportunities and Challenges
Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi
Abstract:
Moving towards the 4th Industrial Revolution, artificial intelligence (AI) is now more popular than ever. This subject gains significance every day and is continually expanding, often merging with other fields. Rather than remaining passive observers, we benefit from understanding modern technology by delving into its inner workings. However, in a world teeming with digital information, the impact of AI on the spread of disinformation has garnered significant attention. The dissemination of inaccurate or misleading information, referred to as misinformation, poses a serious threat to democratic society, public debate, and individual decision-making. This article delves into the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.
Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation
Procedia PDF Downloads 92
1628 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction
Authors: Ning Kang, Marius Doornenbal
Abstract:
Funding is a very important, if not crucial, resource for research projects. Usually, funding organizations publish a description of the funded research to describe the scope of the funding award. Logically, we would expect research outcomes to align with this funding award. For that reason, we might be able to predict future research topics based on present funding award data. That said, it remains to be shown if and how future research topics can be predicted using funding information. In this paper, we extract funding project information and the abstracts of the papers those projects generated from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf and cosine similarity between these two sets of features. We show that the cosine similarity in the project-generated papers group is greater than in the project-baseline group, and that the similarities of the two groups differ significantly. Based on this result, we conclude that funding information does correlate, at the topical level, with the content of future research output for the funded project. How funding really changes the course of science or of scientific careers remains an elusive question.
Keywords: natural language processing, noun phrase, tf-idf, cosine similarity
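The tf-idf plus cosine similarity computation described above can be sketched in pure Python: build tf-idf vectors over a shared vocabulary and compare two "documents" (e.g., an award text and a paper abstract). The token lists are toy examples, and the idf smoothing (log(n/df) + 1) is a common variant chosen here for simplicity, not necessarily the paper's exact weighting.

```python
# tf-idf vectors and cosine similarity over token lists.

import math
from collections import Counter

def tfidf_vectors(docs):
    """docs: list of token lists; returns one {term: weight} dict per doc."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return [{t: (cnt / len(doc)) * (math.log(n / df[t]) + 1.0)
             for t, cnt in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = lambda w: math.sqrt(sum(x * x for x in w.values()))
    return dot / (norm(u) * norm(v))

award     = ["graphene", "sensor", "fabrication", "graphene"]
paper     = ["graphene", "sensor", "characterization"]
unrelated = ["medieval", "poetry", "analysis"]

vecs = tfidf_vectors([award, paper, unrelated])
# Award text should be closer to its resulting paper than to an
# unrelated baseline paper.
print(cosine(vecs[0], vecs[1]) > cosine(vecs[0], vecs[2]))  # → True
```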
Procedia PDF Downloads 246
1627 Physicochemical, Heavy Metals Analysis of Some Multi-Floral Algerian Honeys
Authors: Assia Amri, Naima Layachi, Ali Ladjama
Abstract:
The characterization of some Algerian honeys was carried out on the basis of their physico-chemical properties: moisture, hydroxymethylfurfural (HMF), diastase activity, pH, free, total and lactonic acidity, electrical conductivity, minerals and proline content. The studied samples were found to be low in moisture and therefore safe from fermentation, low in HMF and high in diastase activity. The diastase activity and the HMF content are widely recognized parameters indicating the freshness of honey. Phenolic compounds present in honey are classified into two groups: simple phenols and polyphenols. The simple phenols in honey are various phenolic acids, while the polyphenols are various flavonoids. The aim of our work was to determine the antioxidant properties of various Algerian honey samples: the total phenol content, the total flavonoid content, as well as the antiradical activity of the honey. The quality of honey samples differs on account of various factors such as season, packaging and processing conditions, floral source, geographical origin and storage period. It is important that precautions be taken to ensure standardization and rationalization of beekeeping techniques, manufacturing procedures and storage processes to improve honey quality.
Keywords: honey, physico-chemical characterization, phenolic compounds, HMF, diastase activity
Procedia PDF Downloads 423
1626 Production of Energetic Nanomaterials by Spray Flash Evaporation
Authors: Martin Klaumünzer, Jakob Hübner, Denis Spitzer
Abstract:
Within this paper, the latest results on processing energetic nanomaterials by means of the Spray Flash Evaporation technique are presented. This technology constitutes a highly effective and continuous way to prepare fascinating materials at the nano- and micro-scale. In the process, a solution is put under high pressure and sprayed into an evacuated atomization chamber. Subsequent ultrafast evaporation of the solvent leads to an aerosol stream, which is separated by cyclones or filters. No drying gas is required, so the present technique should not be confused with spray drying. The resulting nanothermites, insensitive explosives, propellants and compositions are foreseen to replace toxic (according to REACH) and very sensitive matter in military and civil applications. Diverse examples are given in detail: nano-RDX (n-cyclotrimethylenetrinitramine) and nano-aluminum based systems, mixtures (n-RDX/n-TNT, trinitrotoluene) or even cocrystalline matter like n-CL-20/HMX (hexanitrohexaazaisowurtzitane/cyclotetramethylenetetranitramine). These nanomaterials tend to show reduced sensitivity without losing effectiveness and performance. An analytical study for material characterization was performed using Atomic Force Microscopy, X-Ray Diffraction, combined techniques, and spectroscopic methods. As a matter of course, sensitivity tests regarding electrostatic discharge, impact, and friction are provided.
Keywords: continuous synthesis, energetic material, nanoscale, nanoexplosive, nanothermite
Procedia PDF Downloads 264
1625 Permanent Deformation Resistance of Asphalt Mixtures with Red Mud as a Filler
Authors: Liseane Padilha Thives, Mayara S. S. Lima, João Victor Staub De Melo, Glicério Trichês
Abstract:
Red mud is a waste resulting from the processing of bauxite into alumina, the raw material for aluminum production. The large quantity of red mud generated and inadequately disposed of in the environment has motivated researchers to develop methods for reinserting this waste into the productive cycle. This work evaluates the resistance to permanent deformation of dense asphalt mixtures with red mud as filler. The red mud was characterized by X-ray diffraction, X-ray fluorescence, specific mass, laser granulometry, pH and scanning electron microscopy tests. To analyze the influence of the quantity of red mud on the mechanical performance of the asphalt mixtures, a total filler content of 7% was established. Asphalt mixtures with 3%, 5% and 7% red mud were produced, and a conventional mixture with 7% stone powder filler was used as reference. The asphalt mixtures were evaluated for resistance to permanent deformation in the French Rutting Tester (FRT) traffic simulator. The mixture with 5% red mud presented the greatest resistance to permanent deformation, with a rutting depth at 30,000 cycles of 3.50%. The asphalt mixtures with red mud performed better overall, reducing rutting by 12.63% to 42.62% relative to the reference mixture. This study confirmed the viability of reinserting red mud into the production chain and its possible use in the construction industry. Using red mud as filler in asphalt mixtures is a reuse option for this waste that mitigates its disposal problems, as well as being an environmentally friendly alternative.
Keywords: asphalt mixtures, permanent deformation, red mud, pavements
Procedia PDF Downloads 290
1624 Purification, Biochemical Characterization and Application of an Extracellular Alkaline Keratinase Produced by Aspergillus sp. DHE7
Authors: Dina Helmy El-Ghonemy, Thanaa Hamed Ali
Abstract:
The aim of this study was to purify and characterize a keratinolytic enzyme produced by Aspergillus sp. DHE7 cultured in a basal medium containing chicken feather as substrate. The enzyme was purified through ammonium sulfate precipitation at 60% saturation, followed by gel filtration chromatography on Sephadex G-100, with a 16.4-fold purification and a recovery yield of 52.2%. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis revealed that the purified enzyme is monomeric, with an apparent molecular mass of 30 kDa. The purified keratinase of Aspergillus sp. DHE7 exhibited activity over a broad pH (7-9) and temperature (40°C-60°C) range, with optimal activity at pH 8 and 50°C. The keratinolytic activity was inhibited by protease inhibitors such as phenylmethylsulfonyl fluoride and ethylenediaminetetraacetate, while no reduction of activity was detected upon the addition of dimethyl sulfoxide (DMSO). The bivalent cations Ca²⁺ and Mn²⁺ greatly enhanced the keratinase activity, by 125.7% and 194.8% respectively, when used at a 1 mM final concentration. On the other hand, Cu²⁺ and Hg²⁺ inhibited the enzyme activity, which might indicate that vicinal sulfhydryl groups of the enzyme are essential for productive catalysis. Furthermore, the purified keratinase showed significant stability and compatibility with the tested commercial detergents at 37°C. These results suggest that the purified keratinase from Aspergillus sp. DHE7 may have potential use in the detergent industry and should be of interest for the processing of poultry feather waste.
Keywords: Aspergillus sp. DHE7, biochemical characterization, keratinase, purification, waste management
Procedia PDF Downloads 125
1623 Simulation on Influence of Environmental Conditions on Part Distortion in Fused Deposition Modelling
Authors: Anto Antony Samy, Atefeh Golbang, Edward Archer, Alistair McIlhagger
Abstract:
Fused deposition modelling (FDM) is an additive manufacturing technique that has become highly attractive in the industrial and academic sectors. However, parts fabricated through FDM are highly susceptible to geometrical defects such as warpage, shrinkage, and delamination that can severely affect their function. Among the thermoplastic polymer feedstocks for FDM, semi-crystalline polymers are especially prone to part distortion due to polymer crystallization. In this study, the influence of FDM processing conditions such as chamber temperature and print bed temperature on the induced thermal residual stress and the resulting warpage is investigated using a 3D transient thermal model for a semi-crystalline polymer. The thermo-mechanical properties and viscoelasticity of the polymer, as well as the crystallization physics, which accounts for the crystallinity of the polymer, are coupled with the evolving temperature gradient of the printed model. The results show that increasing the chamber temperature from 25°C to 75°C led to a 1.5% decrease in residual stress, while decreasing the bed temperature from 100°C to 60°C resulted in a 33% increase in residual stress and a significant rise of 138% in warpage. The simulated warpage data are validated by comparison with the warpage values of samples measured by 3D scanning.
Keywords: finite element analysis, fused deposition modelling, residual stress, warpage
Procedia PDF Downloads 187
1622 A Graph-Based Retrieval Model for Passage Search
Authors: Junjie Zhong, Kai Hong, Lei Wang
Abstract:
Passage Retrieval (PR) plays an important role in many Natural Language Processing (NLP) tasks. Traditional efficient retrieval models relying on exact term-matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models that match by semantics. Though they gain effectiveness, deep language models often incur large memory and time costs. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes the Graph Passage Retriever (GraphPR), a graph-based model inspired by developments in graph learning techniques. Unlike existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are treated as nodes in the constructed graph and are embedded as dense vectors. PR can then be implemented using the embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models on several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model
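The first stage described above, BM25 retrieval, can be sketched in pure Python. The parameters k1 = 1.5 and b = 0.75 are common defaults, and the passages are toy examples; neither is taken from the paper, whose graph construction happens on top of scores like these.

```python
# Okapi BM25 scoring sketch: idf-weighted, length-normalized term
# frequency, scored for each passage against a token query.

import math
from collections import Counter

def bm25_scores(query, passages, k1=1.5, b=0.75):
    """passages: list of token lists; returns one score per passage."""
    n = len(passages)
    avgdl = sum(len(p) for p in passages) / n
    df = Counter(t for p in passages for t in set(p))
    scores = []
    for p in passages:
        tf = Counter(p)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(p) / avgdl))
        scores.append(s)
    return scores

passages = [["graph", "passage", "retrieval", "model"],
            ["cooking", "pasta", "recipe"],
            ["graph", "neural", "network"]]
scores = bm25_scores(["graph", "retrieval"], passages)
print(scores.index(max(scores)))  # → 0
```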
Procedia PDF Downloads 150
1621 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria
Authors: Stephen Olusanmi Akintayo
Abstract:
Access to safe drinking water remains a major challenge in Nigeria, and where water is available, its quality is often in doubt. An alternative to the inadequate supply of clean drinking water is being found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as "sachet water". "Sachet water" is common in Nigeria, as its selling price is within the reach of the low socio-economic class and setting up a production unit does not require huge capital input. The bacteriological quality of selected "sachet water" stored at room temperature over a period of 56 days was determined to evaluate the safety of sachet drinking water. A test for the detection of coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate counts (HPC) were done at 14-day intervals, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. The count decreased between days 1 and 28, while no growth was observed between days 42 and 56. The decrease in HPC suggests the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.
Keywords: coliform, heterotrophic plate count, sachet water, Staphylococcus aureus, Staphylococcus epidermidis
Procedia PDF Downloads 341
1620 Effect of Cuminum Cyminum L. Essential Oil on Staphylococcus Aureus during the Manufacture, Ripening and Storage of White Brined Cheese
Authors: Ali Misaghi, Afshin Akhondzadeh Basti, Ehsan Sadeghi
Abstract:
Staphylococcus aureus is a pathogen of major concern for clinical infection and foodborne illness. Humans and most domesticated animals harbor S. aureus, so we may expect staphylococci to be present in food products of animal origin, or in those handled directly by humans, unless heat processing is applied to destroy them. Cuminum cyminum L. has been the topic of some recent studies, in addition to its well-documented traditional use in the treatment of toothache, dyspepsia, diarrhea, epilepsy and jaundice. The air-dried seeds of the plant were completely immersed in water and subjected to hydrodistillation for 3 h using a Clevenger-type apparatus. In this study, the effect of Cuminum cyminum L. essential oil (EO) on the growth of Staphylococcus aureus in white brined cheese was evaluated. The experiment included different levels of EO (0, 7.5, 15 and 30 mL/100 mL milk) to assess their effects on the S. aureus count during the manufacture, ripening and storage of Iranian white brined cheese for up to 75 days. Significant (P < 0.05) inhibitory effects of the EO on this organism were observed, even at its lowest concentration. The significant (P < 0.05) inhibitory effect of the EO on S. aureus shown in this study may broaden the scope for use of the EO in the food industry.
Keywords: Cuminum cyminum L. essential oil, Staphylococcus aureus, white brined cheese
Procedia PDF Downloads 389
1619 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance and resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the contents of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are requested to respond by performing CPU-intensive tasks under increasing concurrent load.
Keywords: Apache, Go, Nginx, Node.js, web server benchmarking
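As a rough sketch of the CPU-bound workload such benchmarks typically use, the handler below computes a Fibonacci term naively and times it. This is a Python stand-in for illustration only; the paper's actual endpoints ran in PHP, Go and JavaScript, and the function names here are hypothetical.

```python
import time

def fib(n):
    # Naive recursive Fibonacci: deliberately CPU-intensive,
    # as commonly used in web server benchmark endpoints.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def handle_request(n):
    # Simulated request handler: compute the n-th Fibonacci term
    # and report the server-side processing time.
    start = time.perf_counter()
    result = fib(n)
    elapsed = time.perf_counter() - start
    return {"term": n, "value": result, "seconds": elapsed}

resp = handle_request(20)
```

Because the cost of `fib(n)` grows exponentially with `n`, the 10th, 20th and 30th terms give three sharply increasing CPU loads per request, which is why they make convenient benchmark levels.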
Procedia PDF Downloads 97
1618 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN
Authors: Muhammad Atif, Cang Yan
Abstract:
The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Recently, methods based on convolutional neural networks (CNNs) have gained prominence, as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multiscale knowledge transfer model. Our method leverages three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder. Additionally, the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn from and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix to Pix GAN framework. By using the proposed model as the generator in the GAN framework, we aim to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. Experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
Keywords: low light image enhancement, deep learning, convolutional neural network, image processing
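The cross-connections described above can be sketched structurally with plain Python stand-ins for the encoders and decoders. The scaling functions below are purely illustrative toys, not the paper's trained networks; only the wiring (encoders 1 and 2 into decoder 3, decoders 1 and 2 into encoder 3) reflects the described architecture.

```python
# Toy "feature maps": each stage is a plain function on a list of floats.
def make_stage(scale):
    def stage(x):
        return [scale * v for v in x]
    return stage

# Three autoencoders; E = encoder, D = decoder (hypothetical toy stand-ins).
E1, D1 = make_stage(0.5), make_stage(2.0)
E2, D2 = make_stage(0.25), make_stage(4.0)
E3, D3 = make_stage(0.5), make_stage(2.0)

def forward(x):
    # Autoencoders 1 and 2 encode the input first.
    z1, z2 = E1(x), E2(x)
    # Their decoder outputs feed the third encoder (decoder-to-encoder link)...
    fused_in = [a + b for a, b in zip(D1(z1), D2(z2))]
    z3 = E3(fused_in)
    # ...and their encoder outputs feed the third decoder (encoder-to-decoder link).
    skip = [a + b for a, b in zip(z1, z2)]
    return D3([a + b for a, b in zip(z3, skip)])

out = forward([1.0, 2.0])
```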
Procedia PDF Downloads 81
1617 Laser Based Microfabrication of a Microheater Chip for Cell Culture
Authors: Daniel Nieto, Ramiro Couceiro
Abstract:
Microfluidic chips have demonstrated significant application potential in microbiological processing and chemical reactions, with the goal of developing monolithic and compact chip-sized multifunctional systems. Heat generation and thermal control are critical in some of these biochemical processes. This paper presents a laser direct-write technique for rapid prototyping and manufacturing of microheater chips, and its applicability to perfusion cell culture outside a cell incubator. The aim of the microheater is to take over the role of a conventional incubator for cell culture, facilitating microscopic observation or other online monitoring activities during culture and making cell culture operation portable. Microheaters (5 mm × 5 mm) have been successfully fabricated on soda-lime glass substrates covered with an aluminum layer of 120 nm thickness. Experimental results show that the microheaters exhibit good temperature rise and decay characteristics, with localized heating at targeted spatial domains. These microheaters were suitable for a maximum long-term operation temperature of 120 °C and were validated for long-term operation at 37 °C for 24 hours. Results demonstrated that the physiology of the SW480 colon adenocarcinoma cell line cultured on the developed microheater chip was consistent with that in an incubator.
Keywords: laser microfabrication, microheater, bioengineering, cell culture
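The abstract does not give the heater's electrical figures, but a back-of-the-envelope estimate follows from the stated geometry (5 mm × 5 mm, 120 nm aluminum film). The bulk resistivity, ambient temperature and lumped thermal resistance below are assumed values for illustration, not measurements from the paper; real thin films also tend to have a higher resistivity than the bulk value.

```python
RHO_AL = 2.7e-8      # assumed bulk aluminium resistivity, ohm*m
THICKNESS = 120e-9   # film thickness from the abstract, m
LENGTH = 5e-3        # heater side length from the abstract, m
WIDTH = 5e-3         # heater width from the abstract, m

def sheet_resistance(rho, t):
    # Sheet resistance of a thin conductive film, in ohms per square.
    return rho / t

def heater_resistance(rho, t, length, width):
    # Resistance of a rectangular film heater between two edge contacts.
    return sheet_resistance(rho, t) * (length / width)

def power_for_rise(delta_t, r_thermal):
    # Steady-state Joule power for a temperature rise delta_t (K), given a
    # lumped thermal resistance r_thermal (K/W) -- an assumed, illustrative value.
    return delta_t / r_thermal

r = heater_resistance(RHO_AL, THICKNESS, LENGTH, WIDTH)   # about 0.225 ohm
p = power_for_rise(37.0 - 22.0, r_thermal=30.0)           # 0.5 W, illustrative
```

Such an estimate only bounds the drive requirements; the actual power to hold 37 °C depends on convective losses and the thermal mass of the culture medium.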
Procedia PDF Downloads 297
1616 Wear Assessment of SS316L-Al2O3 Composites for Heavy Wear Applications
Authors: Catherine Kuforiji, Michel Nganbe
Abstract:
The abrasive wear of composite materials is a major challenge in highly demanding wear applications. This study therefore focuses on fabricating, testing and assessing the properties of 50 wt.% SS316L stainless steel–50 wt.% Al2O3 particle composites. Composite samples were fabricated via the powder metallurgy route. The effects of the powder metallurgy processing parameters and of the hard particle reinforcement were studied. The microstructure, density, hardness and toughness were characterized. The wear behaviour was studied using pin-on-disc testing under dry sliding conditions. The highest hardness of 1085.2 HV, the highest relative density of 94.7% of theoretical and the lowest wear rate of 0.00397 mm³/m were obtained at a milling speed of 720 rpm, a compaction pressure of 794.4 MPa and sintering at 1400 °C in an argon atmosphere. Compared to commercial SS316 and fabricated SS316L, the composites had 7.4 times and 11 times lower wear rates, respectively. However, commercial 90WC-10Co showed a 2.2 times lower wear rate than the fabricated SS316L-Al2O3 composites, primarily due to the higher ceramic content of 90 wt.% in the reference WC-Co. Eliminating the relatively high porosity of about 5 vol.% using processes such as hot isostatic pressing (HIP) and hot pressing can be expected to lead to further substantial improvements in the composites' wear resistance.
Keywords: SS316L, Al2O3, powder metallurgy, wear characterization
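The reported ratios can be turned back into absolute wear rates with a one-line calculation. Only the 0.00397 mm³/m figure and the 7.4×, 11× and 2.2× ratios come from the abstract; the reconstruction below is illustrative.

```python
def wear_rate(volume_loss_mm3, sliding_distance_m):
    # Volumetric wear rate from a pin-on-disc test, in mm^3 per metre slid.
    return volume_loss_mm3 / sliding_distance_m

COMPOSITE_RATE = 0.00397   # mm^3/m, reported for the SS316L-Al2O3 composite

# Reference wear rates reconstructed from the reported ratios.
commercial_ss316 = 7.4 * COMPOSITE_RATE     # 7.4x higher than the composite
fabricated_ss316l = 11.0 * COMPOSITE_RATE   # 11x higher
commercial_wc_co = COMPOSITE_RATE / 2.2     # 2.2x lower
```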
Procedia PDF Downloads 304
1615 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and the location of debris in real time. In this paper, an attempt is made to improve the existing methods of flight data recording and to improve upon design considerations for a futuristic FDR, so as to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are now available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
1614 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, where the positive samples make up only the minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole. The other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger data sizes, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the characteristics of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shortening the running time compared with the brute-force method.
Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
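A minimal sketch of the SMOTE step the paper builds on: each synthetic minority point is interpolated at a random fraction along the line joining a minority sample to one of its nearest minority neighbours. The toy 2-D minority class below is illustrative, and the paper's swarm-optimized parameter choices are not modeled here.

```python
import math
import random

def smote(minority, n_new, k=3, seed=0):
    # Minimal SMOTE sketch: for each synthetic point, pick a random minority
    # sample, find one of its k nearest minority neighbours (Euclidean
    # distance), and interpolate at a random fraction between the two.
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: math.dist(base, p),
        )[:k]
        nb = rng.choice(neighbours)
        frac = rng.random()
        synthetic.append(tuple(b + frac * (n - b) for b, n in zip(base, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_pts = smote(minority, n_new=4)
```

Because every synthetic point is a convex combination of two existing minority samples, the oversampled class always stays inside the convex hull of the original minority points; `k` and the oversampling amount are the kind of parameters the paper tunes with the swarm algorithms.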
Procedia PDF Downloads 442
1613 Factors Influencing the Logistics Services Providers' Performance: A Literature Overview
Authors: A. Aguezzoul
Abstract:
The selection and performance of Logistics Services Providers (LSPs) is a strategic decision that affects the overall performance of any company as well as its supply chain. It is a complex process that takes into account various conflicting quantitative and qualitative factors, as well as the outsourced logistics activities. This article focuses on the evolution of the weights associated with these factors over recent years, in order to better understand the change in the importance that logistics professionals place on these criteria when choosing their LSPs. To that end, an analysis of 17 main studies published during the 2014-2017 period was carried out, and the results are compared to those of a previous literature review on this subject. Our analysis allowed us to deduce the following observations: 1) LSP selection is a multi-criteria process; 2) the majority of studies are empirical in character, conducted particularly in Asian countries; 3) the importance of the criteria has undergone significant changes following the emergence of information technologies, which have favored close collaboration and partnership between LSPs and their customers, even on a worldwide scale; 4) the cost criterion is relatively less important than in the past; and finally 5) with the development of sustainable supply chains, the factors associated with the logistics activities of returns and waste processing (reverse logistics) are becoming increasingly important in this multi-criteria process of selecting LSPs and evaluating their performance.
Keywords: logistics outsourcing, logistics providers, multi-criteria decision making, performance
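The multi-criteria character of LSP selection can be illustrated with the simplest such model, a weighted sum. The criteria, weights and provider scores below are hypothetical, chosen only to echo the review's finding that cost now carries relatively less weight than service and IT-related criteria.

```python
def weighted_score(provider, weights):
    # Simple weighted-sum model: criteria scores are normalised to [0, 1],
    # with cost already inverted so that higher is better for every criterion.
    return sum(weights[c] * provider[c] for c in weights)

# Hypothetical criteria weights (must sum to 1).
weights = {"cost": 0.2, "quality": 0.3, "delivery": 0.3, "it_capability": 0.2}

# Hypothetical normalised scores for two candidate providers.
providers = {
    "LSP-A": {"cost": 0.9, "quality": 0.6, "delivery": 0.7, "it_capability": 0.5},
    "LSP-B": {"cost": 0.6, "quality": 0.9, "delivery": 0.8, "it_capability": 0.9},
}

best = max(providers, key=lambda name: weighted_score(providers[name], weights))
```

Under these weights the cheaper LSP-A loses to LSP-B, whose service and IT scores dominate; shifting weight back onto cost reverses the ranking, which is exactly the kind of sensitivity the reviewed multi-criteria methods are designed to expose.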
Procedia PDF Downloads 154
1612 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar
Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna
Abstract:
This paper presents an analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) in the spotlight mode of operation. Unlike the polar format algorithm (PFA), RMA mitigates space-variant defocusing and geometric distortion effects, since it does not assume that the illuminating wavefronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range deskew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the imaging geometry on the resolution of the final image. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated in this paper.
Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation
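The matched filtering stage reduces, in one dimension, to correlating the received signal with the known transmitted template; the output peaks at the delay where the echo aligns with the template (pulse compression). The toy pulse and echo values below are illustrative, not from the paper's airborne data.

```python
def matched_filter(signal, template):
    # Correlate the received signal with the known transmitted template;
    # the output peaks at the delay where the echo aligns with the template.
    n, m = len(signal), len(template)
    return [
        sum(signal[i + j] * template[j] for j in range(m))
        for i in range(n - m + 1)
    ]

template = [1.0, -1.0, 1.0]             # toy transmitted pulse
echo = [0.0, 0.0, 1.0, -1.0, 1.0, 0.0]  # echo of the pulse, delayed by 2 samples
out = matched_filter(echo, template)
peak_delay = max(range(len(out)), key=lambda i: out[i])
```

In the actual 2D case the same correlation is applied in the spatial frequency domain, which is what makes the subsequent Stolt remapping of the data onto a rectangular frequency grid possible.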
Procedia PDF Downloads 241
1611 Nagabhasma Preparation and Its Effect on Kidneys: A Histopathological Study
Authors: Lydia Andrade, Kumar M. R. Bhat
Abstract:
Heavy metals, especially lead, are considered multi-organ toxicants. Nevertheless, such heavy metals are used in the preparation of traditional medicines. Nagabhasma is one such traditional medicine, and lead is the metal used in its preparation. When subjected to various traditional methods of purification, the lead is converted into a purportedly health-beneficial organometallic compound. Therefore, this study was designed to evaluate the effect of such processed lead, at the various stages of traditional Nagabhasma preparation, on the histological structure of the kidneys. Using human-equivalent doses, material from the various stages of Nagabhasma preparation was fed orally for 30 days and 60 days (short term and long term). The treated and untreated rats were then sacrificed for the collection of the kidneys, which were processed for histopathological study. The results show severe changes in the histological structure of the kidneys. The animals treated with lead acetate showed changes in the epithelial cells lining Bowman's capsule. The proximal and distal convoluted tubules were dilated, leading to atrophy of their epithelial cells. The amount of inflammatory infiltrate was greater in this group. A few groups also showed pockets of intertubular hemorrhage. These changes, however, were minimized as the preparation progressed from stage 1 to stage 4. Therefore, it is necessary to stringently monitor the processing of lead acetate during the preparation of Nagabhasma.
Keywords: heavy metals, kidneys, lead acetate, Nagabhasma
Procedia PDF Downloads 146
1610 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. Such systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors in order to identify complex action patterns efficiently. Still, considerable gaps and challenges remain between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing together with human and outer-shape detection techniques. Next, we extract valuable information in the form of cues, namely two distinct features: fuzzy local binary patterns and a sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we use a random forest. We tested our model on two benchmark datasets, the AAMAZ and KTH multi-view football datasets. Our HMR framework significantly outperforms the other state-of-the-art approaches, achieving better recognition rates of 91% and 89.6% on the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
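The fuzzy local binary patterns used here extend the classic (crisp) LBP operator, which for one pixel can be sketched as below: the eight neighbours of a 3×3 patch are thresholded against the centre and the resulting bits are packed into an 8-bit texture code. The sample patch values are illustrative; the fuzzy variant replaces the hard threshold with membership degrees, which is not modeled here.

```python
def lbp_code(patch):
    # Classic LBP of the centre pixel of a 3x3 patch: threshold the 8
    # neighbours against the centre and pack the bits clockwise from top-left.
    centre = patch[1][1]
    neighbours = [
        patch[0][0], patch[0][1], patch[0][2],
        patch[1][2], patch[2][2], patch[2][1],
        patch[2][0], patch[1][0],
    ]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= centre:
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)
```

A histogram of such codes over an image region gives an illumination-robust texture descriptor, which is why LBP variants suit action data with varying brightness.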
Procedia PDF Downloads 39
1609 Automatic Generating CNC-Code for Milling Machine
Authors: Chalakorn Chitsaart, Suchada Rianmora, Mann Rattana-Areeyagon, Wutichai Namjaiprasert
Abstract:
G-code is the main mechanism by which a computer numerical control (CNC) machine controls the tool paths and generates the profile of the object's features. To obtain high accuracy of the surface finish, non-stop operation of the CNC machine is required. Recently, product design strategies have been introduced that favor changes with low business impact and low resource consumption: the cost and time of designing minor changes can be reduced, since the traditional geometric details of existing models are reused. To support this strategy as an alternative channel for machining operations, this research proposes automatic generation of codes for CNC milling operations. This technique can assist the manufacturer in easily changing the size and geometric shape of the product during operation, reducing the time spent setting up or processing the machine. The algorithm, implemented on the MATLAB platform, is developed by analyzing and evaluating the geometric information of the part. Codes are created rapidly to control the operations of the machine. Compared to codes obtained from CAM, the developed algorithm can quickly generate and simulate the cutting profile of the part.
Keywords: geometric shapes, milling operation, minor changes, CNC machine, G-code, cutting parameters
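A minimal illustration of such automatic code generation: given a few geometric parameters, emit the G-code tool path for one rectangular profile. The paper's own algorithm runs in MATLAB and handles general geometry; this Python sketch, with hypothetical parameter choices, only shows how changing a dimension regenerates the program without manual reprogramming.

```python
def rectangle_gcode(x0, y0, width, height, depth, feed=200.0):
    # Emit a minimal G-code tool path milling one rectangular profile:
    # rapid to the start point, plunge, cut the four sides, retract.
    lines = [
        "G21 ; millimetre units",
        "G90 ; absolute coordinates",
        f"G00 X{x0:.3f} Y{y0:.3f}",
        f"G01 Z{-depth:.3f} F{feed:.1f}",
        f"G01 X{x0 + width:.3f} Y{y0:.3f}",
        f"G01 X{x0 + width:.3f} Y{y0 + height:.3f}",
        f"G01 X{x0:.3f} Y{y0 + height:.3f}",
        f"G01 X{x0:.3f} Y{y0:.3f}",
        "G00 Z5.000 ; retract",
    ]
    return "\n".join(lines)

program = rectangle_gcode(0.0, 0.0, 40.0, 25.0, depth=2.0)
```

Resizing the feature is then a matter of calling the generator with new arguments, which is the "minor change" workflow the abstract describes.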
Procedia PDF Downloads 349