Search results for: input processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5522

1832 Physicochemical and Heavy Metal Analysis of Some Multi-Floral Algerian Honeys

Authors: Assia Amri, Naima Layachi, Ali Ladjama

Abstract:

The characterization of some Algerian honeys was carried out on the basis of their physico-chemical properties: moisture, hydroxymethylfurfural (HMF), diastase activity, pH, free, total and lactonic acidity, electrical conductivity, minerals and proline content. The studied samples were found to be low in moisture and therefore safe from fermentation, low in HMF and high in diastase activity. Diastase activity and HMF content are widely recognized parameters indicating the freshness of honey. Phenolic compounds present in honey are classified into two groups: simple phenols and polyphenols. The simple phenols in honey are various phenolic acids, while the polyphenols are various flavonoids. The aim of our work was to determine the antioxidant properties of various Algerian honey samples: the total phenol content, the total flavonoid content, as well as the antiradical activity of the honey. The quality of honey samples differs on account of various factors such as season, packaging and processing conditions, floral source, geographical origin and storage period. Precautions should therefore be taken to ensure the standardization and rationalization of beekeeping techniques, manufacturing procedures and storage processes in order to improve honey quality.

Keywords: honey, physico-chemical characterization, phenolic compounds, HMF, diastase activity

Procedia PDF Downloads 399
1831 Production of Energetic Nanomaterials by Spray Flash Evaporation

Authors: Martin Klaumünzer, Jakob Hübner, Denis Spitzer

Abstract:

This paper presents the latest results on the processing of energetic nanomaterials by means of the Spray Flash Evaporation technique. This technology constitutes a highly effective and continuous way to prepare fascinating materials on the nano- and micro-scale. Within the process, a solution is set under high pressure and sprayed into an evacuated atomization chamber. Subsequent ultrafast evaporation of the solvent leads to an aerosol stream, which is separated by cyclones or filters. No drying gas is required, so the present technique should not be confused with spray drying. The resulting nanothermites, insensitive explosives, propellants and compositions are foreseen to replace toxic (according to REACH) and very sensitive matter in military and civil applications. Diverse examples are given in detail: nano-RDX (n-Cyclotrimethylentrinitramin) and nano-aluminum based systems, mixtures (n-RDX/n-TNT - trinitrotoluene) or even cocrystalline matter like n-CL-20/HMX (Hexanitrohexaazaisowurtzitane/Cyclotetra-methylentetranitramin). These nanomaterials tend to show reduced sensitivity without losing effectiveness and performance. An analytical study for material characterization was performed using Atomic Force Microscopy, X-Ray Diffraction, and combined techniques as well as spectroscopic methods. As a matter of course, sensitivity tests regarding electrostatic discharge, impact, and friction are provided.

Keywords: continuous synthesis, energetic material, nanoscale, nanoexplosive, nanothermite

Procedia PDF Downloads 247
1830 Permanent Deformation Resistance of Asphalt Mixtures with Red Mud as a Filler

Authors: Liseane Padilha Thives, Mayara S. S. Lima, João Victor Staub De Melo, Glicério Trichês

Abstract:

Red mud is a waste resulting from the processing of bauxite to alumina, the raw material of aluminum production. The large quantity of red mud generated and inadequately disposed of in the environment has motivated researchers to develop methods for the reinsertion of this waste into the productive cycle. This work aims to evaluate the resistance to permanent deformation of dense asphalt mixtures with red mud as a filler. The red mud was characterized by X-ray diffraction, fluorescence, specific mass, laser granulometry, pH and scanning electron microscopy tests. To analyze the influence of the quantity of red mud on the mechanical performance of the asphalt mixtures, a total filler content of 7% was established. Asphalt mixtures with 3%, 5% and 7% red mud were produced. A conventional mixture with 7% stone powder filler was used as a reference. The asphalt mixtures were evaluated for resistance to permanent deformation in the French Rutting Tester (FRT) traffic simulator. The mixture with 5% red mud presented the greatest resistance to permanent deformation, with a rutting depth of 3.50% at 30,000 cycles. The asphalt mixtures with red mud presented better performance overall, with a reduction in rutting of 12.63% to 42.62% in relation to the reference mixture. This study confirmed the viability of reinserting red mud into the production chain and its possible use in the construction industry. Red mud as a filler in asphalt mixtures is a reuse option for this waste that mitigates its disposal problems, as well as being an environmentally friendly alternative.

Keywords: asphalt mixtures, permanent deformation, red mud, pavements

Procedia PDF Downloads 268
1829 Purification, Biochemical Characterization and Application of an Extracellular Alkaline Keratinase Produced by Aspergillus sp. DHE7

Authors: Dina Helmy El-Ghonemy, Thanaa Hamed Ali

Abstract:

The aim of this study was to purify and characterize a keratinolytic enzyme produced by Aspergillus sp. DHE7 cultured in a basal medium containing chicken feather as substrate. The enzyme was purified by ammonium sulfate precipitation at 60% saturation, followed by gel filtration chromatography on Sephadex G-100, with a 16.4-fold purification and a recovery yield of 52.2%. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis revealed that the purified enzyme is monomeric, with an apparent molecular mass of 30 kDa. The purified keratinase of Aspergillus sp. DHE7 exhibited activity over a broad range of pH (7-9) and temperature (40-60 °C), with optimal activity at pH 8 and 50 °C. The keratinolytic activity was inhibited by protease inhibitors such as phenylmethylsulfonyl fluoride and ethylenediaminetetraacetate, while no reduction of activity was detected upon the addition of dimethyl sulfoxide (DMSO). The bivalent cations Ca²⁺ and Mn²⁺ greatly enhanced keratinase activity, by 125.7% and 194.8% respectively, when used at a 1 mM final concentration. On the other hand, Cu²⁺ and Hg²⁺ inhibited the enzyme activity, which might indicate that vicinal sulfhydryl groups of the enzyme are essential for productive catalysis. Furthermore, the purified keratinase showed significant stability and compatibility with the tested commercial detergents at 37 °C. These results suggest that the purified keratinase from Aspergillus sp. DHE7 may have potential use in the detergent industry and in the processing of poultry feather waste.
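For reference, the purification figures above (16.4-fold, 52.2% yield) follow from standard definitions: purification fold is the ratio of specific activities (activity units per mg protein) after and before purification, and recovery yield is the ratio of total activities. A minimal sketch with hypothetical numbers chosen only to illustrate the arithmetic (not the study's measured values):

```python
def specific_activity(total_units, total_protein_mg):
    """Enzyme specific activity in units per mg protein."""
    return total_units / total_protein_mg

def purification_fold(spec_act_step, spec_act_crude):
    return spec_act_step / spec_act_crude

def recovery_yield_pct(units_step, units_crude):
    return 100.0 * units_step / units_crude

# Illustrative numbers only (hypothetical, not from the study):
crude = {"units": 1000.0, "protein_mg": 200.0}
purified = {"units": 522.0, "protein_mg": 6.37}

fold = purification_fold(
    specific_activity(purified["units"], purified["protein_mg"]),
    specific_activity(crude["units"], crude["protein_mg"]),
)
yield_pct = recovery_yield_pct(purified["units"], crude["units"])
print(f"fold = {fold:.1f}, yield = {yield_pct:.1f}%")
```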

Keywords: Aspergillus sp. DHE7, biochemical characterization, keratinase, purification, waste management

Procedia PDF Downloads 110
1828 Simulation on Influence of Environmental Conditions on Part Distortion in Fused Deposition Modelling

Authors: Anto Antony Samy, Atefeh Golbang, Edward Archer, Alistair McIlhagger

Abstract:

Fused deposition modelling (FDM) is one of the additive manufacturing techniques that has become highly attractive in the industrial and academic sectors. However, parts fabricated through FDM are highly susceptible to geometrical defects such as warpage, shrinkage, and delamination that can severely affect their function. Among the thermoplastic polymer feedstocks for FDM, semi-crystalline polymers are highly prone to part distortion due to polymer crystallization. In this study, the influence of FDM processing conditions such as chamber temperature and print bed temperature on the induced thermal residual stress and the resulting warpage is investigated using a 3D transient thermal model for a semi-crystalline polymer. The thermo-mechanical properties and viscoelasticity of the polymer, as well as crystallization physics accounting for the crystallinity of the polymer, are coupled with the evolving temperature gradient of the printed model. The results show that increasing the chamber temperature from 25°C to 75°C led to a 1.5% decrease in residual stress, while decreasing the bed temperature from 100°C to 60°C resulted in a 33% increase in residual stress and a significant rise of 138% in warpage. The simulated warpage data is validated by comparing it with warpage values of printed samples measured by 3D scanning.
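The direction of these trends can be rationalized with the simplest constrained-cooling estimate of thermal residual stress, sigma = E * alpha * dT / (1 - nu): raising the chamber (or bed) temperature lowers the temperature drop dT the constrained layer experiences, and with it the stress. The sketch below uses illustrative placeholder constants for a generic semi-crystalline polymer; it is not the paper's material data or its 3D transient model:

```python
def thermal_residual_stress(E_gpa, alpha_per_k, delta_t_k, poisson=0.35):
    """Biaxial constrained-cooling estimate: sigma = E*alpha*dT/(1-nu), in MPa."""
    return E_gpa * 1e3 * alpha_per_k * delta_t_k / (1.0 - poisson)

# Illustrative placeholder constants for a generic semi-crystalline polymer:
E = 1.6          # Young's modulus, GPa
alpha = 8e-5     # coefficient of thermal expansion, 1/K
deposit_t = 200.0  # deposition temperature, degC

for chamber_t in (25.0, 75.0):
    s = thermal_residual_stress(E, alpha, deposit_t - chamber_t)
    print(f"chamber at {chamber_t:.0f} degC -> ~{s:.1f} MPa")
```

The hotter chamber yields the smaller stress, matching the reported trend in sign (the paper's magnitudes come from the full coupled simulation).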

Keywords: finite element analysis, fused deposition modelling, residual stress, warpage

Procedia PDF Downloads 170
1827 A Graph-Based Retrieval Model for Passage Search

Authors: Junjie Zhong, Kai Hong, Lei Wang

Abstract:

Passage Retrieval (PR) plays an important role in many Natural Language Processing (NLP) tasks. Traditional efficient retrieval models that rely on exact term-matching, such as TF-IDF or BM25, have now been surpassed by pre-trained language models that match by semantics. Though more effective, deep language models often require large memory and high computation time. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Unlike existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded as dense vectors. PR can then be implemented using these embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models on several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
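At a high level, the pipeline described above can be sketched as: build a passage graph from BM25 co-retrieval, propagate passage vectors over the graph (a single GCN-style normalization-and-propagation step stands in for the trained model here), and answer queries with a fast cosine-similarity search over the resulting embeddings. The toy data and all names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Toy passage vectors (rows). An edge links two passages that appear
# together in some query's top-k BM25 list (hypothetical pairs here).
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
edges = [(0, 1), (2, 3)]

n = X.shape[0]
A = np.eye(n)  # adjacency with self-loops, as in GCN
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Symmetric normalization A_hat = D^-1/2 A D^-1/2, then one propagation step
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))
Z = A_hat @ X  # graph-smoothed passage embeddings (untrained stand-in)

def search(query_vec, embeddings, top=2):
    """Cosine-similarity search over passage embeddings."""
    q = query_vec / np.linalg.norm(query_vec)
    E = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    scores = E @ q
    return np.argsort(-scores)[:top]

print(search(np.array([1.0, 0.05]), Z))  # indices of nearest passages
```

In the paper's setting the propagation weights are learned with graph-based objectives; the vector search step is what keeps query latency low.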

Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model

Procedia PDF Downloads 116
1826 Evaluation of Public Library Adult Programs: Use of ServQual and NIPPA Assessment Standards

Authors: Anna Ching-Yu Wong

Abstract:

This study aims to identify the quality and effectiveness of the adult programs provided by a public library, using the ServQual method and the National Library Public Programs Assessment guidelines (NIPPA, June 2019). ServQual covers several variables, namely: tangibles, reliability, responsiveness, assurance, and empathy. The NIPPA guidelines focus on program characteristics, particularly on outcomes, i.e., the level of satisfaction of program participants. The population reached comprised adults who participated in library adult programs at a small-town public library in Kansas. This study was designed as quantitative evaluative research that analyzed the quality and effectiveness of the library adult programs by assessing the role of each factor based on ServQual and the NIPPA library program assessment guidelines. Data were collected from November 2019 to January 2020 using a questionnaire with a Likert scale, and were analyzed in a descriptive quantitative manner. This research provides information about the quality and effectiveness of existing programs and can be used as input for developing strategies for future adult programs. Overall, the ServQual measurement indicates very good quality, but each variable still has areas needing improvement and emphasis: the tangibles variable in the indicators of the temperature and space of the meeting room; the reliability variable in the timely delivery of the programs; the responsiveness variable in the ability of the presenters to earn the trust and confidence of participants; the assurance variable in the indicator of knowledge and skills of program presenters; and the empathy variable in the presenters' willingness to provide extra assistance.
The measurement of program outcomes based on the NIPPA guidelines is very positive: over 96% of participants indicated that the programs were informative and fun, that they learned new knowledge and skills, and that they would recommend the programs to their friends and families. They believed that, together, the library and participants build stronger and healthier communities.
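ServQual results of the kind summarized above are conventionally expressed as gap scores: perception minus expectation, averaged per dimension from the Likert responses, with the most negative gaps flagging the areas needing improvement. A minimal sketch with made-up Likert averages (not the study's data):

```python
# Hypothetical 1-5 Likert averages per ServQual dimension (not the study's data)
expectations = {"tangibles": 4.6, "reliability": 4.5, "responsiveness": 4.4,
                "assurance": 4.7, "empathy": 4.3}
perceptions  = {"tangibles": 4.2, "reliability": 4.4, "responsiveness": 4.0,
                "assurance": 4.5, "empathy": 4.1}

# Gap = Perception - Expectation; negative gaps flag dimensions to improve
gaps = {dim: round(perceptions[dim] - expectations[dim], 2) for dim in expectations}
worst = min(gaps, key=gaps.get)
print(gaps)
print("largest shortfall:", worst)
```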

Keywords: ServQual model, ServQual in public libraries, library program assessment, NIPPA library programs assessment

Procedia PDF Downloads 86
1825 Effect of Cuminum Cyminum L. Essential Oil on Staphylococcus Aureus during the Manufacture, Ripening and Storage of White Brined Cheese

Authors: Ali Misaghi, Afshin Akhondzadeh Basti, Ehsan Sadeghi

Abstract:

Staphylococcus aureus is a pathogen of major concern in clinical infections and food-borne illness. Humans and most domesticated animals harbor S. aureus, so staphylococci may be expected in food products of animal origin or in those handled directly by humans, unless heat processing is applied to destroy them. Cuminum cyminum L. has been the subject of several recent studies, in addition to its well-documented traditional use for the treatment of toothache, dyspepsia, diarrhea, epilepsy and jaundice. The air-dried seed of the plant was completely immersed in water and subjected to hydrodistillation for 3 h using a Clevenger-type apparatus. In this study, the effect of Cuminum cyminum L. essential oil (EO) on the growth of Staphylococcus aureus in white brined cheese was evaluated. The experiment included different levels of EO (0, 7.5, 15 and 30 mL/100 mL milk) to assess their effects on the S. aureus count during the manufacture, ripening and storage of Iranian white brined cheese for up to 75 days. Significant (P < 0.05) inhibitory effects of the EO on this organism were observed, even at its lowest concentration. The significant (P < 0.05) inhibitory effect of the EO on S. aureus shown in this study may broaden the scope of EO use in the food industry.

Keywords: cuminum cyminum L. essential oil, staphylococcus aureus, white brined cheese

Procedia PDF Downloads 379
1824 Performance Comparison of Thread-Based and Event-Based Web Servers

Authors: Aikaterini Kentroti, Theodore H. Kaskalis

Abstract:

Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance as well as the resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go servers as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the contents of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are asked to respond by performing CPU-intensive tasks under increasing concurrent load.
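The benchmark workload above is straightforward to reproduce in outline: issue many concurrent requests for a CPU-bound task (computing the n-th Fibonacci term) and divide the number of completed requests by the elapsed time to obtain throughput. The sketch below simulates the handler in-process rather than over HTTP, and the request counts and term sizes are illustrative, not the paper's test plan:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fib(n):
    """Naive recursive Fibonacci - the CPU-intensive 'request handler'."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def run_load(n_requests=50, concurrency=8, term=20):
    """Issue n_requests concurrent 'requests' and return requests per second."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(fib, [term] * n_requests))
    elapsed = time.perf_counter() - start
    return len(results) / elapsed

print(f"throughput: {run_load():.0f} req/s")
```

Note that with CPython's global interpreter lock the threads do not actually parallelize this CPU-bound handler; the point of the sketch is the measurement harness, which real benchmarks aim at servers over HTTP instead.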

Keywords: apache, Go, Nginx, node.js, web server benchmarking

Procedia PDF Downloads 77
1823 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN

Authors: Muhammad Atif, Cang Yan

Abstract:

The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Methods based on convolutional neural networks (CNNs) have recently gained prominence, as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multi-scale knowledge transfer model. Our method leverages three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder. Additionally, the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn from and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix2Pix GAN framework. By using the proposed model as the generator in the GAN framework, we aim to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. Experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
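One plausible reading of the wiring described above can be sketched with stand-in layers: the outputs of the first two autoencoders feed the third encoder, and the third decoder also receives the first two latent codes. The fusion-by-averaging, the layer shapes and all names are assumptions made for illustration only, not the authors' trained architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    """Stand-in layer: a fixed random linear map with tanh (shapes only, untrained)."""
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    return lambda x: np.tanh(x @ W)

D, H = 64, 16  # flattened-patch dim and latent dim (illustrative)
enc1, dec1 = linear(D, H), linear(H, D)
enc2, dec2 = linear(D, H), linear(H, D)
enc3, dec3 = linear(D, H), linear(H, D)

def forward(x):
    z1, z2 = enc1(x), enc2(x)        # first two encoders
    x1, x2 = dec1(z1), dec2(z2)      # their decoders...
    z3 = enc3((x1 + x2) / 2)         # ...feed the third encoder (fusion assumed)
    return dec3((z1 + z2 + z3) / 3)  # third decoder also sees z1 and z2

x = rng.standard_normal((4, D))  # a batch of 4 flattened low-light patches
print(forward(x).shape)
```

A trained version would replace the random linear maps with convolutional encoder/decoder stacks and learn the weights jointly inside the GAN framework.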

Keywords: low light image enhancement, deep learning, convolutional neural network, image processing

Procedia PDF Downloads 47
1822 Laser Based Microfabrication of a Microheater Chip for Cell Culture

Authors: Daniel Nieto, Ramiro Couceiro

Abstract:

Microfluidic chips have demonstrated significant application potential in microbiological processing and chemical reactions, with the goal of developing monolithic and compact chip-sized multifunctional systems. Heat generation and thermal control are critical in some biochemical processes. This paper presents a laser direct-write technique for the rapid prototyping and manufacturing of microheater chips, and its applicability to perfusion cell culture outside a cell incubator. The aim of the microheater is to take over the role of conventional incubators for cell culture, facilitating microscopic observation and other online monitoring activities during culture and providing portability of cell culture operations. Microheaters (5 mm × 5 mm) were successfully fabricated on soda-lime glass substrates covered with an aluminum layer 120 nm thick. Experimental results show that the microheaters exhibit good temperature rise and decay characteristics, with localized heating at targeted spatial domains. The microheaters were suitable for a maximum long-term operation temperature of 120 °C and were validated for operation at 37 °C for 24 hours. Results demonstrated that the physiology of the SW480 colon adenocarcinoma cell line cultured on the developed microheater chip was consistent with that of an incubator.

Keywords: laser microfabrication, microheater, bioengineering, cell culture

Procedia PDF Downloads 277
1821 Wear Assessment of SS316L-Al2O3 Composites for Heavy Wear Applications

Authors: Catherine Kuforiji, Michel Nganbe

Abstract:

The abrasive wear of composite materials is a major challenge in highly demanding wear applications. This study therefore focuses on fabricating, testing and assessing the properties of 50 wt% SS316L stainless steel - 50 wt% Al2O3 particle composites. Composite samples were fabricated via the powder metallurgy route. The effects of the powder metallurgy processing parameters and of the hard particle reinforcement were studied, and the microstructure, density, hardness and toughness were characterized. The wear behaviour was studied using pin-on-disc testing under dry sliding conditions. The highest hardness of 1085.2 HV, the highest theoretical density of 94.7% and the lowest wear rate of 0.00397 mm³/m were obtained at a milling speed of 720 rpm, a compaction pressure of 794.4 MPa and sintering at 1400 °C in an argon atmosphere. Compared to commercial SS316 and fabricated SS316L, the composites had 7.4 times and 11 times lower wear rates, respectively. The commercial 90WC-10Co, however, showed a 2.2 times lower wear rate than the fabricated SS316L-Al2O3 composites, primarily due to the higher ceramic content of 90 wt% in the reference WC-Co. Eliminating the relatively high porosity of about 5 vol% using processes such as hot isostatic pressing (HIP) and hot pressing can be expected to lead to further substantial improvements in the composites' wear resistance.

Keywords: SS316L, Al2O3, powder metallurgy, wear characterization

Procedia PDF Downloads 290
1820 Delineation of Oil-Polluted Sites in Ibeno LGA, Nigeria

Authors: Ime R. Udotong, Ofonime U. M. John, Justina I. R. Udotong

Abstract:

Ibeno, Nigeria hosts the operational base of Mobil Producing Nigeria Unlimited (MPNU), a subsidiary of ExxonMobil and currently the highest oil and condensate producer in Nigeria. Besides MPNU, other multinational oil companies such as Shell Petroleum Development Company Ltd, Elf Petroleum Nigeria Ltd and Nigerian Agip Energy, a subsidiary of ENI E&P, operate onshore, on the continental shelf and in the deep offshore of the Atlantic Ocean in Ibeno, Nigeria. This study was designed to survey the oil-impacted sites in Ibeno, Nigeria. A combination of electrical resistivity (ER) and ground penetrating radar (GPR) surveys as well as physico-chemical and microbiological characterization of soil and water samples from the area was carried out. The results revealed that the environment has been contaminated by past crude oil spills, as shown by significant concentrations of THC, BTEX and heavy metals. High resistivity values and GPR profiles clearly showing the distribution, thickness and lateral extent of hydrocarbon contamination, as represented in the radargram reflector tones, corroborate significant previous oil input. Contamination was of varying degrees, ranging from slight to high, indicating substantial attenuation of the crude oil contamination over time. Hydrocarbon pollution of the study area was confirmed by the results of the soil and water physico-chemical and microbiological analyses, and the observed THC levels are indicative of high crude oil contamination.
Moreover, the relatively lower resistivities of locations outside the impacted areas compared to those within them, the 3-D Cartesian images of the oil contaminant plume (depicted in red, light brown and magenta for high, low and very low oil-impacted areas, respectively), as well as counts of hydrocarbonoclastic microorganisms in excess of 1%, confirmed significant recent pollution of the study area.

Keywords: oil-polluted sites, physico-chemical analyses, microbiological characterization, geotechnical investigations, total hydrocarbon content

Procedia PDF Downloads 376
1819 Implementation of Quality Function Deployment to Incorporate Customer's Value in the Conceptual Design Stage of Construction Projects

Authors: Ayedh Alqahtani

Abstract:

Many construction firms in Saudi Arabia dedicated to building projects agree that the most important factor in the real estate market is the value they can give to their customers. These firms understand the value of their clients in different ways. Value can be defined as the size of the building project in relation to its cost, or as the design quality of the materials used in finish work or other features of the building's rooms, such as the bathroom. Value can also be understood as something suitable for the money the client is investing in the new property. A quality tool is required to support companies in achieving a solution for the building project and in understanding and managing the customer's needs. The Quality Function Deployment (QFD) method can play this role, since the main difference between QFD and other conventional quality management tools is that QFD is a valuable and very flexible design tool that takes into account the voice of the customer (VOC). Currently, organizations and agencies are seeking suitable models that deal better with uncertainty and that are flexible and easy to use. The primary aim of this research project is to incorporate the customer's requirements into the conceptual design of construction projects. Towards this goal, QFD is selected due to its capability to integrate design requirements to meet the customer's needs. To develop the QFD, this research focuses on the contribution of the different (significantly weighted) input factors that represent the main variables influencing QFD, and on the subsequent analysis of the techniques used to measure them. First, this research will review the literature to determine the current practice of QFD in construction projects. Then, the researcher will review the literature to identify the current customers of residential projects and gather information on customers' requirements for the design of residential buildings.
After that, qualitative survey research will be conducted to rank customers' needs and capture the views of stakeholder practitioners about how these needs can affect their satisfaction. Moreover, a qualitative focus group with the members of the design team will be conducted to determine the level of improvements and technical details for the design of residential buildings. Finally, the QFD will be developed to establish the degree of significance of the design solutions.
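The final degree-of-significance step is conventionally computed in QFD's House of Quality as a weighted sum: customer-requirement importance weights multiplied through the requirement-versus-solution relationship matrix. A minimal sketch with hypothetical weights and relationship strengths (not data from this project):

```python
import numpy as np

# Hypothetical customer requirements with importance weights (1-5 scale)
weights = np.array([5, 3, 4])  # e.g. room layout, finish quality, cost

# Relationship matrix: rows = customer needs, cols = design (technical)
# solutions, using the conventional 9/3/1/0 strength scale
R = np.array([
    [9, 3, 0],
    [1, 9, 3],
    [3, 0, 9],
])

tech_importance = weights @ R           # absolute importance of each solution
ranking = np.argsort(-tech_importance)  # most significant design solution first
print(tech_importance, ranking)
```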

Keywords: quality function deployment, construction projects, Saudi Arabia, quality tools

Procedia PDF Downloads 106
1818 Effects of Application of Rice Husk Charcoal-Coated Urea and Rice Straw Compost on Growth, Yield, and Soil Properties of Rice

Authors: D. A. S. Gamage, B. F. A Basnayake, W. A. J. M. de Costa

Abstract:

Rice is one of the world's most important cereals. Increasing food production, both to meet in-country requirements and to help overcome food crises, is one of the major issues facing Sri Lanka today. However, productive land is limited and has mostly been utilized either for food crop production or other uses. Agriculture plays an important and strategic role in the performance of the Sri Lankan national economy. A variety of modern agricultural inputs have been introduced, namely ploughs and harvesters, pesticides, fertilizers and lime, and several agricultural institutions are developing and updating the management of the agricultural sector. Modern agricultural inputs act as a catalyst in raising productivity. However, in the eagerness to gain profits from these efficient and productive techniques, modern agricultural inputs have affected the environment and living things, especially those inputs blended from various chemical substances. The increased pressure to maintain a high level of rice output for consumption has resulted in increased use of pesticides and inorganic fertilizer on rice fields in Sri Lanka, and the application of inorganic fertilizer has become a burden to the country in many ways. The excessive reuse of groundwater resources combined with the considerable application of organic and chemical fertilizers will lead to a deterioration in the quality and quantity of water. Biochar is a form of charcoal produced through the heating of natural organic materials. It has recently received significant attention for its potential as a soil conditioner, a fertilizer and a means of storing carbon in a sustainable manner. It is a good solution for managing agricultural wastes while providing a useful product for increasing agricultural productivity and protecting the environment.
The objective of this study was to evaluate rice husk charcoal-coated urea as a slow-releasing fertilizer and to compare the total N, P and K and organic matter in soil and the yield of rice production.

Keywords: biochar, paddy husk, soil conditioner, rice straw compost

Procedia PDF Downloads 339
1817 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters

Authors: K. Parandhama Gowd

Abstract:

The aim of this research paper is to conceptualize, discuss, analyze and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. This proposal is expected to serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern communication and information technologies with large bandwidth are now available, coupled with faster computer processing techniques, the failsafe recording technique attempted in this paper is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.

Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)

Procedia PDF Downloads 557
1816 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data

Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan

Abstract:

Clinical data analysis and forecasting have made great contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, where the positive samples make up only a minority. We investigate two meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine each of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole; the other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger datasets, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with practice on typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and a shorter running time compared with the brute-force method.
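For context, the core of SMOTE is simple: each synthetic minority sample is an interpolation between a random minority point and one of its k nearest minority neighbours. The paper's meta-heuristics tune parameters such as k and the oversampling amount, which are fixed constants in this illustrative sketch (not the authors' code):

```python
import numpy as np

def smote(X_min, n_synthetic, k=3, rng=None):
    """Minimal SMOTE: interpolate each new sample between a random minority
    point and one of its k nearest minority neighbours."""
    if rng is None:
        rng = np.random.default_rng(0)
    out = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nn = np.argsort(d)[1:k + 1]  # k nearest neighbours (skip self at index 0)
        j = rng.choice(nn)
        gap = rng.random()           # random point along the connecting segment
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)

# Toy minority class of 5 points; generate 15 synthetic samples to balance it
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
X_new = smote(X_min, n_synthetic=15)
print(X_new.shape)
```

In the adaptive scheme described above, a swarm algorithm would repeatedly re-run a step like this on one data segment at a time, scoring candidate (k, amount) pairs by classifier performance.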

Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data

Procedia PDF Downloads 428
1815 Factors Influencing the Logistics Services Providers' Performance: A Literature Overview

Authors: A. Aguezzoul

Abstract:

The selection and performance evaluation of Logistics Service Providers (LSPs) is a strategic decision that affects the overall performance of any company as well as its supply chain. It is a complex process that takes into account various conflicting quantitative and qualitative factors, as well as the outsourced logistics activities. This article focuses on the evolution of the weights associated with these factors over recent years, in order to better understand the change in the importance that logistics professionals place on these criteria when choosing their LSPs. To this end, 17 main studies published during the 2014-2017 period were analyzed and the results compared to those of a previous literature review on this subject. Our analysis led to the following observations: 1) LSP selection is a multi-criteria process; 2) the majority of studies are empirical, conducted particularly in Asian countries; 3) the importance of the criteria has undergone significant changes following the emergence of information technologies, which have favored close collaboration and partnership between LSPs and their customers, even on a worldwide scale; 4) the cost criterion is relatively less important than in the past; and finally 5) with the development of sustainable supply chains, the factors associated with the logistics activities of returns and waste processing (reverse logistics) are becoming increasingly important in this multi-criteria process of selecting LSPs and evaluating their performance.
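A common way to operationalize such a multi-criteria selection is a simple weighted-sum score over normalized criteria. The weights below merely echo the trend noted above (cost down-weighted, IT capability and reverse logistics up-weighted) and are illustrative, not values drawn from the reviewed studies:

```python
# Hypothetical LSP evaluation: weighted sum over normalized criteria scores.
criteria = ["cost", "quality", "IT capability", "reverse logistics"]
weights = [0.15, 0.30, 0.30, 0.25]  # must sum to 1

providers = {
    "LSP-A": [0.9, 0.7, 0.6, 0.5],  # scores normalized to [0, 1]
    "LSP-B": [0.6, 0.8, 0.9, 0.8],
}

scores = {name: sum(w * s for w, s in zip(weights, vals))
          for name, vals in providers.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

Real selections often replace the plain weighted sum with AHP, TOPSIS or similar multi-criteria decision-making methods, but the weighting step they all share is what the surveyed weight shifts affect.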

Keywords: logistics outsourcing, logistics providers, multi-criteria decision making, performance

Procedia PDF Downloads 138
1814 Teaching Business Process Management using IBM’s INNOV8 BPM Simulation Game

Authors: Hossam Ali-Hassan, Michael Bliemel

Abstract:

This poster reflects on our experiences using INNOV8, IBM’s Business Process Management (BPM) simulation game, in online MBA and undergraduate MIS classes over a period of two years. The game is designed to give both business and information technology players a better understanding of how effective BPM impacts an entire business ecosystem. It includes three scenarios: Smarter Traffic, where players evaluate existing traffic patterns and re-route traffic based on incoming metrics; Smarter Customer Service, where players develop more efficient ways to respond to customers in a call centre environment; and Smarter Supply Chains, where players balance supply and demand and reduce environmental impact in a traditional supply chain model. We use the game as an experiential learning tool in which students act as managers making real-time changes to business processes to meet changing business demands and environments. The students learn how information technology (IT) and information systems (IS) can be used to intelligently solve different problems, and how computer simulations can be used to test different scenarios or models of business decisions without having to make potentially costly and/or disruptive changes to actual business processes. Moreover, as students play the three scenarios, they quickly see how practical process improvements can help meet profitability, customer satisfaction and environmental goals while addressing real problems faced by municipalities and businesses today. After spending approximately two hours in the game, students reflect on the experience by applying several BPM principles presented in their textbook through a structured set of assignment questions. For each final scenario, students submit a screenshot of their solution followed by one paragraph explaining what criteria they were trying to optimize and why they picked their input variables.
In this poster we outline the course and the module’s learning objectives to place the use of the game in context. We illustrate key features of the INNOV8 simulation game and describe how we used them to reinforce theoretical concepts. The poster also illustrates examples from the simulation, the assignment, and the learning outcomes.

Keywords: experiential learning, business process management, BPM, INNOV8, simulation, game

Procedia PDF Downloads 318
1813 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar

Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna

Abstract:

This paper presents the analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), RMA mitigates space-variant defocusing and geometric distortion effects, since it does not assume that the illuminating wave-fronts are planar. This makes RMA suitable for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range de-skew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the imaging geometry on the resolution of the final image. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated.

Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation

Procedia PDF Downloads 225
1812 Efficient Degradation of Perfluorooctanoic Acid, an Emerging Contaminant, by a Hybrid Membrane Distillation and Electro-Fenton Process

Authors: Afrouz Yousefi, Mohtada Sadrzadeh

Abstract:

The widespread presence of poly- and perfluoroalkyl substances (PFAS) is a significant concern due to their ability to accumulate in living organisms and their persistence in the environment, owing to their robust carbon-fluorine (C-F) bonds, which require substantial energy to break (485 kJ/mol). The prevalence of toxic PFAS compounds can be highly detrimental to ecosystems, wildlife, and human health, and ongoing efforts are dedicated to methods for fully breaking down and eliminating PFAS from the environment. Among the various techniques employed, advanced oxidation processes have shown promise in completely degrading emerging contaminants in wastewater. Their drawback lies in relatively slow reaction rates and the substantial energy input required, which currently impedes their widespread commercial adoption. We developed a hybrid process, comprising electro-Fenton (EF) as an advanced oxidation process and membrane distillation (MD), to simultaneously degrade organic PFAS pollutants and extract pure water from the mixture. In this study, environmentally persistent perfluorooctanoic acid (PFOA) was used as an emerging contaminant to study the effectiveness of the electro-Fenton/membrane distillation hybrid system. The PFOA degradation studies were conducted in two modes: electro-Fenton alone and electro-Fenton coupled with membrane distillation. High-performance liquid chromatography with ultraviolet detection (HPLC-UV), ion chromatography (measuring fluoride ion concentration), total organic carbon (TOC) decay, mineralization current efficiency (MCE), and specific energy consumption (SEC) were evaluated for the single EF and hybrid EF-MD processes. In contrast to the single EF reaction, TOC decay improved significantly in the EF-MD process. Overall, the MCE of the hybrid process surpassed 100%, while it remained under 50% for the single EF reaction. Calculations of specific energy consumption demonstrated a substantial decrease of nearly one-third in energy usage when integrating the EF reaction with the MD process.
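The two figures of merit can be computed as follows. The MCE expression follows the form common in the electro-Fenton literature; every numeric value below (electron count, current, voltage, TOC decay, etc.) is an assumed example for illustration, not the paper's data.

```python
# Mineralization current efficiency (MCE) and specific energy consumption
# (SEC) for an electro-Fenton run; all inputs are illustrative assumptions.
F = 96485.0        # Faraday constant (C/mol)
n_e = 34           # electrons assumed for full mineralization (illustrative)
m_C = 8            # carbon atoms in PFOA (C8HF15O2)
V = 0.2            # solution volume (L)
d_toc = 50.0       # experimental TOC decay (mg/L)
I = 0.3            # applied current (A)
t = 4.0            # electrolysis time (h)
E_cell = 8.0       # average cell voltage (V)

# MCE (%) = n F V dTOC / (4.32e7 m I t) * 100, with 4.32e7 = 3600 * 12000
mce = 100 * n_e * F * V * d_toc / (4.32e7 * m_C * I * t)

# SEC in kWh per gram of TOC removed
sec = (E_cell * I * t / 1000) / (V * d_toc / 1000)

print(round(mce, 2), round(sec, 2))   # 7.91 0.96
```

With these illustrative numbers the run mineralizes TOC at about 7.9% current efficiency and spends roughly 0.96 kWh per gram of TOC removed; the abstract's hybrid EF-MD process improves both figures relative to EF alone.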

Keywords: water treatment, PFAS, membrane distillation, electro-Fenton, advanced oxidation

Procedia PDF Downloads 47
1811 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to their multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created from the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of the presence/value of clinical conditions on the predicted outcome. Like the (log) odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman’s rho = 0.74), which helped validate our explanation.
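The validation step, correlating impact scores with LR log odds ratios, amounts to a Spearman rank correlation. A minimal sketch on synthetic data (not the study's cohort) is shown below; the helper assumes untied values.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no ties assumed): Pearson correlation
    of the rank vectors."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Synthetic illustration: a monotone-but-noisy relation between LR log
# odds ratios and DNN impact scores, mimicking a strong correlation.
rng = np.random.default_rng(42)
log_or = rng.normal(0, 1, 200)
impact = log_or + rng.normal(0, 0.5, 200)
print(round(spearman_rho(log_or, impact), 2))
```

Because Spearman's rho compares only rankings, it tolerates the monotone non-linearities a DNN introduces, which makes it a natural check that impact scores and odds ratios order the clinical predictors similarly.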

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 139
1810 Nagabhasma Preparation and Its Effect on Kidneys: A Histopathological Study

Authors: Lydia Andrade, Kumar M. R. Bhat

Abstract:

Heavy metals, especially lead, are considered multi-organ toxicants. Nevertheless, such heavy metals are used in the preparation of traditional medicines. Nagabhasma is one such traditional medicine, with lead as the metal used in its preparation. Lead is converted into a health-beneficial organometallic compound when subjected to various traditional methods of purification. This study is therefore designed to evaluate the effect of lead processed through the various stages of traditional Nagabhasma preparation on the histological structure of the kidneys. Using human-equivalent doses of Nagabhasma, preparations from the various stages were fed orally to rats for 30 and 60 days (short term and long term). The treated and untreated rats were then sacrificed for the collection of the kidneys, which were processed for histopathological study. The results show severe changes in the histological structure of the kidneys. The animals treated with lead acetate showed changes in the epithelial cells lining Bowman’s capsule. The proximal and distal convoluted tubules were dilated, leading to atrophy of their epithelial cells. Inflammatory infiltrates were more abundant in this group, and a few groups also showed pockets of inter-tubular hemorrhage. These changes, however, were minimized as the preparation progressed from stage 1 to stage 4 of Nagabhasma preparation. Therefore, it is necessary to stringently monitor the processing of lead acetate during the preparation of Nagabhasma.

Keywords: heavy metals, kidneys, lead acetate, Nagabhasma

Procedia PDF Downloads 130
1809 Fatigue Analysis and Life Estimation of the Helicopter Horizontal Tail under Cyclic Loading by Using Finite Element Method

Authors: Defne Uz

Abstract:

The horizontal tail of a helicopter is exposed to repeated oscillatory loading generated by aerodynamic and inertial loads, as well as bending moments that depend on the operating conditions and maneuvers of the helicopter. In order to ensure that maximum stress levels do not exceed the fatigue limit of the material and to prevent damage, a numerical analysis approach can be utilized through the Finite Element Method. In this paper, therefore, fatigue analysis of a horizontal tail model is studied numerically to predict high-cycle and low-cycle fatigue life under the defined loading. The analysis estimates the stress field at stress concentration regions, such as around fastener holes, where the maximum principal stresses are considered for each load case. Critical element identification in the main load-carrying structural components with rivet holes is performed as a post-processing step, since critical regions with high stress values are used as the input for the fatigue life calculation. Once the maximum stress at the critical element and its mean and alternating components are obtained, they are compared with the endurance limit by applying the Soderberg approach, in which a constant-life straight line provides the limit for combinations of mean and alternating stresses. A life calculation based on the S-N (stress versus number of cycles) curve is also applied with fully reversed loading to determine the number of cycles corresponding to the oscillatory stress with zero mean. The results determine the adequacy of the model's design for fatigue strength and the number of cycles that the model can withstand at the calculated stress. The effect of correctly identifying the critical rivet holes is investigated by analyzing stresses at different structural parts of the model. In the case of a low life prediction, alternative design solutions are developed, and flight hours can be estimated for the fatigue-safe operation of the model.
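The Soderberg check described above can be worked through numerically. All material and stress values below are generic illustrative numbers, not the actual horizontal-tail data.

```python
# Soderberg criterion: sigma_a / S_e + sigma_m / S_y = 1 / n,
# where n is the fatigue safety factor at the critical element.
S_e = 120.0         # endurance limit (MPa), illustrative
S_y = 350.0         # yield strength (MPa), illustrative

sigma_max = 180.0   # max principal stress at a rivet hole (MPa), assumed
sigma_min = 20.0    # min stress over the load cycle (MPa), assumed

sigma_m = (sigma_max + sigma_min) / 2   # mean component = 100 MPa
sigma_a = (sigma_max - sigma_min) / 2   # alternating component = 80 MPa

n = 1.0 / (sigma_a / S_e + sigma_m / S_y)
print(round(n, 2))   # 1.05 -> the point lies just inside the Soderberg line
```

A safety factor above 1 means the combination of mean and alternating stress falls under the constant-life line; values near 1, as here, would prompt the alternative design solutions mentioned in the abstract.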

Keywords: fatigue analysis, finite element method, helicopter horizontal tail, life prediction, stress concentration

Procedia PDF Downloads 133
1808 Automatic Generation of CNC Code for Milling Machines

Authors: Chalakorn Chitsaart, Suchada Rianmora, Mann Rattana-Areeyagon, Wutichai Namjaiprasert

Abstract:

G-code is the main factor in a computer numerical control (CNC) machine for controlling the tool paths and generating the profile of the object’s features. To obtain high accuracy of the surface finish, non-stop operation is required of the CNC machine. Recently, product design strategies have favored changes that have a low impact on the business and do not consume a lot of resources, since the cost and time of designing minor changes can be reduced when the traditional geometric details of existing models are reused. To support this strategy as an alternative channel for machining operations, this research proposes the automatic generation of codes for CNC milling. This technique helps the manufacturer easily change the size and geometric shape of the product during operation, reducing the time spent setting up and processing the machine. The algorithm, implemented on the MATLAB platform, is developed by analyzing and evaluating the geometric information of the part, and codes are created rapidly to control the operations of the machine. Compared to codes obtained from CAM software, the developed algorithm can quickly generate and simulate the cutting profile of the part.
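The idea of emitting G-code directly from geometric parameters can be sketched as below. This Python fragment is our own minimal illustration in the spirit of the abstract's MATLAB tool (function and parameter names are hypothetical); it traces the contour of a rectangle with rapid moves (G00) and linear cuts (G01).

```python
def rectangle_gcode(x0, y0, w, h, z_cut, feed):
    """Emit G-code that mills the outline of a w-by-h rectangle whose
    lower-left corner is at (x0, y0), cutting at depth z_cut."""
    pts = [(x0, y0), (x0 + w, y0), (x0 + w, y0 + h), (x0, y0 + h), (x0, y0)]
    lines = ["G90 G21",                                  # absolute mode, mm
             "G00 Z5.000",                               # retract
             f"G00 X{pts[0][0]:.3f} Y{pts[0][1]:.3f}",   # rapid to start
             f"G01 Z{z_cut:.3f} F{feed:.0f}"]            # plunge at feed rate
    for x, y in pts[1:]:
        lines.append(f"G01 X{x:.3f} Y{y:.3f}")           # cut each edge
    lines.append("G00 Z5.000")                           # retract when done
    return "\n".join(lines)

print(rectangle_gcode(10, 10, 40, 20, -2.0, 150))
```

Changing the size of the feature then amounts to changing `w` and `h` and regenerating the program, which is exactly the kind of quick resizing the abstract targets.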

Keywords: geometric shapes, milling operation, minor changes, CNC Machine, G-code, cutting parameters

Procedia PDF Downloads 334
1807 Information Theoretic Approach for Beamforming in Wireless Communications

Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif

Abstract:

Beamforming is a signal processing technique extensively utilized in wireless communications and radar for desired-signal intensification and interference minimization through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function based on a-priori information about the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI serves as an index to identify the trade-off between information gain, SINR, illumination time and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, which is demonstrated through comparative analysis with conventional methods. MI-based beamforming enhances signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
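To make the weight-vector idea concrete, the sketch below computes classical MVDR weights for a uniform linear array as a stand-in for the paper's MI-based optimization (which is not reproduced here); the array geometry and power levels are illustrative assumptions.

```python
import numpy as np

def steering(n, d_over_lambda, theta_deg):
    """Steering vector of an n-element ULA, element spacing d/lambda."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * d_over_lambda * k *
                  np.sin(np.radians(theta_deg)))

n = 8
a_sig = steering(n, 0.5, 0.0)    # desired source at broadside
a_int = steering(n, 0.5, 40.0)   # interferer at 40 degrees

# Interference-plus-noise covariance: strong interferer + unit noise.
R = 10.0 * np.outer(a_int, a_int.conj()) + np.eye(n)

# MVDR weights: w = R^-1 a / (a^H R^-1 a), unit gain toward the signal.
w = np.linalg.solve(R, a_sig)
w /= (a_sig.conj() @ w)

gain_sig = abs(w.conj() @ a_sig) ** 2   # exactly 1 by construction
gain_int = abs(w.conj() @ a_int) ** 2   # deep null on the interferer
print(gain_sig / gain_int > 1e3)        # True
```

The resulting pattern keeps unit gain in the look direction while placing a deep null on the interferer, which is the spatial selectivity the abstract's MI criterion also trades off against energy and illumination time.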

Keywords: beamforming, interference, mutual information, wireless communications

Procedia PDF Downloads 263
1806 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood, contributing to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in the area of cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, leading to substantial cost and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, the local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates signal processing with the sequencing depth data and comprises count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates a meaningful prediction. As is well known, OCRs are mostly located at the transcription start sites (TSS) of genes. We therefore compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, finding an accordance of around 52.04% with all genes and roughly 78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, which results in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
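The first steps of such a depth-based pipeline can be sketched on simulated data: normalize the per-bin coverage, smooth it by zeroing high DFT frequencies, and split bins into two clusters by a simple threshold. This toy stands in for the paper's graph-cut clustering, which is not reproduced; the coverage model (lower cfDNA depth over nucleosome-depleted open regions) and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
open_mask = np.zeros(n, bool)
open_mask[100:160] = open_mask[300:380] = True      # simulated OCR positions
# cfDNA coverage dips over open (nucleosome-depleted) regions, plus noise.
depth = np.where(open_mask, 5.0, 10.0) + rng.normal(0, 1.0, n)

z = (depth - depth.mean()) / depth.std()            # count normalization
F = np.fft.rfft(z)
F[30:] = 0                                          # keep low frequencies only
smooth = np.fft.irfft(F, n)                         # denoised depth signal

pred_open = smooth < smooth.mean()                  # two-way split by threshold
agree = (pred_open == open_mask).mean()
print(round(agree, 2))
```

Even this crude threshold recovers most of the planted regions, with errors concentrated at the region edges blurred by the low-pass step; the paper's linear-programming graph cut plays the role of the threshold here, on real depth tracks.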

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 132
1805 Quality Assurance in Higher Education: Doha Institute for Graduate Studies as a Case Study

Authors: Ahmed Makhoukh

Abstract:

Quality assurance (QA) has recently become a common practice, endorsed by most Higher Education (HE) institutions worldwide due to the pressure of internal and external forces. One of the aims of this quality movement is to make the contribution of university education to socio-economic development highly significant. This entails that graduates are currently required to have a high-quality profile, i.e., to be competent and to master the 21st-century skills needed in the labor market. This wave of change, mostly imposed by globalization, means that university education should be learner-centered in order to satisfy the different needs of students and meet the expectations of other stakeholders. Such a shift of focus onto student learning outcomes has led HE institutions to reconsider their strategic planning, their mission, the curriculum, and the pedagogical competence of the academic staff, among other elements. To ensure that overall institutional performance is on the right track, a QA system should be established to check regularly the extent to which the set standards of evaluation are respected. This operation of QA has the advantage of demonstrating the accountability of the institution, gaining the trust of the public through transparency, and enjoying international recognition. This is the case of the Doha Institute (DI) for Graduate Studies, in Qatar, the object of the present study. The significance of this contribution is to show that the conception of quality has changed in the digital age, and that a department responsible for QA should be integrated in every HE institution to ensure educational quality, empower learners and achieve academic leadership.
Thus, to examine QA at the DI for Graduate Studies, an elite university (in the academic sense) that focuses on a small and selected number of students, a qualitative method is adopted in the description and analysis of the data (document analysis). To investigate the extent to which QA is achieved at the Doha Institute for Graduate Studies, three broad indicators are evaluated: input, process and learning outcomes. This investigation is carried out in line with the UK Quality Code for Higher Education, represented by the Quality Assurance Agency (QAA).

Keywords: accreditation, higher education, quality, quality assurance, standards

Procedia PDF Downloads 133
1804 Non-Contact Measurement of Soil Deformation in a Cyclic Triaxial Test

Authors: Erica Elice Uy, Toshihiro Noda, Kentaro Nakai, Jonathan Dungca

Abstract:

Deformation in a conventional cyclic triaxial test is normally measured using a point-wise measuring device. In this study, a non-contact measurement technique was applied to monitor and measure the occurrence of non-homogeneous behavior of the soil under cyclic loading. The non-contact measurement is executed through image processing: two-dimensional measurements were performed using the Lucas-Kanade optical flow algorithm, implemented in LabVIEW. In this technique, the non-homogeneous deformation was monitored using a mirrorless camera, chosen because it is economical and can take pictures at a fast rate. The camera was first calibrated to remove the distortion introduced by the lens and by the testing environment. Calibration was divided into two phases: the first calibrated the camera parameters and the distortion caused by the lens; the second eliminated the distortion introduced by the triaxial plexiglass cell, from which a correction factor was established. A series of consolidated undrained cyclic triaxial tests was performed using a coarse soil. The results from the non-contact measurement technique were compared to the deformation measured by the linear variable displacement transducer. It was observed that deformation was higher in the area where failure occurs.
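The core of the Lucas-Kanade step is a small least-squares solve on image gradients. The pure-NumPy sketch below recovers a sub-pixel shift of a synthetic texture; it is our own single-window illustration, not the study's LabVIEW implementation, and the test pattern is arbitrary.

```python
import numpy as np

def lk_shift(im0, im1):
    """Single-window Lucas-Kanade: solve the 2x2 normal equations
    built from spatial gradients (Ix, Iy) and the temporal difference It."""
    Iy, Ix = np.gradient(im0)
    It = im1 - im0
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)          # (dx, dy) in pixels

# Smooth synthetic texture, displaced by +0.2 px in x between frames.
x = np.arange(64)
X, Y = np.meshgrid(x, x)
def pattern(X, Y):
    return np.sin(0.30 * X) * np.cos(0.25 * Y) + 0.5 * np.sin(0.15 * X + 0.20 * Y)

im0 = pattern(X, Y)
im1 = pattern(X - 0.2, Y)
dx, dy = lk_shift(im0, im1)
print(round(dx, 2), round(dy, 2))
```

In the actual test setup, the same solve would run per interrogation window across the specimen images, so non-homogeneous deformation shows up as a spatially varying (dx, dy) field rather than the single displacement reported by an LVDT.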

Keywords: cyclic loading, non-contact measurement, non-homogeneous, optical flow

Procedia PDF Downloads 290
1803 Stabilizing Additively Manufactured Superalloys at High Temperatures

Authors: Keivan Davami, Michael Munther, Lloyd Hackel

Abstract:

The control of properties and material behavior by thermal-mechanical processing is based on mechanical deformation and annealing according to a precise schedule that produces a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures; however, the mechanism(s) controlling this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhance the material’s resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and the yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers: dislocation configurations commonly recover by mechanisms such as climb and recombination, which act rapidly at high temperatures.
Furthermore, precipitates coarsen and grains grow, so virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive and the microstructure remains stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, the dislocations formed as a result of LP and the precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physics, materials science, and mechanical engineering concepts, and could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex interactions between dislocations, solute atoms and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the actual mechanisms involved in the novel cyclic LP/annealing process are elucidated through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help validate a novel laser processing technique for high-temperature applications, greatly expanding the applications of laser peening technology, which was originally devised only for temperatures below half of the melting temperature.

Keywords: laser shock peening, mechanical properties, indentation, high temperature stability

Procedia PDF Downloads 134