Search results for: waste processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6187

1927 Application of Box-Behnken Response Surface Design for Optimization of Essential Oil Based Disinfectant on Mixed Species Biofilm

Authors: Anita Vidacs, Robert Rajko, Csaba Vagvolgyi, Judit Krisch

Abstract:

Optimizing a new disinfectant can reduce both the number of tests required and the processing cost. Good sanitizers are eco-friendly and do not allow bacteria to evolve resistance. Essential oils (EOs) are natural antimicrobials, and most of them have Generally Recognized As Safe (GRAS) status. In our study, the effect of the EOs cinnamon, marjoram, and thyme was investigated against mixed-species bacterial biofilms of Escherichia coli, Listeria monocytogenes, Pseudomonas putida, and Staphylococcus aureus. The optimal EO concentration, disinfection time, and pH level were evaluated with the aid of a response surface Box-Behnken design (RSD) on 1-day and 7-day-old biofilms on metal, plastic, and wood surfaces. The variable factors ranged over 1-3 times the minimum bactericidal concentration (MBC), 10-110 minutes of acting time, and pH 4.5-7.5. The optimized EO disinfectant was compared to industrially used chemicals (HC-DPE, Hypo). The natural-based disinfectants were applicable, with acting times below 30 minutes. The EOs were able to eliminate the biofilm from the tested surfaces except for wood. The disinfection effect of the EO-based natural solutions was in most cases equivalent or superior to that of the chemical sanitizers used in the food industry.
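
For orientation, a minimal sketch of how a three-factor Box-Behnken design can be generated and scaled to the factor ranges quoted above (EO concentration 1-3×MBC, acting time 10-110 min, pH 4.5-7.5); the center-point count and coding are illustrative assumptions, not the authors' exact protocol:

```python
import itertools
import numpy as np

def box_behnken_3_factors(center_points=3):
    """Coded three-factor Box-Behnken design: each pair of factors
    at +/-1 while the third is held at 0, plus center replicates."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            run = [0, 0, 0]
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0, 0, 0]] * center_points)  # center replicates
    return np.array(runs, dtype=float)

# Factor ranges from the study: EO conc. (x MBC), time (min), pH
lows  = np.array([1.0, 10.0, 4.5])
highs = np.array([3.0, 110.0, 7.5])

coded = box_behnken_3_factors()
# Map coded levels (-1, 0, +1) onto the real factor ranges
real = (lows + highs) / 2 + coded * (highs - lows) / 2
print(real)  # 15 runs: 12 edge midpoints + 3 center points
```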

Keywords: biofilm, Box-Behnken design, disinfectant, essential oil

Procedia PDF Downloads 220
1926 Role of Geomatics in Architectural and Cultural Conservation

Authors: Shweta Lall

Abstract:

The intent of this paper is to demonstrate the role of computerized auxiliary sciences in advancing the desired and necessary alliance of historians, surveyors, topographers, and analysts in architectural conservation and management. Digital-era practices for recording architectural and cultural heritage in view of its preservation, dissemination, and planned development are discussed. Geomatics includes practices such as remote sensing, photogrammetry, surveying, Geographic Information Systems (GIS), and laser scanning technology. All of these resources support architectural and conservation applications, which are identified through the case studies analysed in this paper. The standardised outcomes and methodologies drawn from relevant case studies are listed and described. The main components of the geomatics methodology adapted to conservation are data acquisition, processing, and presentation. Geomatics is used in a wide range of activities involved in architectural and cultural heritage: damage and risk assessment, documentation, 3-D model construction, virtual reconstruction, spatial and structural decision-making analysis, and monitoring. This paper summarizes the capabilities and limitations of the geomatics field in architectural and cultural conservation. Policy-makers, urban planners, architects, and conservationists not only need answers to these questions but also need to apply them in a predictable, transparent, spatially explicit, and inexpensive manner.

Keywords: architectural and cultural conservation, geomatics, GIS, remote sensing

Procedia PDF Downloads 148
1925 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were used for training and recognition with the deep learning object detection algorithm YOLOv5. The trained speed bump identification program was then integrated with an in-vehicle camera system for live image capture during driving, with the images transmitted instantly to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the vehicle's electronically adjustable shock absorbers, a control system was devised to dynamically adapt the damping force just before encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
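
As a rough illustration of the monocular ranging principle described above (not the authors' calibrated implementation), distance can be estimated from a detected bounding box by similar triangles, given the camera focal length in pixels and the real width of a speed bump; the numbers below are assumed placeholders:

```python
def monocular_distance(bbox_width_px, real_width_m, focal_px):
    """Pinhole-camera similar triangles: distance = f * W / w."""
    return focal_px * real_width_m / bbox_width_px

# Assumed values for illustration: a 3.0 m wide rubber speed bump
# detected at 400 px width by a camera with a 1000 px focal length.
d = monocular_distance(bbox_width_px=400, real_width_m=3.0, focal_px=1000)
print(f"Estimated distance: {d:.1f} m")  # -> 7.5 m

# The controller would switch the adjustable shock absorber once d
# falls below the control distance established in the study.
```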

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride quality

Procedia PDF Downloads 67
1924 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process

Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma

Abstract:

As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes that offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus, validated against experimental data. The reaction chemistry and equilibrium conditions were established based on previous literature and HSC Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, and raw materials) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets a foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
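
A minimal sketch of the final economic step, computing a cost-of-CAM figure and NPV from annualized cash flows; all figures below are invented placeholders, not results from the Aspen Plus model:

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Placeholder inputs (illustrative only)
capex = 250e6               # $, total plant cost
annual_opex = 80e6          # $, raw materials, labor, utilities
annual_output_kg = 10e6     # kg CAM per year
price_per_kg = 25.0         # $/kg selling price

cost_of_cam = annual_opex / annual_output_kg   # $/kg, variable-cost basis
yearly_profit = annual_output_kg * price_per_kg - annual_opex
flows = [-capex] + [yearly_profit] * 10        # 10-year horizon
print(f"Cost of CAM: {cost_of_cam:.2f} $/kg")
print(f"NPV at 8% discount rate: {npv(0.08, flows) / 1e6:.1f} M$")
```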

Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis

Procedia PDF Downloads 100
1923 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory results, and pharmacy records. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from its large volume, the velocity with which it accumulates, and its variety of structured, semi-structured, and unstructured forms. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop are the Hadoop Distributed File System (HDFS), which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), and HBase (a columnar data store). In this paper, we analyze how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
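
To make the MapReduce model concrete, a minimal Hadoop Streaming mapper/reducer pair in Python that counts diagnosis codes across patient records; the record layout (CSV with the diagnosis code in column 3) is an assumption for illustration:

```python
#!/usr/bin/env python3
# mapper.py -- emits (diagnosis_code, 1) for each input record.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split(",")
    if len(fields) > 2:
        print(f"{fields[2]}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sums counts per diagnosis code (input arrives sorted by key).
import sys

current_key, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key:
        if current_key is not None:
            print(f"{current_key}\t{count}")
        current_key, count = key, 0
    count += int(value)
if current_key is not None:
    print(f"{current_key}\t{count}")
```

Such scripts run on a cluster with the standard Hadoop Streaming jar, passing them via its -mapper and -reducer options.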

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 413
1922 Pore Pressure and In-situ Stress Magnitudes with Image Log Processing and Geological Interpretation in the Haoud Berkaoui Hydrocarbon Field, Northeastern Algerian Sahara

Authors: Rafik Baouche, Rabah Chaouchi

Abstract:

This work reports the first comprehensive stress field interpretation from eleven recently drilled wells in the Berkaoui Basin, Algerian Sahara. A cumulative length of more than 7000 m of acoustic image logs from six vertical wells was investigated, and a mean NW-SE (128°-145° N) maximum horizontal stress (SHMax) orientation is inferred from the B-D quality wellbore breakouts. The study integrates a log-based approach with downhole measurements to infer pore pressure and in-situ stress magnitudes. Vertical stress (Sv), interpreted from the bulk-density profiles, has an average gradient of 22.36 MPa/km. The Ordovician and Cambrian reservoirs have a pore pressure gradient of 13.47-13.77 MPa/km, which exceeds the hydrostatic pressure regime. A minimum horizontal stress (Shmin) gradient of 17.2-18.3 MPa/km is inferred from the fracture closure pressure in the reservoirs. Breakout widths constrained the SHMax magnitude to the 23.8-26.5 MPa/km range. The subsurface stress distribution in central Saharan Algeria indicates that the present-day stress field in the Berkaoui Basin is principally strike-slip faulting (SHMax > Sv > Shmin). Inferences are drawn for the regional stress pattern and for drilling and reservoir development.
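
To illustrate how a vertical-stress gradient like the 22.36 MPa/km above is obtained from a bulk-density log (a standard calculation, not the authors' exact workflow), Sv is the depth integral of ρ(z)·g; the density values below are assumed for illustration:

```python
import numpy as np

g = 9.81  # m/s^2

# Assumed bulk-density log: depth (m) and density (kg/m^3)
depth = np.array([0, 500, 1000, 2000, 3000], dtype=float)
rho   = np.array([2100, 2250, 2350, 2450, 2550], dtype=float)

# Sv(z) = integral of rho * g dz, trapezoidal rule over the log
sv = np.concatenate(([0.0], np.cumsum(
    0.5 * (rho[1:] + rho[:-1]) * g * np.diff(depth))))  # Pa

grad = sv[-1] / depth[-1] / 1e6 * 1000  # average gradient in MPa/km
print(f"Average Sv gradient: {grad:.2f} MPa/km")  # ~23 MPa/km here
```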

Keywords: stress, imagery, breakouts, Sahara

Procedia PDF Downloads 75
1921 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network

Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao

Abstract:

The lack of data is a key problem in medical image analysis with convolutional neural networks (CNNs). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms such as negative transformation, histogram equalization, power-law correction, sharpening, averaging, and Gaussian blurring help generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, a collection of knee images from openly available online sources. The enhanced SSD network is used for detection and localization of the knee joint in a given X-ray image. It is a variant of the well-known SSD network, modified to reduce the number of prediction boxes at the output side, and consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. The localization boundary obtained with spatial augmentation is also comparatively more refined and closer to the ground truth, giving better detection and localization of the knee joint.
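
A minimal sketch of the kinds of intensity and spatial transforms listed above, using OpenCV and NumPy; the parameter values and file path are illustrative assumptions, not the tuned settings used for the OpenKnee experiments:

```python
import cv2
import numpy as np

def augment(img):
    """Return several transformed variants of a grayscale X-ray (uint8)."""
    out = {}
    out["negative"] = 255 - img
    out["hist_eq"] = cv2.equalizeHist(img)
    gamma = 0.6  # power-law (gamma) correction, assumed value
    out["power_law"] = np.uint8(255 * (img / 255.0) ** gamma)
    kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]])
    out["sharpen"] = cv2.filter2D(img, -1, kernel)
    out["average"] = cv2.blur(img, (5, 5))
    out["gaussian"] = cv2.GaussianBlur(img, (5, 5), sigmaX=1.5)
    return out

img = cv2.imread("knee_xray.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
samples = augment(img)  # each variant becomes an extra training sample
```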

Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, OpenKnee, spatial transformations

Procedia PDF Downloads 154
1920 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM

Authors: Rajpal Kaur, Pooja Choudhary

Abstract:

Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based offline signature recognition system trained with low-resolution scanned signature images. The signature is an important biometric attribute of a human being and can be used to authenticate identity; it can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition, and the many techniques defined for the task leave ample scope for research. In this paper, offline (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The offline signature verification and recognition system is implemented on the MATLAB platform. The work has been tested and found suitable for its purpose, and the proposed method performs better than other recently proposed methods.
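
For orientation only, a sketch of the SURF keypoint-extraction stage using OpenCV rather than the paper's MATLAB implementation; SURF is patented and lives in the opencv-contrib "nonfree" module, so it is only available in builds compiled with nonfree support, and the HMM verification stage is not shown:

```python
import cv2

# Requires an opencv-contrib build with OPENCV_ENABLE_NONFREE
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

img = cv2.imread("signature.png", cv2.IMREAD_GRAYSCALE)  # scanned signature
# Binarize to suppress scanner background noise
img = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1]

keypoints, descriptors = surf.detectAndCompute(img, None)
print(len(keypoints), "keypoints; descriptor shape:", descriptors.shape)
# The 64-D SURF descriptors would then feed the HMM-based
# verification stage described in the abstract.
```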

Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM

Procedia PDF Downloads 384
1919 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals

Authors: Naser Safdarian, Nader Jafarnia Dabanloo

Abstract:

In this paper, we use four features, namely the Q-wave integral, QRS complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focuses on detection and localization of MI in the standard 12-lead ECG. The Q-wave and T-wave integrals are used because these features are important indicators in MI detection. For detection and localization we use pattern recognition methods such as artificial neural networks (ANNs), which offer good accuracy for classifying normal and abnormal signals. We use a type of radial basis function (RBF) network called the probabilistic neural network (PNN) because of its nonlinearity, together with other classifiers such as k-nearest neighbors (KNN), multilayer perceptron (MLP), and naive Bayes. The PhysioNet database provided our training and test data. We reached over 80% accuracy on test data for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be improved further by adding more features. In summary, a simple method based on only four features extracted from the standard ECG is presented, with good accuracy in MI localization.
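
A minimal sketch of the feature-extraction step, computing wave integrals by numerical integration between annotated wave boundaries; the boundary indices, file name, and sampling rate are assumptions for illustration:

```python
import numpy as np

def wave_integral(signal, fs, start, end):
    """Area under the ECG curve between two sample indices."""
    t = np.arange(start, end) / fs
    return np.trapz(signal[start:end], t)

fs = 360                          # Hz, e.g. MIT-BIH sampling rate
ecg = np.loadtxt("lead_II.txt")   # placeholder single-lead recording

# Assumed, illustrative wave boundaries (sample indices) for one beat
features = {
    "q_integral":   wave_integral(ecg, fs, 100, 120),
    "qrs_integral": wave_integral(ecg, fs, 100, 150),
    "t_integral":   wave_integral(ecg, fs, 200, 260),
    "total":        wave_integral(ecg, fs, 100, 260),
}
# These four features per lead form the input vector for the
# PNN / KNN / MLP / naive Bayes classifiers compared in the paper.
```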

Keywords: ECG signal processing, myocardial infarction, feature extraction, pattern recognition

Procedia PDF Downloads 456
1918 Corpus-Based Analysis on the Translatability of Conceptual Vagueness in Traditional Chinese Medicine Classics Huang Di Nei Jing

Authors: Yan Yue

Abstract:

Huang Di Nei Jing (HDNJ) is one of the most significant traditional Chinese medicine (TCM) classics, laying the foundation of TCM theory and practice. It is an important work for studying the ancient civilization and medical history of China. The language of HDNJ is highly concise and vague, and notably challenging to translate. This paper investigates the translatability of one particular kind of vagueness in HDNJ: conceptual vagueness that carries Chinese philosophical and cultural connotations. The corpus tool Sketch Engine is used to provide potential online contexts and word behaviors. Two selected English translations of HDNJ, one by a TCM practitioner and one by a non-practitioner, are used to examine the frequency and distribution of linguistic features of the translations. It was found that the hypothesis about the universals of translated language (explicitation, normalisation) holds in one translation, but at the sacrifice of some original contextual connotations. Transliteration is purposefully used in the second translation to retain the original flavor, which is argued to violate the principle of relevance in communication because it yields few contextual effects and demands more processing effort from the reader. The translatability of conceptual vagueness in HDNJ is thus constrained by the source-language context and the reader's cognitive environment.
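
As a simple stand-in for the frequency-and-distribution comparison (Sketch Engine itself is a web service), the same idea can be sketched in plain Python over two translation text files; the file names and example terms are placeholders:

```python
from collections import Counter
import re

def freq_profile(path):
    """Lowercased token frequencies for one translation."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    return Counter(re.findall(r"[a-z']+", text))

practitioner = freq_profile("hdnj_practitioner.txt")          # placeholder
non_practitioner = freq_profile("hdnj_non_practitioner.txt")  # placeholder

# Compare how often a transliterated concept term appears in each,
# e.g. 'qi' kept as pinyin versus explicitated as 'vital energy'
for term in ("qi", "vital", "energy"):
    print(term, practitioner[term], non_practitioner[term])
```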

Keywords: corpus-based translation, translatability, TCM classics, vague language

Procedia PDF Downloads 377
1917 Al2O3-Dielectric AlGaN/GaN Enhancement-Mode MOS-HEMTs by Using Ozone Water Oxidization Technique

Authors: Ching-Sung Lee, Wei-Chou Hsu, Han-Yin Liu, Hung-Hsi Huang, Si-Fu Chen, Yun-Jung Yang, Bo-Chun Chiang, Yu-Chuang Chen, Shen-Tin Yang

Abstract:

AlGaN/GaN high electron mobility transistors (HEMTs) have been intensively studied due to their intrinsic advantages of high breakdown electric field, high electron saturation velocity, and excellent chemical stability. They are also suitable for ultra-violet (UV) photodetection due to the corresponding wavelengths of the GaN bandgap. To improve optical responsivity by decreasing the dark current caused by gate leakage and the limited Schottky barrier heights of GaN-based HEMT devices, various metal-oxide-semiconductor HEMTs (MOS-HEMTs) have been devised using atomic layer deposition (ALD), molecular beam epitaxy (MBE), metal-organic chemical vapor deposition (MOCVD), liquid phase deposition (LPD), and RF sputtering. The gate dielectrics include MgO, HfO2, Al2O3, La2O3, and TiO2. To enable complementary circuit operation, enhancement-mode (E-mode) devices have lately been studied using fluorine treatment, p-type capping, piezo-neutralization layers, and MOS-gate structures. This work reports an Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMT design using a cost-effective ozone water oxidization technique. The present ozone oxidization method has the advantages of low-cost processing facilities, processing simplicity, compatibility with device fabrication, and room-temperature operation under atmospheric pressure. It can further reduce the gate-to-channel distance and improve the transconductance (gm) gain for a specific oxide thickness, since the formation of the Al2O3 consumes part of the AlGaN barrier at the same time. The epitaxial structure of the studied devices was grown by MOCVD. On a Si substrate, the layer structure comprises a 3.9 µm C-doped GaN buffer, a 300 nm GaN channel layer, and a 5 nm Al0.25Ga0.75N barrier layer. Mesa etching was performed to provide electrical isolation using an inductively coupled-plasma reactive ion etcher (ICP-RIE). Ti/Al/Au were thermally evaporated and annealed to form the source and drain ohmic contacts. The device was immersed in H2O2 solution pumped with ozone gas generated by an OW-K2 ozone generator. Ni/Au were deposited as the gate electrode to complete the MOS-HEMT fabrication. The formed Al2O3 oxide thickness is 7 nm, and the remaining AlGaN barrier thickness is 2 nm. A reference HEMT device was also fabricated on the same epitaxial structure for comparison. The gate dimensions are 1.2 × 100 µm² with a source-to-drain spacing of 5 µm for both devices. The dielectric constant (k) of the Al2O3 was characterized as 9.2 by C-V measurement. Reduced interface state density after oxidization has been verified by low-frequency noise spectra, Hooge coefficients, and pulsed I-V measurement. Improved device characteristics at temperatures of 300 K-450 K have been achieved for the present MOS-HEMT design. Consequently, Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMTs made by the ozone water oxidization method are reported. In comparison with a conventional Schottky-gate HEMT, the MOS-HEMT design has demonstrated excellent enhancements of 138% (176%) in gm,max, 118% (139%) in IDS,max, and 53% (62%) in BVGD, with a 3 (2)-order reduction in IG leakage at VGD = -60 V at 300 (450) K. This work is promising for millimeter-wave integrated circuit (MMIC) and three-terminal active UV photodetector applications.

Keywords: MOS-HEMT, enhancement mode, AlGaN/GaN, passivation, ozone water oxidation, gate leakage

Procedia PDF Downloads 263
1916 The Potential Role of Industrialized Building Systems in Malaysian Sustainable Construction: Awareness and Barriers

Authors: Aawag Mohsen Al-Awag, Wesam Salah Alaloul, M. S. Liew

Abstract:

An industrialized building system (IBS) is a method of construction with concentrated practices, consisting of techniques, products, and a set of linked elements that operate collectively to accomplish objectives. IBS has been recognised as a viable method for improving overall construction performance in terms of quality, cost, safety and health, waste reduction, and productivity. The Malaysian construction industry is considered one of the contributors to the development of the country, yet the acceptance level of IBS is still below government expectations, and the Malaysian government has been continuously encouraging the industry to use and implement IBS. Conventional systems have several drawbacks, including project delays, low economic efficiency, excess inventory, and poor product quality. When it comes to implementing IBS, construction companies still face several obstacles and problems, notably contractual and procurement concerns, which lead to the low adoption of IBS in Malaysia. Barriers to the acceptance of IBS technology remain, centred on awareness of historical failures and the risks connected with IBS practices. Therefore, the transformation from existing conventional building systems to industrialized building systems is needed more than ever. The flexibility of IBS in Malaysia's construction industry is very low due to numerous shortcomings and obstacles; however, due to its environmental, economic, and social benefits, IBS could play a significant role in the Malaysian construction industry in the future. This paper concentrates on the potential role of IBS in sustainable construction practices in Malaysia. It also highlights the awareness, barriers, advantages, and disadvantages of IBS in the construction sector. The study concludes with recommendations for Malaysian construction stakeholders to encourage and increase the utilization of industrialised building systems.

Keywords: construction industry, industrialized building system, barriers, advantages and disadvantages, construction, sustainability, Malaysia

Procedia PDF Downloads 103
1915 Using Ring-Opening Polymerization for Melt Processing of Cellulose Nanowhisker from Coconut Husk Fibers-Filled Polylactide-Based Nanocomposites

Authors: Imam Wierawansyah Eltara, Iftitah, Agus Ismail

Abstract:

In the present work, cellulose nanowhiskers (CNW) extracted from coconut husk fibers were incorporated into polylactide (PLA)-based composites. Prior to blending, PLA chains were chemically grafted onto the surface of the CNW to enhance compatibilization between the CNW and the hydrophobic polyester matrix. Ring-opening polymerization of L-lactide was initiated from the hydroxyl groups available at the CNW surface to yield CNW-g-PLA nanohybrids. The PLA-based nanocomposites were prepared by melt blending, preserving the green concept of the study by limiting the use of organic solvents. The influence of the PLA-grafted cellulose nanoparticles on the mechanical and thermal properties of the resulting nanocomposites was investigated in depth. The thermal behavior and mechanical properties of the nanocomposites were determined using differential scanning calorimetry (DSC) and dynamic mechanical and thermal analysis (DMTA), respectively. The results evidenced that the chemical grafting of the CNW enhances their compatibility with the polymeric matrix and thus improves the final properties of the nanocomposites. Large modifications of the crystalline properties, such as the crystallization half-time, were evidenced according to the nature of the PLA matrix and the nanofiller content.

Keywords: cellulose nanowhiskers, nanocomposites, coconut husk fiber, ring opening polymerization

Procedia PDF Downloads 317
1914 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques

Authors: Ved Kulkarni, Karthik Kini

Abstract:

This experiment aims to understand how the media affects power markets in the mainland United States and to study the reaction time between news updates and actual price movements. We take into account electric utility companies trading on the NYSE and exclude companies that are more politically involved and move with higher sensitivity to politics. A scraper checks for any news related to predefined keywords stored for each specific company. Based on this, a classifier allocates the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the respective price movement is then studied to understand the response time. Based on the observed response time, neural networks are trained to understand and react to changing market conditions, aiming for the best strategy in every market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market response time to each price movement.
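
A toy sketch of the five-way classification step; the abstract does not detail the classifier or keyword lists, so the lexicon and thresholds below are invented for illustration:

```python
POSITIVE = {"growth", "beat", "upgrade", "record", "profit"}
NEGATIVE = {"outage", "lawsuit", "downgrade", "loss", "fine"}

def classify(headline: str) -> str:
    """Crude lexicon-based five-way sentiment label for a headline."""
    words = set(headline.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos - neg >= 2:
        return "highly optimistic"
    if neg - pos >= 2:
        return "highly negative"
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify("Utility posts record profit after rate upgrade"))
# -> "highly optimistic"
```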

Keywords: data mining, language processing, artificial neural networks, sentiment analysis

Procedia PDF Downloads 17
1913 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images

Authors: Shahriar Farzam, Maryam Rastgarpour

Abstract:

Image denoising plays an extremely important role in digital image processing, and curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a contrast enhancement method for cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) implemented via the Unequally Spaced Fast Fourier Transform (USFFT). The transform returns a table of curvelet coefficients indexed by a scale parameter, an orientation, and a spatial location; accordingly, the coefficients obtained from FDCT-USFFT can be modified to enhance contrast in an image. Our proposed method first applies this two-dimensional transform, the FDCT via USFFT, to the input image and then thresholds the curvelet coefficients to enhance the CBCT image. Applying the unequally spaced fast Fourier transform leads to an accurate reconstruction of the image at high resolution. The experimental results indicate that the performance of the proposed method is superior to existing methods in terms of Peak Signal-to-Noise Ratio (PSNR) and the Effective Measure of Enhancement (EME).
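
FDCT-USFFT implementations (e.g., CurveLab) are not readily pip-installable, so the sketch below demonstrates the same coefficient-thresholding idea with a discrete wavelet transform via PyWavelets, plainly a stand-in; substituting a curvelet transform would follow the identical pattern:

```python
import numpy as np
import pywt

def transform_threshold_enhance(img, wavelet="db4", level=3, k=2.0):
    """Decompose, hard-threshold small detail coefficients, reconstruct."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    out = [approx]
    for cH, cV, cD in details:
        sigma = np.median(np.abs(cD)) / 0.6745  # robust noise estimate
        thr = k * sigma
        out.append(tuple(pywt.threshold(c, thr, mode="hard")
                         for c in (cH, cV, cD)))
    return pywt.waverec2(out, wavelet)

# img = a 2-D float array holding a CBCT slice
```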

Keywords: curvelet transform, CBCT, image enhancement, image denoising

Procedia PDF Downloads 300
1912 Incorporating Circular Economy into Passive Design Strategies in Tropical Nigeria

Authors: Noah G. Akhimien, Eshrar Latif

Abstract:

The natural environment is in urgent need of rescue due to dilapidation and the recession of resources. Passive design strategies have proven to be among the effective ways to reduce CO2 emissions and improve building performance. At the same time, material availability has dropped sharply due to a poor recycling culture, and building waste poses an environmental hazard because materials from construction and deconstruction go unrecycled. Buildings can be seen as material banks for a circular economy; therefore, incorporating the circular economy into passive housing will not only safeguard the climate but also improve resource efficiency. This study focuses on incorporating the circular economy into passive design strategies for affordable, energy- and resource-efficient residential buildings in Nigeria. Carbon dioxide (CO2) concentration is still increasing, and buildings are responsible for a significant share of this emission globally, so prompt measures are needed to combat global warming and its associated threats. Nigeria's population is growing rapidly while its resources have receded greatly, creating an urgent need for recycling, including in the built environment. It is necessary that Nigeria responds to these challenges effectively and efficiently with respect to building resources and energy. Passive design strategies were assessed using simulations to obtain qualitative and quantitative data, which were related to case studies relevant to the Nigerian climate. Building materials were analysed using the ReSOLVE model to explore possible recycling phases. This provided relevant information and strategies to illustrate the possibility of a circular economy in passive buildings. The study offers an alternative approach, applying the general principle of reworking an economy along ecological lines to passive housing by closing material loops in a circular economy.

Keywords: building, circular, efficiency, environment, sustainability

Procedia PDF Downloads 253
1911 Keratin Fiber Fabrication from Biowaste for Biomedical Application

Authors: Ashmita Mukherjee, Yogesh Harishchandra Kabutare, Suritra Bandyopadhyay, Paulomi Ghosh

Abstract:

Uncontrolled bleeding on the battlefield and in operating rooms can lead to serious injuries and trauma and can even be lethal. Keratin has been reported to be a haemostatic material that rapidly activates thrombin, followed by activation of fibrinogen, leading to the formation of insoluble fibrin. Platelets, the main initiators of haemostasis, are also reported to adhere to keratin. However, the major limitation of pure keratin as a biomaterial is its poor physical properties and correspondingly low mechanical strength. To overcome this problem, keratin was cross-linked with alginate to increase its mechanical stability. In our study, keratin extracted from feather waste showed a yield of 80.5% and a protein content of 8.05 ± 0.43 mg/mL (n=3). FTIR and CD spectroscopy confirmed the presence of the essential functional groups and the preservation of the secondary structures of keratin. The keratin was then cross-linked with alginate to make a dope, which was used to draw fibers of the desired diameter in a suitable coagulation bath using a customized wet-spinning setup. The resulting morphology of the keratin fibers was observed under a brightfield microscope. FTIR analysis showed the presence of both keratin and alginate peaks in the fibers, and cross-linking in the keratin-alginate fibers was confirmed by a rightward shift of the amide A and amide B peaks and the disappearance of the N-H stretching peak (1534.68 cm⁻¹). Blood was drawn into citrate vacutainers for whole-blood clotting tests and clotting kinetics, which showed that the keratin fibers accelerated blood coagulation compared with alginate fibers and tissue culture plates. Additionally, the cross-linked keratin-alginate fiber was found to have a lower haemolytic potential than the alginate fiber. Thus, keratin cross-linked fibers have potential applications in combating unrestrained bleeding.

Keywords: biomaterial, biowaste, fiber, keratin

Procedia PDF Downloads 194
1910 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study

Authors: Ana Serafimovic, Karthik Devarajan

Abstract:

Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations of reality. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of the Kullback-Leibler divergence, known as intrinsic information, and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also aim to minimize symmetric divergence measures.
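
For reference, a sketch of the standard multiplicative updates for KL-divergence NMF (Lee and Seung), the family of algorithms that the paper's symmetric-divergence method generalizes; this baseline does not reproduce the authors' symmetric updates:

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative-update NMF minimizing KL(V || WH)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / W.sum(axis=0, keepdims=True).T
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / H.sum(axis=1, keepdims=True).T
    return W, H

# V must be non-negative, e.g. a genes x samples expression matrix
```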

Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence

Procedia PDF Downloads 246
1909 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities, together with a ghost-target removal process, to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, applied to a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. We consider the scenario of multiple-target environments. The ghost-target removal process, based on the power after Doppler processing, is proposed to mitigate ghost detections and enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results on a ground-based pulse-Doppler radar model are presented to show the effectiveness of the proposed approach.
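
A toy sketch of the range-coincidence idea: each medium-PRF measurement gives the range modulo that PRF's unambiguous range, and candidate unfolded ranges that coincide across PRFs identify the true range (velocity unfolding on the Doppler axis works the same way). The PRF values and target range are invented for illustration:

```python
def unfold(ambiguous_r, r_unamb, n_max=30):
    """Candidate true ranges for one PRF: r + k * r_unamb."""
    return {round(ambiguous_r + k * r_unamb, 1) for k in range(n_max)}

# Assumed unambiguous ranges (km) for three medium PRFs, mapped to the
# ambiguous measurements of a single target whose true range is 57 km:
# 57 % 10 = 7, 57 % 13 = 5, 57 % 17 = 6
measurements = {10.0: 7.0, 13.0: 5.0, 17.0: 6.0}  # r_unamb -> measured r

candidates = [unfold(meas, ru) for ru, meas in measurements.items()]
coincident = set.intersection(*candidates)
print(coincident)  # -> {57.0}, the resolved range
```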

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 152
1908 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It reviews the procedures used for the acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey design and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently, resolving complex geological structures that were not accessible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
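
The linearized least-squares step mentioned above has a compact form; a sketch of one damped Gauss-Newton model update, where the forward() and jacobian() routines are assumed to be supplied by the 2-D resistivity modeling code:

```python
import numpy as np

def damped_gn_step(m, d_obs, forward, jacobian, lam=0.1):
    """One linearized least-squares update for resistivity inversion:
    m_new = m + (J^T J + lam * I)^-1 J^T (d_obs - f(m)).
    m holds model parameters (e.g. log-resistivities of cells);
    lam is the damping factor of the local optimization."""
    r = d_obs - forward(m)                    # data residual
    J = jacobian(m)                           # sensitivity matrix
    A = J.T @ J + lam * np.eye(len(m))        # damped normal equations
    return m + np.linalg.solve(A, J.T @ r)
```

Iterating this step, and reducing lam as the residual shrinks, is the Marquardt-style scheme commonly used for 2-D resistivity inversion.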

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 365
1907 Design, Optimize the Damping System for Optical Scanning Equipment

Authors: Duy Nhat Tran, Van Tien Pham, Quang Trung Trinh, Tien Hai Tran, Van Cong Bui

Abstract:

In recent years, artificial intelligence and the Internet of Things have advanced significantly, and the collection of image data with real-time analysis and processing has become increasingly common in many aspects of life. Optical scanning devices are widely used to observe and analyze different environments, whether fixed outdoors, mounted on mobile devices, or carried by unmanned aerial vehicles. As a result, the interaction between the physical environment and these devices has become more critical in terms of safety. Two commonly used methods for addressing these challenges are active and passive approaches. Each method has its advantages and disadvantages, but combining both can lead to higher efficiency. One solution is to utilize direct-drive motors for position control, with real-time feedback within the operational range to determine appropriate control parameters with high precision. If the maximum motor torque is smaller than the inertial torque and the rotor reaches the operational limit, a spring system absorbs the impact force. Numerous experiments have been conducted to demonstrate the effectiveness of device protection during operation.

Keywords: optical device, collision safety, collision absorption, precise mechanics

Procedia PDF Downloads 63
1906 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm

Authors: Dalal N. Hammod, Ekhlas K. Gbashi

Abstract:

Today, cryptography is used in many applications to achieve high security in data transmission and real-time communications. AES has long enjoyed global acceptance and is used for securing sensitive data in various industries, but it suffers from slow processing and long data transfer times. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) modifies SubBytes and ShiftRows in the encryption part and InvSubBytes and InvShiftRows in the decryption part. After implementing and testing the proposal, the modified AES achieved good results, accomplishing communication with high performance in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method produces ciphertext with good randomness, passing the NIST statistical tests against attacks; MAES also reduced the encryption time by 10% relative to the original AES and is therefore faster. The proposed method likewise showed good results in memory utilization, with a value of 54.36 for MAES against 66.23 for the original AES. Finally, the avalanche effect, used to quantify the diffusion property, is 52.08% for the modified AES and 51.82% for the original AES.
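
A sketch of how an avalanche-effect figure like the 52.08% above is typically measured, here against the standard AES from PyCryptodome (the authors' modified SubBytes/ShiftRows are not public, so only the baseline cipher is shown):

```python
import os
from Crypto.Cipher import AES  # pip install pycryptodome

def avalanche(key, pt, bit=0):
    """% of ciphertext bits that flip when one plaintext bit flips."""
    cipher = AES.new(key, AES.MODE_ECB)
    c1 = cipher.encrypt(pt)
    flipped = bytearray(pt)
    flipped[bit // 8] ^= 1 << (bit % 8)   # flip a single plaintext bit
    c2 = cipher.encrypt(bytes(flipped))
    diff = sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))
    return 100.0 * diff / (len(c1) * 8)

key, pt = os.urandom(16), os.urandom(16)  # one 128-bit block
print(f"Avalanche effect: {avalanche(key, pt):.2f}%")  # ~50% for AES
```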

Keywords: modified AES, randomness test, encryption time, avalanche effects

Procedia PDF Downloads 248
1905 Inter-Departmental Survey to Check the Impact of Bio-Safety Training Sessions among Lab Employees

Authors: Noorulaine Maqsood, Saeed Khan

Abstract:

Background: Concern regarding incident reporting and bio-safety training in clinical laboratories in Pakistan has increased remarkably in the last few years due to the rapid increase in diagnosis of and research on infectious organisms. To ensure the safety of employees, this issue needs to be addressed immediately. Bio-safety training sessions and lectures are necessary for the protection of laboratory workers, to ensure safe practices and minimize incident reports in the lab. Objective: To carry out an inter-departmental survey in the lab regarding awareness of bio-safety practices among lab employees before and after conducting bio-safety training sessions. Methodology: We conducted a 30-question survey of laboratory workers in June 2013 (before the training session) to gather information related to bio-safety awareness, followed by another survey after the bio-safety training sessions and workshops. Result: The survey showed that before the training session, 32% of the participants were aware of the bio-safety level used in their lab, whereas after the session this percentage increased to 72%. The proportion informed about the proper use of PPE increased from 48% to 76%, and awareness regarding proper management of hazardous waste increased from 32% to 64%. Awareness of incident reporting practice, sample handling, and hand hygiene was previously 40%, 65%, and 52%, increasing to 80%, 85%, and 88%, respectively, after the training session. Conclusion: The first survey showed a lack of awareness, suggesting that nearly all senior scientists, faculty, medical technologists, lab attendants, and housekeeping staff working in laboratories require bio-safety training, inspection at least twice a year by a bio-safety officer, and periodic renewal of their bio-safety training. After the training session, significant changes in the awareness level and attitude of the participants regarding bio-safety practices were observed. Therefore, such bio-safety sessions should be carried out regularly in clinical laboratories.

Keywords: biosafety practices, clinical laboratory, Pakistan, survey

Procedia PDF Downloads 428
1904 Exploratory Analysis of A Review of Nonexistence Polarity in Native Speech

Authors: Deawan Rakin Ahamed Remal, Sinthia Chowdhury, Sharun Akter Khushbu, Sheak Rashed Haider Noori

Abstract:

Native speech-to-text synthesis has its own leverage for mankind. Speaking with different accents is common, but communication between people with two different accent types is quite difficult, and the problem is motivated by the risk of extracting the wrong perception of a language's meaning. Many automatic speech recognition systems have therefore been built to detect text. Overall, this paper presents a review of NSTTR (Native Speech Text to Text Recognition) synthesis compared with text-to-text recognition. The review has exposed many text-to-text recognition systems that are at a very early stage of coping with native speech recognition. Much of the discussion concerns the progression of chatbots and linguistic theory; another strand is the rule-based approach. In recent years, deep learning has become a dominant chapter in text-to-text learning for detecting the nature of a language. To the best of our knowledge, a huge number of people in the subcontinent speak the Bangla language but with different accents in different regions; the study therefore elaborates a discussion contrasting the achievements of existing works with the findings of future needs for Bangla-language acoustic accents.

Keywords: TTR, NSTTR, text to text recognition, deep learning, natural language processing

Procedia PDF Downloads 132
1903 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton-flavour-violating neutrinoless conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter continuously stream zero-suppressed data to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 155
1902 Grain Size Characteristics and Sediments Distribution in the Eastern Part of Lekki Lagoon

Authors: Mayowa Philips Ibitola, Abe Oluwaseun Banji, Olorunfemi Akinade-Solomon

Abstract:

A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profiles were analyzed using the dry sieving method and a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand in both the wet and dry seasons, with average mean values of 2.03 ϕ and -2.88 ϕ, respectively. Sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and -0.82 ϕ. Skewness varied from strongly coarse-skewed to near-symmetrical (-0.34 ϕ and 0.09 ϕ), and the average kurtosis values were 0.87 ϕ and -1.4 ϕ (platykurtic and leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m; the deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in deep areas compared to shallow areas during both the wet and dry seasons. The statistical parameters show that the sediments overall are sorted and were deposited under low-energy conditions over a long distance. Sediment distribution and transport in the Lekki Lagoon are controlled by a low-energy current, and the downslope configuration of the bathymetry enhances the sorting and deposition rate in the lagoon.
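
The textural statistics quoted above follow the standard Folk and Ward (1957) graphic measures, computed from percentiles of the cumulative grain-size curve in ϕ units; a sketch, with invented sieve data for illustration:

```python
import numpy as np

def folk_ward(phi, cum_pct):
    """Folk & Ward graphic measures from a cumulative curve
    (phi sizes ascending, cumulative weight % ascending)."""
    p = lambda q: np.interp(q, cum_pct, phi)
    p5, p16, p25, p50, p75, p84, p95 = map(p, (5, 16, 25, 50, 75, 84, 95))
    mean = (p16 + p50 + p84) / 3
    sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6
    skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurt = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skew, kurt

# Invented sieve results: phi sizes and cumulative weight %
phi = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
cum = np.array([2.0, 15.0, 50.0, 88.0, 99.0])
print(folk_ward(phi, cum))  # graphic mean, sorting, skewness, kurtosis
```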

Keywords: Lekki Lagoon, marine sediment, bathymetry, grain size distribution

Procedia PDF Downloads 231
1901 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks

Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul

Abstract:

Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With advances in programming paradigms, machine learning algorithms have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is a synthetic MIT-BIH Arrhythmia dataset produced with generative adversarial networks (GANs). Deep learning models, namely a ResNet-50 convolutional neural network (CNN), a 1-D CNN, and a long short-term memory (LSTM) network, were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, achieved the highest average precision of 98.93%.
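
A minimal sketch of a 1-D CNN of the kind compared in the paper, assuming 187-sample MIT-BIH-style beats and five classes; the architecture is illustrative, not the exact network the authors evaluated:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_1d_cnn(input_len=187, n_classes=5):
    """Small 1-D CNN for single-beat ECG classification."""
    return models.Sequential([
        layers.Input(shape=(input_len, 1)),
        layers.Conv1D(32, 5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, 5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

model = build_1d_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, validation_split=0.2, epochs=20)
```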

Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50

Procedia PDF Downloads 128
1900 Comparison of Various Landfill Ground Improvement Techniques for Redevelopment of Closed Landfills to Cater Transport Infrastructure

Authors: Michael D. Vinod, Hadi Khabbaz

Abstract:

Construction of infrastructure above or adjacent to landfills is becoming more common to capitalize on the limited space available within urban areas. However, development above landfills is a challenging task due to large voids, the presence of organic matter, the heterogeneous nature of waste, and the ambiguity surrounding landfill settlement prediction. Prior to construction above landfills, ground improvement techniques are employed to improve the geotechnical properties of the landfill material. Although ground improvement has little impact on long-term biodegradation and creep-related landfill settlement, a variety of techniques have shown notable short-term success, along with methods for verifying their level of effectiveness. This paper provides geotechnical and landfill engineers with a guideline for the selection of landfill ground improvement techniques and their suitability to project-specific sites. The ground improvement methods assessed and compared include concrete injected columns (CIC), dynamic compaction, rapid impact compaction (RIC), preloading, high energy impact compaction (HEIC), vibro compaction, vibro replacement, chemical stabilization, and the inclusion of geosynthetics such as geocells. For each technique, a summary of the existing theory, benefits, limitations, suitable modern monitoring methods, applicability to landfills, and supporting case studies is provided. The authors highlight the importance of implementing cost-effective monitoring techniques to allow observation and, where necessary, remediation of the subsidence effects associated with long-term landfill settlement. These ground improvement techniques are primarily intended for construction above closed landfills to cater for transport infrastructure loading.

Keywords: closed landfills, ground improvement, monitoring, settlement, transport infrastructure

Procedia PDF Downloads 224
1899 A Multivariate Statistical Approach for Water Quality Assessment of River Hindon, India

Authors: Nida Rizvi, Deeksha Katyal, Varun Joshi

Abstract:

The river Hindon is an important river catering to the demands of the highly populated rural and industrial clusters of western Uttar Pradesh, India. Its water quality is deteriorating at an alarming rate due to various industrial, municipal, and agricultural activities. The present study aimed at identifying the pollution sources and quantifying the degree to which these sources are responsible for the deteriorating water quality of the river. Various water quality parameters were assessed: pH, temperature, electrical conductivity, total dissolved solids, total hardness, calcium, chloride, nitrate, sulphate, biological oxygen demand, chemical oxygen demand, and total alkalinity. Water quality data obtained from eight study sites over one year were subjected to two multivariate techniques, namely principal component analysis and cluster analysis. Principal component analysis was applied with the aim of finding spatial variability and identifying the sources responsible for the river's water quality; three varifactors were obtained after varimax rotation of the initial principal components. Cluster analysis was carried out to classify sampling stations by similarity, grouping the eight sites into two clusters. The study reveals that anthropogenic influence (municipal, industrial, wastewater, and agricultural runoff) was the major source of river water pollution. This study thus illustrates the utility of multivariate statistical techniques for the analysis and elucidation of multifaceted data sets, the recognition of pollution sources and factors, and the understanding of temporal and spatial variations in water quality for effective river water quality management.
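
A sketch of the PCA-plus-cluster-analysis workflow on a sites × parameters matrix with scikit-learn and SciPy; the data shape (8 sites × 12 parameters, with random numbers standing in for the measured averages) is an assumption, and varimax rotation is omitted since scikit-learn does not provide it:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Assumed matrix: 8 sampling sites x 12 water-quality parameters
X = np.random.default_rng(1).random((8, 12))  # stand-in for real data

Xs = StandardScaler().fit_transform(X)   # z-score each parameter

pca = PCA(n_components=3)                # precursors of the 3 varifactors
scores = pca.fit_transform(Xs)
print("Explained variance:", pca.explained_variance_ratio_)

# Agglomerative clustering of sites (Ward linkage), cut into 2 clusters
Z = linkage(Xs, method="ward")
print("Site clusters:", fcluster(Z, t=2, criterion="maxclust"))
```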

Keywords: cluster analysis, multivariate statistical techniques, river Hindon, water quality

Procedia PDF Downloads 467
1898 Thermal Decontamination of Soils Polluted by Polychlorinated Biphenyls and Microplastics

Authors: Roya Biabani, Mentore Vaccari, Piero Ferrari

Abstract:

Accumulated microplastics (MPLs) in soil pose the risk of adsorbing polychlorinated biphenyls (PCBs) and transporting them into the food chain or into organisms. PCBs belong to a class of man-made hydrophobic organic chemicals (HOCs) that are classified as probable human carcinogens and a hazard to biota. Therefore, to take effective action and not aggravate already recognized problems, knowledge of PCB remediation in the presence of MPLs needs to be complete. Due to its high efficiency and low secondary pollution, thermal desorption (TD) has been widely used for processing a variety of pollutants, especially for removing volatile and semi-volatile organic matter from contaminated solids and sediments. This study investigates the fate of PCB compounds during thermal remediation. The PCB-contaminated soil was collected from the earthen canal downstream of the Caffaro S.p.A. chemical factory, which produced PCBs and PCB mixtures between 1930 and 1984. For MPL analysis, the MPLs were separated by density separation and oxidation of organic matter. An operational range for the key parameters of the thermal desorption process was experimentally evaluated. Moreover, the temperature-treatment characteristics of the PCB-contaminated soil under anaerobic and aerobic conditions were studied using thermogravimetric analysis (TGA).

Keywords: contaminated soils, microplastics, polychlorinated biphenyls, thermal desorption

Procedia PDF Downloads 104