Search results for: processing map
1622 Pore Pressure and In-situ Stress Magnitudes with Image Log Processing and Geological Interpretation in the Haoud Berkaoui Hydrocarbon Field, Northeastern Algerian Sahara
Authors: Rafik Baouche, Rabah Chaouchi
Abstract:
This work reports the first comprehensive stress field interpretation from eleven recently drilled wells in the Berkaoui Basin, Algerian Sahara. A cumulative length of more than 7,000 m of acoustic image logs from six vertical wells was investigated, and a mean NW-SE (128°-145° N) maximum horizontal stress (SHMax) orientation is inferred from the B-D quality wellbore breakouts. The study integrates a log-based approach with downhole measurements to infer pore pressure and in-situ stress magnitudes. Vertical stress (Sv), interpreted from the bulk-density profiles, has an average gradient of 22.36 MPa/km. The Ordovician and Cambrian reservoirs have a pore pressure gradient of 13.47-13.77 MPa/km, which is above the hydrostatic pressure regime. A minimum horizontal stress (Shmin) gradient of 17.2-18.3 MPa/km is inferred from the fracture closure pressure in the reservoirs. Breakout widths constrained the SHMax magnitude to the 23.8-26.5 MPa/km range. The subsurface stress distribution in central Saharan Algeria indicates that the present-day stress field in the Berkaoui Basin is principally strike-slip faulting (SHMax > Sv > Shmin). Inferences are drawn for the regional stress pattern and for drilling and reservoir development.
Keywords: stress, imagery, breakouts, Sahara
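As a rough illustration of how the reported gradients translate into stress magnitudes and a faulting regime, the sketch below multiplies each gradient by an assumed depth and applies Anderson's classification; the depth and the mid-range gradient values are assumptions for illustration, not results from the paper.

```python
# Minimal sketch (not the authors' code): convert stress gradients (MPa/km) to
# magnitudes at an assumed reservoir depth and classify the Anderson faulting regime.
def stress_at_depth(gradient_mpa_per_km, depth_km):
    """Stress magnitude (MPa) = gradient (MPa/km) x depth (km)."""
    return gradient_mpa_per_km * depth_km

def faulting_regime(shmax, sv, shmin):
    """Anderson classification from the ordering of the principal stresses."""
    if sv >= shmax >= shmin:
        return "normal"
    if shmax >= sv >= shmin:
        return "strike-slip"
    return "reverse"

depth = 3.5  # km, hypothetical reservoir depth for illustration only
sv = stress_at_depth(22.36, depth)    # vertical stress gradient from the abstract
shmin = stress_at_depth(17.8, depth)  # mid-range Shmin gradient (17.2-18.3 MPa/km)
shmax = stress_at_depth(25.0, depth)  # mid-range SHMax gradient (23.8-26.5 MPa/km)
print(sv, shmin, shmax, faulting_regime(shmax, sv, shmin))  # -> strike-slip
```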
Procedia PDF Downloads 75
1621 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network
Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao
Abstract:
The lack of data is a pressing problem in medical image analysis using a convolutional neural network (CNN). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms such as negative, histogram equalization, power law, sharpening, averaging, and Gaussian blurring help to generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, which is a collection of knee images from openly available online sources. The enhanced SSD network is utilized for the detection and localization of the knee joint from a given X-ray image. It is a modified version of the well-known SSD network, designed to reduce the number of prediction boxes at the output side. It consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. With spatial augmentation, the localization boundary is also comparatively more refined and closer to the ground truth, giving better detection and localization of knee joints.
Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, OpenKnee, spatial transformations
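The transforms named above are standard image operations; a minimal OpenCV sketch is shown below. The parameter values (gamma, kernel sizes) and the file name are assumptions, not the settings used in the study.

```python
# Illustrative sketch of the spatial/intensity transforms named in the abstract
# (negative, histogram equalization, power law, sharpening, averaging, Gaussian blur).
import cv2
import numpy as np

def augment(gray):
    """Return several augmented copies of a single-channel X-ray image."""
    sharpen_kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=np.float32)
    return {
        "negative": 255 - gray,
        "hist_eq": cv2.equalizeHist(gray),
        # power-law (gamma) transform: s = 255 * (r / 255) ** gamma
        "power_law": np.uint8(255.0 * (gray / 255.0) ** 0.6),
        "sharpened": cv2.filter2D(gray, -1, sharpen_kernel),
        "averaged": cv2.blur(gray, (5, 5)),
        "gaussian": cv2.GaussianBlur(gray, (5, 5), sigmaX=1.5),
    }

# img = cv2.imread("knee_xray.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name
# augmented = augment(img)
```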
Procedia PDF Downloads 154
1620 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM
Authors: Rajpal Kaur, Pooja Choudhary
Abstract:
Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based offline signature recognition system trained on low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being which can be used to authenticate human identity. The signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition. Multiple techniques have been defined for signature recognition, leaving considerable scope for research. In this paper, offline (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The offline signature verification and recognition is implemented on the MATLAB platform. This work has been analyzed and tested and found suitable for its purpose. The proposed method performs better than other recently proposed methods.
Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM
Procedia PDF Downloads 384
1619 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals
Authors: Naser Safdarian, Nader Jafarnia Dabanloo
Abstract:
In this paper, we used four features, i.e., the Q-wave integral, QRS complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, for the detection and localization of myocardial infarction (MI) in the left ventricle of the heart. In our research, we focused on the detection and localization of MI in the standard ECG. We use the Q-wave integral and T-wave integral because these features are important indicators for the detection of MI. We used pattern recognition methods such as the artificial neural network (ANN) to detect and localize MI, because these methods have good accuracy for the classification of normal and abnormal signals. We used one type of radial basis function (RBF) network called the probabilistic neural network (PNN) because of its nonlinearity property, and used other classifiers such as k-nearest neighbors (KNN), multilayer perceptron (MLP), and naive Bayes classification. We used the PhysioNet database as our training and test data. We reached over 80% accuracy on test data for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy. Classification accuracy can also be improved by adding more features to this method. A simple method based on only four features extracted from the standard ECG is presented, which has good accuracy in MI localization.
Keywords: ECG signal processing, myocardial infarction, feature extraction, pattern recognition
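The four integral features are simple areas under segments of the beat; a minimal sketch is given below, assuming the wave boundaries come from a separate delineation step. The window indices, sampling rate, and synthetic beat are illustrative, not values from the paper.

```python
# Sketch of the four integral features (Q, QRS, T, total) via the trapezoidal rule.
import numpy as np

def wave_integral(signal, fs, start, end):
    """Area under the ECG segment between two sample indices."""
    seg = signal[start:end]
    return float(np.sum((seg[1:] + seg[:-1]) / 2.0) / fs)

def extract_features(beat, fs, q_win, qrs_win, t_win):
    return {
        "q_integral": wave_integral(beat, fs, *q_win),
        "qrs_integral": wave_integral(beat, fs, *qrs_win),
        "t_integral": wave_integral(beat, fs, *t_win),
        "total_integral": wave_integral(beat, fs, 0, len(beat)),
    }

# Example with a synthetic beat; windows are placeholders, not clinical delineations.
fs = 360.0
beat = np.sin(np.linspace(0, np.pi, 300))
print(extract_features(beat, fs, q_win=(60, 90), qrs_win=(60, 150), t_win=(200, 280)))
```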
Procedia PDF Downloads 456
1618 Corpus-Based Analysis on the Translatability of Conceptual Vagueness in Traditional Chinese Medicine Classics Huang Di Nei Jing
Authors: Yan Yue
Abstract:
Huang Di Nei Jing (HDNJ) is one of the significant traditional Chinese medicine (TCM) classics which lays the foundation of TCM theory and practice. It is an important work for the world to study the ancient civilizations and medical history of China. The language in HDNJ is highly concise and vague, and notably challenging to translate. This paper investigates the translatability of one particular type of vagueness in HDNJ: the conceptual vagueness which carries Chinese philosophical and cultural connotations. The corpus tool Sketch Engine is used to provide potential online contexts and word behaviors. Two selected English translations of HDNJ, by a TCM practitioner and a non-practitioner, are used to examine the frequency and distribution of linguistic features of the translations. It was found that the hypothesis about the universals of translated language (explicitation, normalisation) holds in one translation, but at the sacrifice of some original contextual connotations. Transliteration is purposefully used in the second translation to retain the original flavor, which is argued to be a violation of the principle of relevance in communication because it yields few contextual effects and demands more processing effort from the reader. The translatability of conceptual vagueness in HDNJ is constrained by the source language context and the reader's cognitive environment.
Keywords: corpus-based translation, translatability, TCM classics, vague language
Procedia PDF Downloads 377
1617 Al2O3-Dielectric AlGaN/GaN Enhancement-Mode MOS-HEMTs by Using Ozone Water Oxidization Technique
Authors: Ching-Sung Lee, Wei-Chou Hsu, Han-Yin Liu, Hung-Hsi Huang, Si-Fu Chen, Yun-Jung Yang, Bo-Chun Chiang, Yu-Chuang Chen, Shen-Tin Yang
Abstract:
AlGaN/GaN high electron mobility transistors (HEMTs) have been intensively studied due to their intrinsic advantages of high breakdown electric field, high electron saturation velocity, and excellent chemical stability. They are also suitable for ultra-violet (UV) photodetection due to the corresponding wavelengths of the GaN bandgap. To improve the optical responsivity by decreasing the dark current caused by gate leakage problems and limited Schottky barrier heights in GaN-based HEMT devices, various metal-oxide-semiconductor HEMTs (MOS-HEMTs) have been devised by using atomic layer deposition (ALD), molecular beam epitaxy (MBE), metal-organic chemical vapor deposition (MOCVD), liquid phase deposition (LPD), and RF sputtering. The gate dielectrics include MgO, HfO2, Al2O3, La2O3, and TiO2. In order to provide complementary circuit operation, enhancement-mode (E-mode) devices have lately been studied using techniques of fluorine treatment, p-type cap layers, piezoneutralization layers, and MOS-gate structures. This work reports an Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMT design using a cost-effective ozone water oxidization technique. The present ozone oxidization method has the advantages of a low-cost processing facility, processing simplicity, compatibility with device fabrication, and room-temperature operation under atmospheric pressure. It can further reduce the gate-to-channel distance and improve the transconductance (gm) gain for a specific oxide thickness, since the formation of the Al2O3 consumes part of the AlGaN barrier at the same time. The epitaxial structure of the studied devices was grown by using the MOCVD technique. On a Si substrate, the layer structure includes a 3.9 µm C-doped GaN buffer, a 300 nm GaN channel layer, and a 5 nm Al0.25Ga0.75N barrier layer. Mesa etching was performed to provide electrical isolation by using an inductively coupled-plasma reactive ion etcher (ICP-RIE). Ti/Al/Au were thermally evaporated and annealed to form the source and drain ohmic contacts. The device was immersed in an H2O2 solution pumped with ozone gas generated by using an OW-K2 ozone generator. Ni/Au were deposited as the gate electrode to complete the MOS-HEMT fabrication. The formed Al2O3 oxide thickness is 7 nm, and the remaining AlGaN barrier thickness is 2 nm. A reference HEMT device was also fabricated on the same epitaxial structure for comparison. The gate dimensions are 1.2 × 100 µm² with a source-to-drain spacing of 5 µm for both devices. The dielectric constant (k) of Al2O3 was characterized to be 9.2 by using C-V measurement. Reduced interface state density after oxidization has been verified by the low-frequency noise spectra, Hooge coefficients, and pulsed I-V measurements. Improved device characteristics at temperatures of 300 K-450 K have been achieved for the present MOS-HEMT design. Consequently, Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMTs made by using the ozone water oxidization method are reported. In comparison with a conventional Schottky-gate HEMT, the MOS-HEMT design has demonstrated excellent enhancements of 138% (176%) in gm,max, 118% (139%) in IDS,max, 53% (62%) in BVGD, and a 3 (2)-order reduction in IG leakage at VGD = -60 V at 300 (450) K. This work is promising for millimeter-wave integrated circuit (MMIC) and three-terminal active UV photodetector applications.
Keywords: MOS-HEMT, enhancement mode, AlGaN/GaN, passivation, ozone water oxidation, gate leakage
Procedia PDF Downloads 262
1616 Use Process Ring-Opening Polymerization to Melt Processing of Cellulose Nanowhisker from Coconut Husk Fibers-Filled Polylactide-Based Nanocomposites
Authors: Imam Wierawansyah Eltara, Iftitah, Agus Ismail
Abstract:
In the present work, cellulose nanowhiskers (CNW) extracted from coconut husk fibers were incorporated into polylactide (PLA)-based composites. Prior to blending, PLA chains were chemically grafted onto the surface of the CNW to enhance the compatibility between the CNW and the hydrophobic polyester matrix. Ring-opening polymerization of L-lactide was initiated from the hydroxyl groups available at the CNW surface to yield CNW-g-PLA nanohybrids. PLA-based nanocomposites were prepared by melt blending to keep the study green by limiting the use of organic solvents. The influence of PLA-grafted cellulose nanoparticles on the mechanical and thermal properties of the ensuing nanocomposites was investigated in depth. The thermal behavior and mechanical properties of the nanocomposites were determined using differential scanning calorimetry (DSC) and dynamic mechanical thermal analysis (DMTA), respectively. It was evidenced that the chemical grafting of the CNW enhances their compatibility with the polymeric matrix and thus improves the final properties of the nanocomposites. A large modification of the crystalline properties, such as the crystallization half-time, was evidenced depending on the nature of the PLA matrix and the content of nanofillers.
Keywords: cellulose nanowhiskers, nanocomposites, coconut husk fiber, ring opening polymerization
Procedia PDF Downloads 317
1615 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques
Authors: Ved Kulkarni, Karthik Kini
Abstract:
This experiment aims to understand how the media affects the power markets in the mainland United States and to study the duration of the reaction time between news updates and actual price movements. We have taken into account electric utility companies trading on the NYSE and excluded companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to keywords, which are predefined and stored for each specific company. Based on this, the classifier will allocate the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the respective price movement will be studied to understand the response time. Based on the response time observed, neural networks would be trained to understand and react to changing market conditions, achieving the best strategy in every market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is to create an AI-based system that adjusts trading strategies within the market response time to each price movement.
Keywords: data mining, language processing, artificial neural networks, sentiment analysis
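A toy sketch of the five-way labelling step is shown below; the keyword lists and score thresholds are invented for illustration and do not reproduce the authors' classifier.

```python
# Hypothetical keyword-scoring labeller for news headlines about a utility company.
POSITIVE = {"beats expectations", "upgrade", "record profit"}
NEGATIVE = {"lawsuit", "downgrade", "outage"}

def label_headline(text):
    t = text.lower()
    pos = sum(kw in t for kw in POSITIVE)
    neg = sum(kw in t for kw in NEGATIVE)
    score = pos - neg
    if score >= 2:
        return "highly optimistic"
    if score == 1:
        return "positive"
    if score == 0:
        return "neutral"
    if score == -1:
        return "negative"
    return "highly negative"

print(label_headline("Utility X faces lawsuit after grid outage"))  # -> highly negative
```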
Procedia PDF Downloads 17
1614 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images
Authors: Shahriar Farzam, Maryam Rastgarpour
Abstract:
Image denoising plays an extremely important role in digital image processing. Curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a method for image contrast enhancement for cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) that works through the unequally spaced fast Fourier transform (USFFT). This transform returns a table of curvelet transform coefficients indexed by a scale parameter, an orientation, and a spatial location. Accordingly, the coefficients obtained from FDCT-USFFT can be modified in order to enhance contrast in an image. Our proposed method first applies a two-dimensional mathematical transform, namely the FDCT via the unequally spaced fast Fourier transform, to the input image and then applies thresholding to the curvelet coefficients to enhance the CBCT images. Consequently, applying the unequally spaced fast Fourier transform leads to an accurate reconstruction of the image with high resolution. The experimental results indicate that the performance of the proposed method is superior to the existing ones in terms of Peak Signal to Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).
Keywords: curvelet transform, CBCT, image enhancement, image denoising
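For reference, a minimal sketch of the PSNR metric and of a generic hard-thresholding step on transform coefficients is given below; it is not tied to a specific curvelet library, and the helper names fdct/inverse_fdct in the usage comment are placeholders.

```python
# PSNR evaluation and generic coefficient hard-thresholding (illustrative only).
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak Signal to Noise Ratio between a reference image and a processed image."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def hard_threshold(coeffs, thresh):
    """Zero out transform coefficients whose magnitude falls below `thresh`."""
    out = coeffs.copy()
    out[np.abs(out) < thresh] = 0.0
    return out

# Usage idea: enhanced = inverse_fdct(hard_threshold(fdct(image), t)); psnr(image, enhanced)
# where fdct/inverse_fdct stand in for a curvelet implementation (e.g., via USFFT).
```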
Procedia PDF Downloads 300
1613 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study
Authors: Ana Serafimovic, Karthik Devarajan
Abstract:
Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations of the data. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of the Kullback-Leibler divergence known as intrinsic information and assumes that the noise is signal-dependent and that it originates from an arbitrary distribution from the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods which also aim to minimize symmetric divergence measures.
Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence
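For context, the sketch below shows the standard multiplicative updates for NMF under the one-sided generalized Kullback-Leibler divergence; the paper's hybrid algorithm targets the symmetric divergence and modifies these rules, so this is only the baseline form, with the rank, iteration count, and data invented for illustration.

```python
# Baseline Lee-Seung multiplicative updates for NMF under generalized KL divergence.
import numpy as np

def nmf_gkl(V, rank, n_iter=200, eps=1e-10, seed=0):
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)   # update H
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)   # update W
    return W, H

V = np.abs(np.random.default_rng(1).random((50, 40)))
W, H = nmf_gkl(V, rank=5)
print(np.linalg.norm(V - W @ H))  # reconstruction error decreases with iterations
```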
Procedia PDF Downloads 246
1612 Enhance Biogas Production by Enzymatic Pre-Treatment from Palm Oil Mill Effluent (POME)
Authors: M. S. Tajul Islam, Md. Zahangir Alam
Abstract:
To enhance biogas production through anaerobic digestion, the application of various types of pre-treatment methods has some limitations in terms of sustainable environmental management. Many studies on pretreatments, especially chemical and physical processes, have been carried out to evaluate anaerobic digestion for enhanced biogas production. Among the pretreatment methods, acid and alkali pre-treatments have gained the highest importance. Previous studies have shown that although acid and alkali pretreatments have a significant effect on the degradation of biomass, these methods have some negative impact on the environment due to their hazardous nature, while enzymatic pre-treatment is environmentally friendly. One of the constraints on the use of enzymes in the pretreatment process for biogas production is their high cost, which is currently being addressed by reducing cost through fermentation of waste-based media. As such, palm oil mill effluent (POME), an abundant resource generated during palm oil processing at the mill, is being used as a potential fermentation medium for enzyme production. This low-cost enzyme could be an alternative for the biogas pretreatment process. This review focuses on the direct application of enzymes as an enzymatic pre-treatment of POME for enhanced biogas production.
Keywords: POME, enzymatic pre-treatment, biogas, lignocellulosic biomass, anaerobic digestion
Procedia PDF Downloads 550
1611 Increasing Redness and Microbial Stability of Low Nitrite Chicken Sausage by Encapsulated Tomato Pomace Extract
Authors: Bung-Orn Hemung, Nachayut Chanshotigul, Koo Bok Chin
Abstract:
Tomato pomace (TP) is a waste product from tomato processing plants, and its utilization as a food ingredient may support a more sustainable industry by reducing waste. TP was extracted with ethanol using a microwave-assisted method at 180 W for 90 s. The ethanol was evaporated off, and the extract was encapsulated with maltodextrin (1:10) by spray drying to obtain an encapsulated TP extract (ETPE). The redness (a value) of the ETPE powder was 6.5±0.05, and it was used as a natural ingredient in low-nitrite chicken sausage. Chicken emulsion sausage prepared at 25 mg/kg of nitrite served as the control. The effect of ETPE (1.0%) was evaluated along with a reference (150 mg/kg of nitrite without ETPE). The redness (a value) of the sausage with ETPE was 6.8±0.03, which was higher than those of the reference and control, which were 4.8±0.022 and 5.1±0.15, respectively. However, hardness, expressible moisture content, and cooking yield values were reduced slightly. During storage at 10 °C in air-packed conditions for 1 week, changes in color, pH, redness, and thiobarbituric acid reactive substances value were not significantly different. However, the total microbial count of sausage samples with ETPE was lower than the control by 1 log cycle, suggesting microbial stability. Therefore, the addition of ETPE could be an alternative strategy to utilize TP as a natural colorant and antimicrobial agent to extend the shelf life of low-nitrite chicken sausage.
Keywords: antimicrobial ingredient, chicken sausage, ethanolic extract, low-nitrite sausage, tomato pomace
Procedia PDF Downloads 208
1610 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency
Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu
Abstract:
In this paper, we propose an adaptive method to resolve ambiguities and a ghost target removal process to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, which is implemented on a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. Here we consider the scenario of multiple-target environments. The ghost target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghost detections and thereby enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results on a ground-based pulsed Doppler radar model will be presented to show the effectiveness of the proposed approach.
Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal
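To illustrate the coincidence idea in its simplest, range-only form, the sketch below unfolds each PRF's ambiguous range measurement and searches for a value that nearly coincides across the PRF set; the PRF values, tolerance, and search span are assumptions, and the 2D range-velocity version used in the paper extends the same principle.

```python
# Range-only coincidence sketch for a medium-PRF schedule (illustrative values).
import itertools

C = 3e8  # speed of light, m/s

def unambiguous_range(prf_hz):
    return C / (2.0 * prf_hz)

def resolve_range(measured, prfs, max_range=50e3, tol=150.0):
    """measured[i] is the folded range (m) seen with prfs[i]; return coincident ranges.
    max_range should not exceed the combined unambiguous range of the PRF set."""
    candidates = []
    for r_meas, prf in zip(measured, prfs):
        r_unamb = unambiguous_range(prf)
        folds = int(max_range // r_unamb) + 1
        candidates.append([r_meas + k * r_unamb for k in range(folds)])
    hits = []
    for combo in itertools.product(*candidates):
        if max(combo) - min(combo) <= tol:          # near-coincidence across all PRFs
            hits.append(sum(combo) / len(combo))
    return hits

prfs = [12e3, 15e3, 18e3]                           # assumed medium-PRF schedule
true_range = 47_500.0
measured = [true_range % unambiguous_range(p) for p in prfs]
print(resolve_range(measured, prfs))                # ~47,500 m recovered
```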
Procedia PDF Downloads 151
1609 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveying. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data based on a compilation of academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method has also been presented.
Keywords: inversion, limitations, optimization, resistivity
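As a reminder of what the linearized least-squares update looks like, the sketch below implements one damped (Marquardt-style) model update, m_new = m + (JᵀJ + λI)⁻¹ Jᵀ (d_obs − f(m)), against a stand-in linear forward problem; the damping value and the forward model are placeholders, not a resistivity code.

```python
# Generic damped least-squares (Gauss-Newton/Marquardt-style) model update.
import numpy as np

def damped_least_squares_step(m, d_obs, forward, jacobian, lam=0.1):
    J = jacobian(m)                       # sensitivity (Jacobian) matrix
    residual = d_obs - forward(m)         # data misfit
    normal = J.T @ J + lam * np.eye(m.size)
    return m + np.linalg.solve(normal, J.T @ residual)

# Stand-in linear forward problem, for demonstration only.
G = np.array([[1.0, 0.5], [0.2, 1.5], [0.7, 0.7]])
forward = lambda m: G @ m
jacobian = lambda m: G
m_true = np.array([2.0, -1.0])
m = np.zeros(2)
for _ in range(10):
    m = damped_least_squares_step(m, forward(m_true), forward, jacobian)
print(m)  # converges toward m_true as iterations proceed
```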
Procedia PDF Downloads 365
1608 Formulation of Mortars with Marine Sediments
Authors: Nor-Edine Abriak, Mouhamadou Amar, Mahfoud Benzerzour
Abstract:
The transition to a more sustainable economy is driven by a reduction in the consumption of raw materials for equivalent production. The recovery of byproducts, and especially of dredged sediment, as a mineral addition in cement matrices represents an alternative to reduce raw material consumption and the construction sector's carbon footprint. However, the efficient use of sediment requires adequate and optimal treatment. Several processing techniques have so far been applied in order to improve some physicochemical properties. Heat treatment by calcination was effective in removing the organic fraction and activating the pozzolanic properties. In this article, the effects of an optimized heat treatment of marine sediments on the physico-mechanical and environmental properties of mortars are shown. A key finding is that the optimal substitution of a portion of cement by sediments treated by calcination at 750 °C helps to maintain or improve the mechanical properties of the cement matrix in comparison with a standard reference mortar. The use of calcined sediment enhances mortar behavior in terms of mechanical strength and durability. From an environmental and life-cycle point of view, mortars formulated with treated sediments are considered inert with respect to the French inert waste storage facility reference (ISDI).
Keywords: sediment, calcination, cement, reuse
Procedia PDF Downloads 180
1607 Design, Optimize the Damping System for Optical Scanning Equipment
Authors: Duy Nhat Tran, Van Tien Pham, Quang Trung Trinh, Tien Hai Tran, Van Cong Bui
Abstract:
In recent years, artificial intelligence and the Internet of Things have experienced significant advancements. Collecting image data and real-time analysis and processing of tasks have become increasingly popular in various aspects of life. Optical scanning devices are widely used to observe and analyze different environments, whether fixed outdoors, mounted on mobile devices, or used in unmanned aerial vehicles. As a result, the interaction between the physical environment and these devices has become more critical in terms of safety. Two commonly used methods for addressing these challenges are active and passive approaches. Each method has its advantages and disadvantages, but combining both methods can lead to higher efficiency. One solution is to utilize direct-drive motors for position control and real-time feedback within the operational range to determine appropriate control parameters with high precision. If the maximum motor torque is smaller than the inertial torque and the rotor reaches the operational limit, the spring system absorbs the impact force. Numerous experiments have been conducted to demonstrate the effectiveness of device protection during operation.
Keywords: optical device, collision safety, collision absorption, precise mechanics
Procedia PDF Downloads 63
1606 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm
Authors: Dalal N. Hammod, Ekhlas K. Gbashi
Abstract:
Today, cryptography is used in many applications to achieve high security in data transmission and in real-time communications. AES has long gained global acceptance and is used for securing sensitive data in various industries, but it suffers from slow processing and takes a long time to transfer data. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) is based on modifying the SubBytes and ShiftRows steps in the encryption part and modifying the InvSubBytes and InvShiftRows steps in the decryption part. After implementing the proposal and testing the results, the modified AES achieved good results in accomplishing the communication with high performance criteria in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method has good ciphertext randomness because it passed the NIST statistical tests against attacks; MAES also reduced the encryption time by 10% compared with the original AES, so the modified AES is faster than the original AES. The proposed method also showed good results in memory utilization, where the value is 54.36 for MAES versus 66.23 for the original AES. Finally, the avalanche effect used for calculating the diffusion property is 52.08% for the modified AES and 51.82% for the original AES.
Keywords: modified AES, randomness test, encryption time, avalanche effects
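The avalanche-effect percentage quoted above can be measured by flipping a single input bit and counting how many ciphertext bits change; a minimal sketch against the standard AES implementation in the Python `cryptography` package is shown below (it does not implement the proposed MAES, and ECB mode is used only for a raw single-block comparison).

```python
# Avalanche effect: fraction of ciphertext bits that change when one plaintext bit flips.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
import os

def encrypt_block(key, block16):
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block16) + enc.finalize()

def avalanche_percent(key, plaintext16):
    flipped = bytes([plaintext16[0] ^ 0x01]) + plaintext16[1:]   # flip one input bit
    c1, c2 = encrypt_block(key, plaintext16), encrypt_block(key, flipped)
    changed = sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))
    return 100.0 * changed / (len(c1) * 8)

key, pt = os.urandom(16), os.urandom(16)
print(f"{avalanche_percent(key, pt):.2f}% of ciphertext bits changed")  # ~50% expected
```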
Procedia PDF Downloads 248
1605 Exploratory Analysis of A Review of Nonexistence Polarity in Native Speech
Authors: Deawan Rakin Ahamed Remal, Sinthia Chowdhury, Sharun Akter Khushbu, Sheak Rashed Haider Noori
Abstract:
Native speech-to-text synthesis has its own value for humankind. Speaking in different accents is common, but communication between people with two different accent types can be quite difficult. This problem is motivated by the extraction of wrong perceptions of language meaning. Thus, many existing automatic speech recognition systems have been deployed to detect text. Overall, this paper presents a review of NSTTR (Native Speech Text to Text Recognition) synthesis compared with text-to-text recognition. The review has exposed many text-to-text recognition systems that are at a very early stage of complying with native speech recognition. Many discussions have started about the progression of chatbots and linguistic theory; another approach is rule-based. In recent years, deep learning has become a dominant approach for text-to-text learning to detect the nature of a language. To the best of our knowledge, a huge number of people in the subcontinent speak the Bangla language, but they have different accents in different regions; therefore, the study elaborates a discussion contrasting the achievements of existing works with the findings of future needs for Bangla-language acoustic accents.
Keywords: TTR, NSTTR, text to text recognition, deep learning, natural language processing
Procedia PDF Downloads 132
1604 The Trigger-DAQ System in the Mu2e Experiment
Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella
Abstract:
The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability while providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus to a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).
Keywords: trigger, DAQ, Mu2e, Fermilab
Procedia PDF Downloads 155
1603 Grain Size Characteristics and Sediments Distribution in the Eastern Part of Lekki Lagoon
Authors: Mayowa Philips Ibitola, Abe Oluwaseun Banji, Olorunfemi Akinade-Solomon
Abstract:
A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profiles were analyzed using the dry sieving method and a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand for both the wet and dry seasons, with average mean values of 2.03 ϕ and -2.88 ϕ, respectively. Sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and -0.82 ϕ. Skewness varied from strongly coarse-skewed to near-symmetrical (-0.34 ϕ and 0.09 ϕ). The average kurtosis values were 0.87 ϕ and -1.4 ϕ (platykurtic and leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m. The deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in deep areas compared to the shallow areas during the wet and dry seasons. The statistical parameters show that the sediments are overall sorted and deposited under low-energy conditions over a long distance. The sediment distribution and sediment transport pattern of the Lekki Lagoon are controlled by a low-energy current, and the down-slope configuration of the bathymetry enhances the sorting and the deposition rate in the Lekki Lagoon.
Keywords: Lekki Lagoon, marine sediment, bathymetry, grain size distribution
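The graphic statistics reported above (mean, inclusive standard deviation, skewness, kurtosis) are conventionally computed from percentile phi values with the Folk and Ward formulas; a sketch is shown below with made-up percentile values, not the Lekki Lagoon data.

```python
# Folk and Ward (1957) graphic statistics from phi percentiles of a cumulative sieve curve.
def folk_ward(phi):
    """phi: dict mapping percentile -> phi value (needs 5, 16, 25, 50, 75, 84, 95)."""
    mean = (phi[16] + phi[50] + phi[84]) / 3.0
    sorting = (phi[84] - phi[16]) / 4.0 + (phi[95] - phi[5]) / 6.6
    skewness = ((phi[16] + phi[84] - 2 * phi[50]) / (2 * (phi[84] - phi[16]))
                + (phi[5] + phi[95] - 2 * phi[50]) / (2 * (phi[95] - phi[5])))
    kurtosis = (phi[95] - phi[5]) / (2.44 * (phi[75] - phi[25]))
    return {"mean": mean, "sorting": sorting, "skewness": skewness, "kurtosis": kurtosis}

example = {5: 1.1, 16: 1.5, 25: 1.7, 50: 2.0, 75: 2.4, 84: 2.6, 95: 3.0}
print(folk_ward(example))  # fine sand, moderately well sorted in this made-up case
```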
Procedia PDF Downloads 231
1602 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks
Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul
Abstract:
Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With the advancement of programming paradigms, machine learning algorithms have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-BIH Arrhythmia dataset produced from generative adversarial networks (GANs). Various deep learning models such as the ResNet-50 convolutional neural network (CNN), a 1-D CNN, and long short-term memory (LSTM) were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, was found to have the highest average precision of 98.93%.
Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50
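A hedged sketch of a small 1-D CNN of the kind compared above is given below in Keras; the beat length, filter counts, and layer depth are assumptions rather than the authors' exact architecture.

```python
# Minimal 1-D CNN for five-class heartbeat classification (illustrative architecture).
import tensorflow as tf

def build_1d_cnn(beat_length=187, n_classes=5):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(beat_length, 1)),          # one beat per sample
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

model = build_1d_cnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(X_train, y_train, ...) with beats shaped (n_beats, 187, 1)
```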
Procedia PDF Downloads 128
1601 Thermal Decontamination of Soils Polluted by Polychlorinated Biphenyls and Microplastics
Authors: Roya Biabani, Mentore Vaccari, Piero Ferrari
Abstract:
Microplastics (MPLs) accumulated in soil pose the risk of adsorbing polychlorinated biphenyls (PCBs) and transporting them into the food chain or the body. PCBs belong to a class of man-made hydrophobic organic chemicals (HOCs) that are classified as probable human carcinogens and a hazard to biota. Therefore, to take effective action and not aggravate the already recognized problems, knowledge of PCB remediation in the presence of MPLs needs to be complete. Due to its high efficiency and low production of secondary pollution, thermal desorption (TD) has been widely used for processing a variety of pollutants, especially for removing volatile and semi-volatile organic matter from contaminated solids and sediments. This study investigates the fate of PCB compounds during thermal remediation. For this, PCB-contaminated soil was collected from the earth canal downstream of the Caffaro S.p.A. chemical factory, which produced PCBs and PCB mixtures between 1930 and 1984. For MPL analysis, MPLs were separated by density separation and oxidation of organic matter. An operational range for the key parameters of the thermal desorption process was experimentally evaluated. Moreover, the temperature treatment characteristics of the PCB-contaminated soil under anaerobic and aerobic conditions were studied using thermogravimetric analysis (TGA).
Keywords: contaminated soils, microplastics, polychlorinated biphenyls, thermal desorption
Procedia PDF Downloads 104
1600 Optimization of a Four-Lobed Swirl Pipe for Clean-In-Place Procedures
Authors: Guozhen Li, Philip Hall, Nick Miles, Tao Wu
Abstract:
This paper presents a numerical investigation of two horizontally mounted four-lobed swirl pipes in terms of their effectiveness at inducing swirl in flows passing through them. The swirl flows induced by the two swirl pipes have the potential to improve the efficiency of Clean-In-Place procedures in a closed processing system by local intensification of the hydrodynamic impact on the internal pipe surface. Pressure losses, swirl development within the two swirl pipes, swirl induction effectiveness, swirl decay, and wall shear stress variation downstream of the two swirl pipes are analyzed and compared. It was found that a shorter swirl-inducing pipe used in conjunction with transition pipes is more effective in swirl induction than a longer one, in that it imposes less constraint on the induced swirl and results in slightly higher swirl intensity just downstream of it while incurring a smaller pressure loss. The wall shear stress downstream of the shorter swirl pipe is also slightly larger than that downstream of the longer swirl pipe due to the slightly higher swirl intensity induced by the shorter swirl pipe. The advantage of the shorter swirl pipe in terms of swirl induction is more significant in flows with a larger Reynolds number.
Keywords: swirl pipe, swirl effectiveness, CFD, wall shear stress, swirl intensity
Procedia PDF Downloads 606
1599 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, having the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows the interception of car events caused by a driver, positioning them in time and space. The device's connection to the vehicle allows the creation of a source of data whose analysis can create psychological, behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we aimed to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.
Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 91
1598 Tool Wear Analysis in 3D Manufactured Ti6Al4V
Authors: David Downey
Abstract:
With the introduction of additive manufacturing (3D printing) to produce titanium (Ti6Al4V) components in the medical/aerospace and automotive industries, intricate geometries can be produced with virtually complete design freedom. However, the consideration of microstructural anisotropy resulting from the additive manufacturing process becomes necessary due to this design flexibility and the need to print a geometric shape that can consist of numerous angles, radii, and swept surfaces. A femoral knee implant serves as an example of a 3D-printed near-net-shaped product. The mechanical properties of the printed components, and consequently, their machinability, are affected by microstructural anisotropy. Currently, finish-machining operations performed on titanium printed parts using selective laser melting (SLM) utilize the same cutting tools employed for processing wrought titanium components. Cutting forces for components manufactured through SLM can be up to 70% higher than those for their wrought counterparts made of Ti6Al4V. Moreover, temperatures at the cutting interface of 3D printed material can surpass those of wrought titanium, leading to significant tool wear. Although the criteria for tool wear may be similar for both 3D printed and wrought materials, the rate of wear during the machining process may differ. The impact of these issues on the choice of cutting tool material and tool lifetimes will be discussed.
Keywords: additive manufacturing, build orientation, microstructural anisotropy, printed titanium Ti6Al4V, tool wear
Procedia PDF Downloads 91
1597 Bioremediation of Sea Food Waste in Solid State Fermentation along with Production of Bioactive Agents
Authors: Rahul Warmoota, Aditya Bhardwaj, Steffy Angural, Monika Rana, Sunena Jassal, Neena Puri, Naveen Gupta
Abstract:
Seafood processing generates large volumes of waste products such as skin, heads, tails, shells, scales, backbones, etc. Pollution due to conventional methods of seafood waste disposal has negative implications for the environment, aquatic life, and human health. Moreover, these waste products can be used for the production of high-value products, a potential which is still untapped due to inappropriate management. Paenibacillus sp. AD is known to act on chitinous and proteinaceous waste and was explored for its potential to degrade various types of seafood waste in solid-state fermentation. Effective degradation of seafood waste generated from a variety of sources, such as fish scales, crab shells, prawn shells, and mixtures of such wastes, was observed. Degradation of 30 to 40 percent, in terms of the decrease in mass, was achieved. Along with the degradation, chitinolytic and proteolytic enzymes were produced, which can have various biotechnological applications. Apart from this, value-added products such as chitin oligosaccharides and peptides of various degrees of polymerization were also produced, which can be used for various therapeutic purposes. The results indicate that Paenibacillus sp. AD can be used for the development of a process for the in-field degradation of seafood waste.
Keywords: chitin, chitin-oligosaccharides, chitinase, protease, biodegradation, crab shells, prawn shells, fish scales
Procedia PDF Downloads 98
1596 Design and Experimental Studies of a Centrifugal SWIRL Atomizer
Authors: Hemabushan K., Manikandan
Abstract:
In a swirl atomizer, fluid undergoes a swirling motion as a result of the centrifugal force created by opposed tangential inlets in the swirl chamber. The angular momentum of the fluid continually increases as it reaches the exit orifice and forms a hollow sheet, which disintegrates into ligaments and then droplets as it flows downstream. This type of atomizer is used in rocket injectors and oil burner furnaces. In the present investigation, a swirl atomizer with two opposed tangential inlets has been designed. With water as the working fluid, experiments were conducted for fluid injection pressures in the range of 0.033 bar to 0.519 bar. The fluid was pressurized by a 0.5 hp pump and regulated by a pressure regulator valve. The injection pressure of the fluid was measured with a U-tube mercury manometer. The spray pattern and the droplets were captured with a high-resolution camera against a black background, with a high-intensity flash highlighting the fluid. The raw images were processed in the ImageJ software to measure the droplet diameters and their shape characteristics along the downstream direction. Parameters such as mean droplet diameter and distribution, wave pattern, rupture distance, and spray angle were studied for this atomizer. These results were compared with theoretical results and also analysed for deviation from the design parameters.
Keywords: swirl atomizer, injector, spray, SWIRL
Procedia PDF Downloads 490
1595 Assessment of the Potential of Fuel-derived Rice Husk Ash as Pozzolanic Material
Authors: Jesha Faye T. Librea, Leslie Joy L. Diaz
Abstract:
Fuel-derived rice husk ash (fRHA) is a waste material from industries employing rice husk as a biomass fuel, which, on the downside, causes disposal and environmental problems. To mitigate this, fRHA was evaluated for use in other applications, such as a pozzolanic material for the construction industry. In this study, the assessment of the potential of fRHA as a pozzolanic supplementary cementitious material was conducted by determining the chemical and physical properties of fRHA according to ASTM C618, evaluating the fineness of the material according to ASTM C430, and determining its pozzolanic activity using the Luxan method. The material was found to have a high amorphous silica content of around 95.82% with traces of alkaline and carbon impurities. The retained carbon residue is 7.18%, which is within the limit of the specifications for natural pozzolans indicated in ASTM C618. The fineness of the fRHA is 88.88% retained on a 45-micron sieve, which exceeds the limit of 34%. This coarse particle size distribution was found to affect the pozzolanic activity of the fRHA. This was shown in the Luxan test, where the fRHA was identified as a non-pozzolan due to its low pozzolanic activity index of 0.262. Thus, further processing of the fRHA is required to meet the ASTM fineness requirement, achieve a higher pozzolanic activity index, and fully qualify as a pozzolanic material.
Keywords: rice husk ash, pozzolanic, fuel-derived ash, supplementary cementitious material
Procedia PDF Downloads 66
1594 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification
Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh
Abstract:
Learning from very big datasets is a significant problem for most present data mining and machine learning algorithms. MicroRNA (miRNA) data are among the important large genomic and non-coding datasets representing genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers. The number of features relative to the number of samples is high, and the data suffer from being imbalanced. A feature selection method has been used to select features with a greater ability to distinguish classes and to eliminate obscuring features. Afterward, a convolutional neural network (CNN) classifier for the classification of cancer types is utilized, which employs a genetic algorithm to find optimized CNN hyper-parameters. In order to make the classification process of the CNN faster, a graphics processing unit (GPU) is recommended for computing the mathematical operations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
Keywords: cancer classification, feature selection, deep learning, genetic algorithm
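The feature-selection step can take many forms; the sketch below uses a simple univariate ANOVA F-test filter from scikit-learn as one plausible choice. The scoring function, the k value, and the random data are assumptions, and the downstream CNN and genetic-algorithm tuning are not reproduced here.

```python
# Univariate filter-style feature selection on a miRNA-shaped matrix (illustrative).
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.random((200, 1046))         # samples x miRNA biomarkers (feature count from the abstract)
y = rng.integers(0, 29, size=200)   # 29 tumour types

selector = SelectKBest(score_func=f_classif, k=200)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)              # (200, 200): features with the most class-discriminating power
```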
Procedia PDF Downloads 111
1593 Colour Quick Response Code with High Damage Resistance Capability
Authors: Minh Nguyen
Abstract:
Today, QR or Quick Response Codes are prevalent, and mobile/smart devices can efficiently read and understand them. Therefore, we see them in many areas, such as storing web pages/websites, business phone numbers, redirects to an app download, business locations, and social media. The popularity of the QR Code is mainly because of its many advantages: it can hold a good amount of information, it is small, it is easy to scan and read with a general RGB camera, and it can still work with some damage on its surface. However, there are still some issues. For instance, some areas need to be kept untouched for successful decoding (e.g., the “Finder Patterns,” the “Quiet Zone,” etc.), the built-in error-correction capability is not robust enough, and it is not flexible enough for many applications such as Augmented Reality (AR). We propose a new Colour Quick Response Code that has several advantages over the original ones: (1) there is no untouchable area, (2) it allows up to 40% of the entire code area to be damaged, (3) it is more beneficial for Augmented Reality applications, and (4) it is backward-compatible and readable by available QR Code scanners such as Pyzbar. From our experience, our Colour Quick Response Code is significantly more tolerant of damage than the original QR Code. Our code is believed to be suitable in situations where standard 2D barcodes fail to work, such as on curved and shiny surfaces, for instance, medical blood test sample tubes and syringes.
Keywords: QR code, computer vision, image processing, 2D barcode
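The backward-compatibility claim can be checked with an off-the-shelf reader; a minimal sketch using Pyzbar, the scanner mentioned in the abstract, is shown below. The image file name is a placeholder.

```python
# Reading a (colour) QR code image with Pyzbar to check backward compatibility.
from PIL import Image
from pyzbar.pyzbar import decode

results = decode(Image.open("colour_qr_sample.png"))  # hypothetical image file
for symbol in results:
    print(symbol.type, symbol.data.decode("utf-8"))   # e.g. QRCODE and its payload
```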
Procedia PDF Downloads 118