Search results for: graphics processing units
1802 Multi-Modal Feature Fusion Network for Speaker Recognition Task
Authors: Xiang Shijie, Zhou Dong, Tian Dan
Abstract:
Speaker recognition is a crucial task in the field of speech processing, aimed at identifying individuals based on their vocal characteristics. However, existing speaker recognition methods face numerous challenges. Traditional methods primarily rely on audio signals, which often suffer from limitations in noisy environments, variations in speaking style, and insufficient sample sizes. Additionally, relying solely on audio features can sometimes fail to capture the unique identity of the speaker comprehensively, impacting recognition accuracy. To address these issues, we propose a multi-modal network architecture that simultaneously processes both audio and text signals. By gradually integrating audio and text features, we leverage the strengths of both modalities to enhance the robustness and accuracy of speaker recognition. Our experiments demonstrate significant improvements with this multi-modal approach, particularly in complex environments, where recognition performance has been notably enhanced. Our research not only highlights the limitations of current speaker recognition methods but also showcases the effectiveness of multi-modal fusion techniques in overcoming these limitations, providing valuable insights for future research.
Keywords: feature fusion, memory network, multimodal input, speaker recognition
Procedia PDF Downloads 32
1801 Solving Process Planning, Weighted Apparent Tardiness Cost Dispatching, and Weighted Processing plus Weight Due-Date Assignment Simultaneously Using a Hybrid Search
Authors: Halil Ibrahim Demir, Caner Erden, Abdullah Hulusi Kokcam, Mumtaz Ipek
Abstract:
Process planning, scheduling, and due date assignment are three important manufacturing functions that are usually studied independently in the literature. There are hundreds of works on the IPPS and SWDDA problems but only a few on the IPPSDDA problem. Integrating these three functions is crucial because they are strongly interrelated. Since the scheduling problem alone is in the NP-hard class, the integrated problem is even harder to solve. This study focuses on the integration of these functions. The sum of weighted tardiness, earliness, and due-date-related costs is used as a penalty function. Random search and hybrid metaheuristics are used to solve the integrated problem. Marginal improvement in random search is very high in the early iterations and drops sharply in later iterations; at that point, directed search contributes more to the marginal improvement than random search does. In this study, random and genetic search methods are therefore combined to find better solutions. Results show that overall performance improves as the integration level increases.
Keywords: process planning, genetic algorithm, hybrid search, random search, weighted due-date assignment, weighted scheduling
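The two-phase search the abstract describes (broad random sampling first, then directed improvement once the random gains flatten) can be sketched on a toy single-machine weighted-tardiness instance. The objective, the instance data, and the swap-based improvement step are illustrative assumptions, not the authors' actual IPPSDDA formulation:

```python
import random

def weighted_tardiness(seq, proc, due, w):
    """Total weighted tardiness of a job sequence on a single machine."""
    t, cost = 0, 0
    for j in seq:
        t += proc[j]
        cost += w[j] * max(0, t - due[j])
    return cost

def hybrid_search(proc, due, w, iters=200, seed=0):
    """Phase 1: pure random search (large marginal gains early on).
    Phase 2: directed improvement by pairwise swaps (mutation + selection)."""
    rng = random.Random(seed)
    jobs = list(range(len(proc)))
    best = jobs[:]
    best_cost = weighted_tardiness(best, proc, due, w)
    for _ in range(iters):                       # phase 1
        cand = jobs[:]
        rng.shuffle(cand)
        c = weighted_tardiness(cand, proc, due, w)
        if c < best_cost:
            best, best_cost = cand, c
    improved = True
    while improved:                              # phase 2
        improved = False
        for i in range(len(jobs) - 1):
            for k in range(i + 1, len(jobs)):
                cand = best[:]
                cand[i], cand[k] = cand[k], cand[i]
                c = weighted_tardiness(cand, proc, due, w)
                if c < best_cost:
                    best, best_cost, improved = cand, c, True
    return best, best_cost
```

On a three-job instance the directed phase reliably reaches the optimum even when the random phase has not.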
Procedia PDF Downloads 362
1800 Electrochemical Treatment and Chemical Analyses of Tannery Wastewater Using Sacrificial Aluminum Electrode, Ethiopia
Authors: Dessie Tibebe, Muluken Asmare, Marye Mulugeta, Yezbie Kassa, Zerubabel Moges, Dereje Yenealem, Tarekegn Fentie, Agmas Amare
Abstract:
The performance of electrocoagulation (EC) with aluminium electrodes for the treatment of chromium-containing effluent in a fixed-bed electrochemical batch reactor was studied. In the present work, the efficiency of EC in removing physicochemical parameters and heavy metals from real industrial tannery wastewater in the Amhara region, collected from Bahir Dar, Debre Brihan, and Haik, was evaluated. The treated and untreated samples were analyzed by AAS and ICP-OES spectrophotometers. The results indicated that the selected heavy metals were removed in all experiments with high removal percentages. The optimal conditions, regarding both cost and electrocoagulation efficiency, were initial pH = 3, initial concentration = 40 mg/L, electrolysis time = 30 min, current density = 40 mA/cm², and temperature = 25 °C. The maximum removal percentages of the selected metals were 84.42% for Haik, 92.64% for Bahir Dar, and 94.90% for Debre Brihan. The sacrificial electrode and the sludge were characterized by FT-IR, SEM, and XRD. After treatment, metals such as chromium can be reused as a tanning agent in leather processing to promote a circular economy.
Keywords: electrochemical, treatment, aluminum, tannery effluent
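The removal percentages quoted above follow from the standard efficiency formula (C0 - Ct) / C0 * 100. A minimal helper, with an illustrative chromium example (the 2.04 mg/L final concentration is an assumption chosen to reproduce the 94.9% figure, not a measured value):

```python
def removal_efficiency(c_initial, c_final):
    """Percent removal of a contaminant after treatment,
    from initial and final concentrations in the same units."""
    if c_initial <= 0:
        raise ValueError("initial concentration must be positive")
    return 100.0 * (c_initial - c_final) / c_initial

# Illustrative: 40 mg/L initial Cr, 2.04 mg/L after EC -> ~94.9 % removal
print(removal_efficiency(40.0, 2.04))
```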
Procedia PDF Downloads 110
1799 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh
Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi
Abstract:
Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain in the Zagros mountainous region of Iran was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 for optical data and Sentinel-1A for radar data. We used methods that are commonly applied to measure glacier surface movements, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed time-series glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reduction of atmospheric noise, and removal of the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.
Keywords: active rock glaciers, Landsat 8, Sentinel-1A, Zagros mountainous region
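Cross-correlation feature tracking, one of the methods listed above, can be illustrated in one dimension: slide one intensity profile over the other and keep the shift that maximizes the normalized cross-correlation. This is a simplified sketch of the principle only, not the study's 2-D image-matching pipeline:

```python
def best_shift(ref, moved, max_shift):
    """Estimate the displacement between two 1-D intensity profiles by
    maximizing the normalized cross-correlation over candidate shifts."""
    def ncc(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    best, best_score = 0, -2.0
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:   # feature moved s samples to the right
            a, b = ref[:len(ref) - s], moved[s:]
        else:        # feature moved |s| samples to the left
            a, b = ref[-s:], moved[:len(moved) + s]
        score = ncc(a, b)
        if score > best_score:
            best, best_score = s, score
    return best
```

Dividing the recovered pixel offset by the image time separation yields a surface velocity.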
Procedia PDF Downloads 77
1798 A Comparison between Underwater Image Enhancement Techniques
Authors: Ouafa Benaida, Abdelhamid Loukil, Adda Ali Pacha
Abstract:
In recent years, the growing interest of scientists in processing and analyzing underwater images and videos has been strengthened by the emergence of new underwater exploration techniques, such as autonomous underwater vehicles and underwater image sensors, which facilitate the exploration of underwater mineral resources as well as the search for new species of aquatic life by biologists. Underwater images and videos have several defects and must be preprocessed before analysis. Underwater scenes are usually darkened by the interaction of light with the marine environment: light is absorbed as it travels through deep water, depending on its wavelength. Additionally, light does not follow a straight path but is scattered by microparticles in the water, resulting in low contrast, low brightness, color distortion, and restricted visibility. Enhancement of underwater images is therefore necessary to facilitate their analysis. The research presented in this paper aims to implement and evaluate a set of classical techniques used to improve the quality of underwater images in several color representation spaces. These methods have the advantage of being simple to implement and of requiring no prior knowledge of the physical model underlying the degradation.
Keywords: underwater image enhancement, histogram normalization, histogram equalization, contrast limited adaptive histogram equalization, single-scale retinex
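Of the classical techniques compared, histogram equalization is the simplest to sketch: map each gray level through the normalized cumulative histogram so that intensities spread over the full range. A minimal version for a flat list of 8-bit values (an illustrative sketch, not the paper's implementation, which also works per channel in several color spaces):

```python
def equalize(pixels, levels=256):
    """Histogram equalization for a flat list of integer gray values."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function
    cdf, acc = [], 0
    for h in hist:
        acc += h
        cdf.append(acc)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:          # constant image: nothing to stretch
        return pixels[:]
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]
```

A low-contrast pair of gray levels is stretched to the full black-to-white range.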
Procedia PDF Downloads 89
1797 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing
Authors: Carolina Gouveia, José Vieira, Pedro Pinho
Abstract:
Monitoring cardiopulmonary signals without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. This system also has applications in the vehicular environment to monitor the driver, in order to avoid accidents in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect, which relates the received signal properties to the changing distance between the radar antennas and the person's chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee patient immobilization, and their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breath rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to show that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR
Procedia PDF Downloads 141
1796 Design and Manufacture of Removable Nosecone Tips with Integrated Pitot Tubes for High Power Sounding Rocketry
Authors: Bjorn Kierulf, Arun Chundru
Abstract:
Over the past decade, collegiate rocketry teams have emerged across the country with various goals: space, liquid-fueled flight, etc. A critical piece of the development of knowledge within a club is the use of so-called "sounding rockets," whose goal is to take in-flight measurements that inform future rocket design. Common measurements include acceleration from inertial measurement units (IMUs) and altitude from barometers. With a properly tuned filter, these measurements can be used to find velocity, but they are susceptible to noise, offset, and filter settings. Instead, velocity can be measured more directly and more instantaneously using a pitot tube, which operates by measuring the stagnation pressure. At supersonic speeds, an additional thermodynamic property is necessary to constrain the upstream state. One possibility is the stagnation temperature, measured by a thermocouple in the pitot tube. The routing of the pitot tube from the nosecone tip down to a pressure transducer is complicated by the nosecone's structure. Commercial off-the-shelf (COTS) nosecones come with a removable metal tip (without a pitot tube). This provides the opportunity to make custom tips with integrated measurement systems without making the nosecone from scratch. The main design constraint is how the nosecone tip is held down onto the nosecone, using the tension in a threaded rod anchored to a bulkhead below. Because the threaded rod connects into a threaded hole in the center of the nosecone tip, the pitot tube follows a winding path, and the pressure fitting is off-center. Two designs will be presented in the paper: one with a curved pitot tube, and a coaxial design that eliminates the winding path by routing pressure through a structural tube. Additionally, three manufacturing methods will be presented for these designs: bound powder filament metal 3D printing, stereolithography (SLA) 3D printing, and traditional machining. These employ three different materials: copper, steel, and a proprietary resin. These manufacturing methods and materials are relatively low cost and thus accessible to student researchers. The designs and materials cover multiple use cases, based on how fast the sounding rocket is expected to travel and how important heating effects are, both to measure and to avoid melting. This paper includes drawings showing key features and an overview of the design changes necessitated by the manufacturing methods. It also includes a look at the successful use of these nosecone tips and the data they have gathered to date.
Keywords: additive manufacturing, machining, pitot tube, sounding rocketry
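In the subsonic case, velocity follows from the measured stagnation and static pressures via the incompressible Bernoulli relation v = sqrt(2 * (p0 - p) / rho); the supersonic case mentioned above additionally needs the stagnation temperature and compressible-flow relations. A minimal subsonic helper (the sea-level air density below is an illustrative value):

```python
import math

def pitot_velocity(p_stagnation, p_static, rho):
    """Incompressible (low subsonic) airspeed from a pitot reading:
    v = sqrt(2 * dp / rho), with pressures in Pa and rho in kg/m^3."""
    dp = p_stagnation - p_static
    if dp < 0:
        raise ValueError("stagnation pressure must be >= static pressure")
    return math.sqrt(2.0 * dp / rho)

# Illustrative: 612.5 Pa dynamic pressure in sea-level air (rho = 1.225)
print(pitot_velocity(101937.5, 101325.0, 1.225))  # ~31.6 m/s
```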
Procedia PDF Downloads 164
1795 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to multimodal data, e.g., both structured and unstructured information, offers significant advantages in terms of time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material is a promising way to apply CNNs to multimodal datasets, as they often occur in a medical context. Using suitable preprocessing techniques, the structured data are transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
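The tabular-to-image step can be sketched as a simple normalization of one feature row onto a grayscale grid. Real pipelines use more elaborate feature layouts, so the min-max scaling and row-major placement here are illustrative assumptions only:

```python
def row_to_image(row, side, lo=None, hi=None):
    """Map one tabular feature row to a side x side grayscale image (0-255),
    padding unused cells with 0. lo/hi override the scaling range so that
    all rows of a dataset can share one calibration."""
    lo = min(row) if lo is None else lo
    hi = max(row) if hi is None else hi
    span = hi - lo or 1
    scaled = [round(255 * (v - lo) / span) for v in row]
    scaled += [0] * (side * side - len(scaled))      # pad to full grid
    return [scaled[r * side:(r + 1) * side] for r in range(side)]
```

The resulting grid can be stacked with, or pasted beside, an existing image channel before feeding the CNN.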
Procedia PDF Downloads 123
1794 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data are produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year by 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data: storing the data, searching for information, and finding hidden information. An analysis platform for genomics big data is therefore required. Recently developed cloud computing enables us to deal with big data more efficiently. Hadoop is a distributed computing framework and forms the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
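The fuzzy-logic half of such a hybrid can be illustrated with triangular membership functions grading, say, a sequence-similarity score into overlapping sets. The set names and cut-offs below are illustrative assumptions, not the authors' actual rule base:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: rises over [a, b], falls over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_similarity(score):
    """Grade a similarity score in [0, 1] into fuzzy sets (illustrative cut-offs)."""
    return {
        "low":    triangular(score, -0.5, 0.0, 0.5),
        "medium": triangular(score, 0.25, 0.5, 0.75),
        "high":   triangular(score, 0.5, 1.0, 1.5),
    }
```

In a MapReduce setting, mappers would emit per-read scores and reducers would aggregate these memberships per reference sequence.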
Procedia PDF Downloads 299
1793 Analysing Waste Management Options in the Printing Industry: Case of a South African Company
Authors: Stanley Fore
Abstract:
The case study company is one of the leading newsprint companies in South Africa. The company has achieved this status through operational expansion, diversification, and investment in cutting-edge technology. It has a reputation for the highest quality and personalised service that transcends borders and industries. The company offers a wide variety of small- and large-scale printing services. It is faced with the challenge of significant waste production during normal operations, generating 1200 kg of plastic waste and 60-70 tonnes of paper waste per month. The company currently operates a waste management process whereby waste paper is sold, at low cost, to recycling firms for further processing. Having considered the quantity of waste being generated, the company has embarked on a venture to find a more profitable solution to its current waste production. As waste management and recycling are not the company's core business, the aim of the venture is to implement a profitable secondary waste-processing business. The venture will be expedited as a strategic project. This research aims to estimate the financial feasibility of a selected solution as well as the impact of non-financial considerations. The financial feasibility is analysed using metrics such as payback period, internal rate of return, and net present value.
Keywords: waste, printing industry, up-cycling, management
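The feasibility metrics named at the end are straightforward to compute once the venture's cash flows are projected. A minimal sketch of net present value and payback period (the cash flows in the example are illustrative, not the company's figures):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the initial (usually negative)
    outlay at t = 0, later entries are per-period net cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """Number of periods until cumulative cash flow turns non-negative,
    or None if the outlay is never recovered."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None
```

The internal rate of return is then the discount rate at which `npv` crosses zero, findable by bisection over `rate`.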
Procedia PDF Downloads 262
1792 Encapsulation of Satureja khuzestanica Essential Oil in Chitosan Nanoparticles with Enhanced Antifungal Activity
Authors: Amir Amiri, Naghmeh Morakabati
Abstract:
In recent years, the six-fold growth of cancer in Iran has made the production of healthy products a challenge in the food industry. Due to the young population of the country, the consumption of fast food is growing. Chemical, potentially carcinogenic preservatives are used in these products beyond the standard limits, so an appropriate alternative is important. On the one hand, plant essential oils show high antimicrobial potential against pathogenic and spoilage microorganisms; on the other hand, they are highly volatile and decompose under processing conditions. This study aims to produce chitosan nanoparticles loaded with different concentrations of savory essential oil, to improve its antimicrobial properties and increase the resistance of the essential oil to oxygen and heat. The encapsulation efficiency was in the range of 32.07% to 39.93%, and the particle size distribution of the samples was in the range of 159 to 210 nm. The zeta potential ranged from -11.9 to -23.1 mV. The essential oil loaded in chitosan showed stronger antifungal activity against Rhizopus stolonifer. The results showed that the antioxidant property is directly related to the concentration of loaded essential oil: the antioxidant property increases with increasing essential oil concentration. In general, it seems that savory essential oil loaded in chitosan particles can be used as a preservative in food processing.
Keywords: chitosan, encapsulation, essential oil, nanogel
Procedia PDF Downloads 274
1791 Plasma Lipid Profiles and Atherogenic Indices of Rats Fed Raw and Processed Jack Fruit (Artocarpus heterophyllus) Seeds Diets at Different Concentrations
Authors: O. E. Okafor, L. U. S. Ezeanyika, C. G. Nkwonta, C. J. Okonkwo
Abstract:
The effect of processing on the plasma lipid profile and atherogenic indices of rats fed Artocarpus heterophyllus seed diets at different concentrations was investigated. Fifty-five rats were used for this study; they were divided into eleven groups of five rats each (one control group and ten test groups). The test groups were fed raw, boiled, roasted, fermented, and soaked diets at 10% and 40% concentrations. The study lasted thirty-five days. The diets led to a significant decrease (p < 0.05) in the plasma cholesterol and triacylglycerol of rats fed the 10% and 40% concentrations, and a significant increase (p < 0.05) in high-density lipoprotein (HDL) levels at the 40% concentration of the test diets. The diets also produced decreases in low-density lipoprotein (LDL), very-low-density lipoprotein (VLDL), cardiac risk ratio (CRR), atherogenic index of plasma (AIP), and atherogenic coefficient (AC) at the 40% concentration, except for the soaked group, which showed a slight elevation of LDL, CRR, AC, and AIP at the 40% concentration. Artocarpus heterophyllus seeds could be beneficial to health because of their ability to increase plasma HDL and reduce plasma LDL, VLDL, cholesterol, triglycerides, and atherogenic indices at the higher diet concentration.
Keywords: Artocarpus heterophyllus, atherogenic indices, concentrations, lipid profile
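The reported indices are standard ratios of the measured lipids. Assuming the definitions usual in the literature (CRR = TC/HDL, AC = (TC - HDL)/HDL, and AIP = log10(TG/HDL) with molar concentrations), they can be computed as:

```python
import math

def atherogenic_indices(tc, hdl, tg):
    """Common atherogenic indices from total cholesterol (tc), HDL and
    triglycerides (tg), all in the same molar units (e.g. mmol/L)."""
    return {
        "CRR": tc / hdl,              # cardiac risk ratio
        "AC":  (tc - hdl) / hdl,      # atherogenic coefficient
        "AIP": math.log10(tg / hdl),  # atherogenic index of plasma
    }
```

Note that AC is always CRR - 1, so a diet that lowers one necessarily lowers the other.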
Procedia PDF Downloads 302
1790 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants
Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe
Abstract:
In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded environment for contaminant detection in water. Microfluidic platforms represent a clear stage of innovation for fluid analysis, with various applications offering low cost and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel is fabricated using a soft lithography technique. Vertical and horizontal connections for fluid dispensing within the microfluidic channel are explored. The principle of colorimetry, which incorporates the use of Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and water retention, giving it a greater potential to stay in groundwater and endangering aquatic life along with human health; it is therefore taken as a case study in this work. The developed platform also compares detection methodologies, containing photodetectors for measuring absorbance and image sensors for measuring color change, for the quantification of contaminants like nitrite in water. The use of image processing techniques offers the advantage of operational flexibility, as the same system can be used to identify other contaminants present in water by introducing minor software changes.
Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics
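Colorimetric quantification of this kind typically rests on a Beer-Lambert calibration line fitted to standards of known concentration, then inverted for the unknown sample. A least-squares sketch (the calibration data are illustrative, not the platform's):

```python
def fit_calibration(concs, absorbances):
    """Least-squares fit of the line A = m*C + b through calibration standards."""
    n = len(concs)
    mc = sum(concs) / n
    ma = sum(absorbances) / n
    num = sum((c - mc) * (a - ma) for c, a in zip(concs, absorbances))
    den = sum((c - mc) ** 2 for c in concs)
    m = num / den
    return m, ma - m * mc

def concentration(absorbance, m, b):
    """Invert the calibration line to quantify an unknown sample."""
    return (absorbance - b) / m
```

The same fit applies whether the signal comes from a photodetector (absorbance) or from a color-channel intensity extracted by the image sensor.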
Procedia PDF Downloads 198
1789 Comparative Quantitative Study on Learning Outcomes of Major Study Groups of an Information and Communication Technology Bachelor Educational Program
Authors: Kari Björn, Mikael Soini
Abstract:
Higher education system reforms, especially the 2014 reform of the Finnish system of Universities of Applied Sciences, are discussed. The new steering model is based on major legislative changes, output-oriented funding, and open information. The governmental steering reform, especially the financial model and the resulting institutional-level responses, such as curriculum reforms, is discussed, focusing especially on engineering programs. The paper is motivated by the management need to establish objective steering-related performance indicators and to apply them consistently across all educational programs. The close relationship to the governmental steering and funding model implies that internally derived indicators can be directly applied. Metropolia University of Applied Sciences (MUAS), as a case institution, is briefly introduced, focusing on engineering education in Information and Communications Technology (ICT) and its related programs. The reform forced the consolidation of previously separate smaller programs into fewer units of student application. In the new curriculum, ICT students have a common first year before they apply for a Major. A framework of parallel and longitudinal comparisons is introduced and used across Majors on two campuses. The new externally introduced performance criteria are applied internally to ICT Majors, using data from before and after the program merger. A comparative performance of the Majors after completion of the joint first year is established, including previously omitted Majors for completeness of analysis. Some new research questions resulting from the transfer of Majors between campuses and quota setting are discussed. The practical orientation identifies best practices to share, and targets needing the most attention for improvement. This level of analysis is directly applicable at the student group and teaching team level, where corrective actions are possible when identified. The analysis is quantitative, and the nature of the corrective actions is not discussed. Causal relationships and factor analysis are omitted because the campuses, their staff, and various pedagogical implementation details still contain too many undetermined factors for our limited data. Such qualitative analysis is left for further research, which must, however, be guided by the relevance of the observations.
Keywords: engineering education, integrated curriculum, learning outcomes, performance measurement
Procedia PDF Downloads 241
1788 Application of Box-Behnken Response Surface Design for Optimization of Essential Oil Based Disinfectant on Mixed Species Biofilm
Authors: Anita Vidacs, Robert Rajko, Csaba Vagvolgyi, Judit Krisch
Abstract:
Optimizing a new disinfectant can decrease both the number of tests and the cost of processing. Good sanitizers are eco-friendly and do not allow bacteria to evolve resistance. Essential oils (EOs) are natural antimicrobials, and most of them have Generally Recognized As Safe (GRAS) status. In our study, the effect of the EOs of cinnamon, marjoram, and thyme was investigated against mixed-species bacterial biofilms of Escherichia coli, Listeria monocytogenes, Pseudomonas putida, and Staphylococcus aureus. The optimal concentration of EOs, disinfection time, and pH level were evaluated with the aid of a Box-Behnken response surface design (RSD) on 1-day and 7-day-old biofilms on metal, plastic, and wood surfaces. The variable factors ranged over 1-3 times the minimum bactericidal concentration (MBC), 10-110 minutes of acting time, and pH 4.5-7.5. The optimized EO disinfectant was compared to industrially used chemicals (HC-DPE, Hypo). The natural-based disinfectants were applicable, with acting times below 30 minutes. The EOs were able to eliminate the biofilm from the tested surfaces except wood. The disinfection effect of the EO-based natural solutions was in most cases equivalent to or better than that of the chemical sanitizers used in the food industry.
Keywords: biofilm, Box-Behnken design, disinfectant, essential oil
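A Box-Behnken design for three factors (here: EO concentration, acting time, pH) places each pair of factors at their coded +/-1 levels with the remaining factor at the midpoint, plus replicated centre points; this is what keeps the run count low compared to a full factorial. A sketch that generates the design in coded units:

```python
from itertools import combinations

def box_behnken(n_factors, center_points=3):
    """Box-Behnken design in coded (-1, 0, +1) units: for every factor
    pair, all four +/-1 combinations with the other factors at 0,
    plus replicated centre runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([[0] * n_factors for _ in range(center_points)])
    return runs
```

For three factors this yields the classic 15-run design; each coded level is then mapped linearly onto the real ranges (e.g. -1 -> 1x MBC, +1 -> 3x MBC).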
Procedia PDF Downloads 219
1787 Role of Geomatics in Architectural and Cultural Conservation
Authors: Shweta Lall
Abstract:
The intent of this paper is to demonstrate the role of computerized auxiliary science in advancing the desired and necessary alliance of historians, surveyors, topographers, and analysts in architectural conservation and management. Digital-era practices for recording architectural and cultural heritage with a view to its preservation, dissemination, and planned development are discussed in this paper. Geomatics includes practices such as remote sensing, photogrammetry, surveying, Geographic Information Systems (GIS), laser scanning technology, etc. All of these resources support architectural and conservation applications, which are identified through the various case studies analysed in this paper. The standardised outcomes and methodologies are listed and described using relevant case studies. The main components of the geomatics methodology adopted in conservation are data acquisition, processing, and presentation. Geomatics is used in a wide range of activities involved in architectural and cultural heritage: damage and risk assessment analysis, documentation, 3-D model construction, virtual reconstruction, spatial and structural decision-making analysis, and monitoring. This paper presents summary answers on the capabilities and limitations of the geomatics field in architectural and cultural conservation. Policy-makers, urban planners, architects, and conservationists not only need answers to these questions but also need to apply them in a predictable, transparent, spatially explicit, and inexpensive manner.
Keywords: architectural and cultural conservation, geomatics, GIS, remote sensing
Procedia PDF Downloads 147
1786 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control
Authors: Ming-Yen Chang, Sheng-Hung Ke
Abstract:
This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the electronically adjustable shock absorbers equipped in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride
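Monocular vision ranging with a pinhole camera model reduces to D = f * H / h, where f is the calibrated focal length in pixels, H the known real height of the speed bump, and h its apparent height in pixels from the detector's bounding box. A minimal sketch with illustrative numbers (not the paper's calibration):

```python
def distance_mm(focal_px, real_height_mm, pixel_height):
    """Pinhole-model monocular range to an object of known size:
    D = f * H / h (same length unit as real_height_mm)."""
    if pixel_height <= 0:
        raise ValueError("pixel height must be positive")
    return focal_px * real_height_mm / pixel_height

# Illustrative: 800 px focal length, 100 mm bump, 20 px tall in the frame
print(distance_mm(800, 100, 20))  # 4000.0 mm, i.e. 4 m ahead
```

As the vehicle approaches, `pixel_height` grows and the estimated distance shrinks, triggering the damping change once it crosses the control distance.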
Procedia PDF Downloads 66
1785 Healthcare Big Data Analytics Using Hadoop
Authors: Chellammal Surianarayanan
Abstract:
The healthcare industry generates large amounts of data, driven by needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory work, pharmacy, etc. Healthcare data are so big and complex that they cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data accumulate, and the different varieties of data: structured, semi-structured, and unstructured. Despite this complexity, if the trends and patterns within the big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for the distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools, such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. This paper analyzes how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare
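The map-shuffle-reduce pattern MapReduce implements can be sketched in plain Python with the canonical word count; Hadoop's contribution is to distribute exactly these three phases across a cluster and over HDFS-resident data:

```python
from collections import defaultdict

def map_phase(record):
    """Map: emit (word, 1) pairs for one input record (e.g. one log line)."""
    return [(w.lower(), 1) for w in record.split()]

def shuffle(pairs):
    """Shuffle: group all intermediate values by key."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate (here: sum) the values for each key."""
    return {k: sum(vs) for k, vs in groups.items()}

def word_count(records):
    pairs = [p for r in records for p in map_phase(r)]
    return reduce_phase(shuffle(pairs))
```

In a healthcare setting the records might be EPR event lines and the keys diagnosis codes, but the phase structure is identical.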
Procedia PDF Downloads 413
1784 Pore Pressure and In-situ Stress Magnitudes with Image Log Processing and Geological Interpretation in the Haoud Berkaoui Hydrocarbon Field, Northeastern Algerian Sahara
Authors: Rafik Baouche, Rabah Chaouchi
Abstract:
This work reports the first comprehensive stress field interpretation from eleven recently drilled wells in the Berkaoui Basin, Algerian Sahara. A cumulative length of over 7000 m of acoustic image logs from six vertical wells was investigated, and a mean NW-SE (128°-145° N) maximum horizontal stress (SHMax) orientation is inferred from the B-D quality wellbore breakouts. The study integrates a log-based approach with downhole measurements to infer pore pressure and in-situ stress magnitudes. Vertical stress (Sv), interpreted from the bulk-density profiles, has an average gradient of 22.36 MPa/km. The Ordovician and Cambrian reservoirs have a pore pressure gradient of 13.47-13.77 MPa/km, which is above the hydrostatic pressure regime. A minimum horizontal stress (Shmin) gradient of 17.2-18.3 MPa/km is inferred from the fracture closure pressure in the reservoirs. Breakout widths constrained the SHMax magnitude to the 23.8-26.5 MPa/km range. The subsurface stress distribution in central Saharan Algeria indicates that the present-day stress field in the Berkaoui Basin is principally strike-slip faulting (SHMax > Sv > Shmin). Inferences are drawn on the regional stress pattern and on drilling and reservoir development.
Keywords: stress, imagery, breakouts, Sahara
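Given the gradients above, stress magnitudes at any depth and the Andersonian faulting regime follow directly. The Shmin and SHMax defaults below take the midpoints of the reported ranges as an illustrative assumption:

```python
def stresses_at_depth(depth_km, sv_grad=22.36, shmin_grad=17.75, shmax_grad=25.15):
    """Principal stress magnitudes (MPa) from linear gradients (MPa/km).
    Defaults: Sv gradient from the study; Shmin/SHMax are midpoints of the
    reported 17.2-18.3 and 23.8-26.5 MPa/km ranges (assumed, for illustration)."""
    return sv_grad * depth_km, shmin_grad * depth_km, shmax_grad * depth_km

def faulting_regime(sv, shmin, shmax):
    """Andersonian classification from relative stress magnitudes."""
    if sv >= shmax:
        return "normal"        # Sv > SHMax > Shmin
    if sv <= shmin:
        return "reverse"       # SHMax > Shmin > Sv
    return "strike-slip"       # SHMax > Sv > Shmin
```

With the quoted gradients, Sv sits between Shmin and SHMax at every depth, reproducing the strike-slip regime the study reports.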
Procedia PDF Downloads 751783 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network
Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao
Abstract:
The lack of data is a pressing problem in medical image analysis using convolutional neural networks (CNNs). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms such as negative, histogram equalization, power law, sharpening, averaging, and Gaussian blurring help to generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, a collection of knee images from openly available online sources. The CNN, an enhanced single shot detector (SSD), is utilized for the detection and localization of the knee joint from a given X-ray image. It is an enhanced version of the well-known SSD network, modified to reduce the number of prediction boxes at the output side. It consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. The localization boundary is also comparatively more refined and closer to the ground truth with spatial augmentation, giving better detection and localization of knee joints.Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, openKnee, spatial transformations
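Two of the point transforms named in the abstract — the negative and the power-law (gamma) transform — are simple per-pixel operations; a sketch on a tiny grayscale array (toy data, not OpenKnee images) is shown below.

```python
def negative(img, maxval=255):
    # Image negative: s = maxval - r for each pixel r
    return [[maxval - p for p in row] for row in img]

def power_law(img, gamma, maxval=255):
    # Power-law (gamma) transform: s = maxval * (r / maxval) ** gamma
    return [[round(maxval * (p / maxval) ** gamma) for p in row]
            for row in img]

img = [[0, 64], [128, 255]]  # toy 2x2 grayscale image
print(negative(img))          # [[255, 191], [127, 0]]
print(power_law(img, 0.5))    # [[0, 128], [181, 255]]
```

Gamma < 1 brightens mid-tones (useful for under-exposed X-rays), while gamma > 1 darkens them; each transform applied to a training image yields an additional augmented sample.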
Procedia PDF Downloads 1541782 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM
Authors: Rajpal Kaur, Pooja Choudhary
Abstract:
Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based offline signature recognition system trained with low-resolution scanned signature images. The signature of a person is an important biometric attribute which can be used to authenticate human identity. A signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition. Multiple techniques have been defined for signature recognition, leaving considerable scope for research. In this paper, off-line (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The off-line signature verification and recognition is implemented on the MATLAB platform. The work has been tested and found suitable for its purpose, and the proposed method performs better than other recently proposed methods.Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM
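The descriptor-matching stage of such a verifier can be sketched as nearest-neighbor matching with a distance threshold. The sketch below is a simplification under stated assumptions: real SURF descriptors are 64-dimensional and the thresholds are illustrative; the 2-D vectors here are hypothetical stand-ins, and the paper's HMM stage is omitted.

```python
import math

def euclid(a, b):
    # Euclidean distance between two descriptor vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(reference_descs, query_descs, match_thresh=0.5, ratio=0.6):
    """Count query descriptors whose nearest reference descriptor lies
    within match_thresh; accept if the matched fraction >= ratio."""
    matches = 0
    for q in query_descs:
        nearest = min(euclid(q, r) for r in reference_descs)
        if nearest < match_thresh:
            matches += 1
    return matches / len(query_descs) >= ratio

ref = [[0.1, 0.2], [0.8, 0.9], [0.4, 0.4]]        # enrolled signature
genuine = [[0.12, 0.21], [0.79, 0.88], [0.41, 0.43]]
forged = [[0.9, 0.1], [0.1, 0.9], [0.5, 0.0]]
print(verify(ref, genuine))  # True
print(verify(ref, forged))   # False
```

In a full pipeline, the match statistics (counts, distances) would feed the HMM that models genuine versus forged signing behavior.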
Procedia PDF Downloads 3841781 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals
Authors: Naser Safdarian, Nader Jafarnia Dabanloo
Abstract:
In this paper we use four features, i.e., Q-wave integral, QRS complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, for the detection and localization of myocardial infarction (MI) in the left ventricle of the heart. Our research focuses on detection and localization of MI in the standard ECG. We use the Q-wave and T-wave integrals because these features carry important information for MI detection. We used pattern recognition methods such as the Artificial Neural Network (ANN) to detect and localize the MI, because these methods have good accuracy for classification of normal and abnormal signals. We used one type of Radial Basis Function (RBF) network called the Probabilistic Neural Network (PNN) because of its nonlinearity, as well as other classifiers such as k-Nearest Neighbors (KNN), the Multilayer Perceptron (MLP), and Naive Bayes classification. We used the PhysioNet database for our training and test data. We reached over 80% accuracy on test data for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be improved further by adding more features. A simple method based on only four features extracted from the standard ECG is thus presented which has good accuracy in MI localization.Keywords: ECG signal processing, myocardial infarction, features extraction, pattern recognition
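The four integral features reduce to trapezoidal integration of the ECG over delineated wave windows. A sketch follows; the toy beat and the window indices are hypothetical placeholders for the output of a real wave-delineation step, not fiducials from the paper.

```python
def wave_integral(signal, fs, start, end):
    """Trapezoidal integral of signal[start:end] sampled at fs Hz."""
    dt = 1.0 / fs
    seg = signal[start:end]
    return sum(0.5 * (a + b) * dt for a, b in zip(seg, seg[1:]))

# Toy beat in arbitrary units; windows below are illustrative only.
beat = [0, -1, 0, 3, 0, 0, 1, 2, 1, 0]
fs = 1.0
features = {
    "q_integral":   wave_integral(beat, fs, 0, 3),   # Q-wave window
    "qrs_integral": wave_integral(beat, fs, 0, 5),   # QRS window
    "t_integral":   wave_integral(beat, fs, 5, 10),  # T-wave window
    "total":        wave_integral(beat, fs, 0, 10),  # whole beat
}
print(features)
```

The resulting four-number feature vector per lead is what would be fed to the PNN, KNN, MLP, or Naive Bayes classifiers named in the abstract.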
Procedia PDF Downloads 4561780 Corpus-Based Analysis on the Translatability of Conceptual Vagueness in Traditional Chinese Medicine Classics Huang Di Nei Jing
Authors: Yan Yue
Abstract:
Huang Di Nei Jing (HDNJ) is one of the significant traditional Chinese medicine (TCM) classics which lays the foundation of TCM theory and practice. It is an important work for studying the ancient civilization and medical history of China. The language of HDNJ is highly concise and vague, and notably challenging to translate. This paper investigates the translatability of one particular kind of vagueness in HDNJ: conceptual vagueness that carries Chinese philosophical and cultural connotations. The corpus tool Sketch Engine is used to provide potential online contexts and word behaviors. Two selected English translations of HDNJ, one by a TCM practitioner and one by a non-practitioner, are used to examine the frequency and distribution of linguistic features of the translations. It was found that the hypothesis about the universals of translated language (explicitation, normalisation) holds in one translation, but at the cost of some of the original contextual connotations. Transliteration is purposefully used in the second translation to retain the original flavor, which is argued to violate the principle of relevance in communication because it yields little contextual effect and demands more processing effort from the reader. The translatability of conceptual vagueness in HDNJ is constrained by the source language context and the reader's cognitive environment.Keywords: corpus-based translation, translatability, TCM classics, vague language
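The frequency-and-distribution comparison described above rests on normalized frequency counts, which corpus tools such as Sketch Engine report per million words. A minimal sketch is shown below; the one-sentence "corpus" and the transliterated term are illustrative, not drawn from the studied translations.

```python
from collections import Counter

def freq_per_million(tokens, term):
    """Normalized frequency of a term: occurrences per million tokens,
    the unit corpus tools typically report."""
    counts = Counter(tokens)
    return counts[term] / len(tokens) * 1_000_000

# Illustrative sentence standing in for a translation corpus
corpus = "qi flows through the meridians and qi governs blood".split()
print(round(freq_per_million(corpus, "qi")))  # 222222
```

Comparing such normalized frequencies of transliterated terms (e.g., "qi") across the two translations is one way to quantify how much each translator retained the original conceptual vocabulary.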
Procedia PDF Downloads 3771779 Al2O3-Dielectric AlGaN/GaN Enhancement-Mode MOS-HEMTs by Using Ozone Water Oxidization Technique
Authors: Ching-Sung Lee, Wei-Chou Hsu, Han-Yin Liu, Hung-Hsi Huang, Si-Fu Chen, Yun-Jung Yang, Bo-Chun Chiang, Yu-Chuang Chen, Shen-Tin Yang
Abstract:
AlGaN/GaN high electron mobility transistors (HEMTs) have been intensively studied due to their intrinsic advantages of high breakdown electric field, high electron saturation velocity, and excellent chemical stability. They are also suitable for ultra-violet (UV) photodetection due to the corresponding wavelengths of the GaN bandgap. To improve the optical responsivity by decreasing the dark current caused by gate leakage and the limited Schottky barrier heights in GaN-based HEMT devices, various metal-oxide-semiconductor HEMTs (MOS-HEMTs) have been devised using atomic layer deposition (ALD), molecular beam epitaxy (MBE), metal-organic chemical vapor deposition (MOCVD), liquid phase deposition (LPD), and RF sputtering. The gate dielectrics include MgO, HfO2, Al2O3, La2O3, and TiO2. In order to provide complementary circuit operation, enhancement-mode (E-mode) devices have lately been studied using techniques of fluorine treatment, a p-type cap layer, a piezo-neutralization layer, and MOS-gate structures. This work reports an Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMT design using a cost-effective ozone water oxidization technique. The present ozone oxidization method has the advantages of a low-cost processing facility, processing simplicity, compatibility with device fabrication, and room-temperature operation under atmospheric pressure. It can further reduce the gate-to-channel distance and improve the transconductance (gm) gain for a specific oxide thickness, since the formation of the Al2O3 consumes part of the AlGaN barrier at the same time. The epitaxial structure of the studied devices was grown by the MOCVD technique. On a Si substrate, the layer structure includes a 3.9 µm C-doped GaN buffer, a 300 nm GaN channel layer, and a 5 nm Al0.25Ga0.75N barrier layer. Mesa etching was performed to provide electrical isolation using an inductively coupled-plasma reactive ion etcher (ICP-RIE).
Ti/Al/Au were thermally evaporated and annealed to form the source and drain ohmic contacts. The device was immersed in an H2O2 solution pumped with ozone gas generated by an OW-K2 ozone generator. Ni/Au were deposited as the gate electrode to complete fabrication of the MOS-HEMT. The formed Al2O3 oxide thickness is 7 nm and the remaining AlGaN barrier thickness is 2 nm. A reference HEMT device was also fabricated on the same epitaxial structure for comparison. The gate dimensions are 1.2 × 100 µm² with a source-to-drain spacing of 5 µm for both devices. The dielectric constant (k) of Al2O3 was characterized to be 9.2 by C-V measurement. Reduced interface state density after oxidization has been verified by the low-frequency noise spectra, Hooge coefficients, and pulsed I-V measurements. Improved device characteristics at temperatures of 300 K-450 K have been achieved with the present MOS-HEMT design. Consequently, Al2O3-dielectric Al0.25Ga0.75N/GaN E-mode MOS-HEMTs made by the ozone water oxidization method are reported. In comparison with a conventional Schottky-gate HEMT, the MOS-HEMT design has demonstrated excellent enhancements of 138% (176%) in gm,max, 118% (139%) in IDS,max, and 53% (62%) in BVGD, and a 3 (2)-order reduction in IG leakage at VGD = -60 V at 300 (450) K. This work is promising for millimeter-wave integrated circuit (MMIC) and three-terminal active UV photodetector applications.Keywords: MOS-HEMT, enhancement mode, AlGaN/GaN, passivation, ozone water oxidation, gate leakage
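The reported k = 9.2 and 7 nm oxide thickness imply an oxide capacitance per unit area via the parallel-plate relation C_ox = k·ε0/t_ox. The back-of-envelope check below is our own arithmetic on the reported values, not a figure from the paper.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def oxide_capacitance_uf_cm2(k, t_ox_nm):
    """Parallel-plate oxide capacitance per unit area,
    C_ox = k * eps0 / t_ox, converted to uF/cm^2."""
    c_f_m2 = k * EPS0 / (t_ox_nm * 1e-9)   # F/m^2
    return c_f_m2 * 1e-4 * 1e6             # F/m^2 -> F/cm^2 -> uF/cm^2

# Reported values: k = 9.2 (C-V measurement), t_ox = 7 nm
c = oxide_capacitance_uf_cm2(9.2, 7.0)
print(round(c, 3))  # 1.164
```

A larger C_ox for a given gate voltage means stronger channel modulation, which is consistent with the transconductance gains reported for the thinned gate-to-channel distance.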
Procedia PDF Downloads 2621778 Use Process Ring-Opening Polymerization to Melt Processing of Cellulose Nanowhisker from Coconut Husk Fibers-Filled Polylactide-Based Nanocomposites
Authors: Imam Wierawansyah Eltara, Iftitah, Agus Ismail
Abstract:
In the present work, cellulose nanowhiskers (CNW) extracted from coconut husk fibers were incorporated in polylactide (PLA)-based composites. Prior to blending, PLA chains were chemically grafted onto the surface of the CNW to enhance compatibilization between the CNW and the hydrophobic polyester matrix. Ring-opening polymerization of L-lactide was initiated from the hydroxyl groups available at the CNW surface to yield CNW-g-PLA nanohybrids. PLA-based nanocomposites were prepared by melt blending to ensure a green concept for the study, thereby limiting the use of organic solvents. The influence of PLA-grafted cellulose nanoparticles on the mechanical and thermal properties of the ensuing nanocomposites was investigated in depth. The thermal behavior and mechanical properties of the nanocomposites were determined using differential scanning calorimetry (DSC) and dynamical mechanical and thermal analysis (DMTA), respectively. The results evidence that the chemical grafting of CNW enhances their compatibility with the polymeric matrix and thus improves the final properties of the nanocomposites. Large modifications of the crystalline properties, such as the crystallization half-time, were evidenced according to the nature of the PLA matrix and the content of nanofillers.Keywords: cellulose nanowhiskers, nanocomposites, coconut husk fiber, ring opening polymerization
Procedia PDF Downloads 3171777 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques
Authors: Ved Kulkarni, Karthik Kini
Abstract:
This experiment aims to understand how the media affects the power markets in the mainland United States and to study the duration of the reaction time between news updates and actual price movements. We take into account electric utility companies trading on the NYSE and exclude companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to predefined keywords stored for each specific company. Based on this, the classifier allocates the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the respective price movement is studied to understand the response time. Based on the response time observed, neural networks are trained to understand and react to changing market conditions, achieving the best strategy in each market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market response time to each price movement.Keywords: data mining, language processing, artificial neural networks, sentiment analysis
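The five-way classification step described above can be sketched with a toy keyword scorer. The keyword lists, thresholds, and headlines below are illustrative assumptions, not the study's actual classifier, which would be trained rather than hand-written.

```python
POSITIVE = {"gain", "growth", "upgrade", "beat"}
NEGATIVE = {"loss", "outage", "downgrade", "lawsuit"}

def classify(headline):
    """Toy five-way classifier using the abstract's label set;
    keyword lists and score thresholds are illustrative only."""
    words = headline.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score >= 2:
        return "highly optimistic"
    if score == 1:
        return "positive"
    if score == -1:
        return "negative"
    if score <= -2:
        return "highly negative"
    return "neutral"

print(classify("Utility posts record gain after rate upgrade"))  # highly optimistic
print(classify("Storm causes outage and lawsuit loss"))          # highly negative
print(classify("Quarterly report released on schedule"))         # neutral
```

Each classified headline, timestamped against the subsequent price series, gives one observation of the media-to-market response time the study measures.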
Procedia PDF Downloads 171776 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images
Authors: Shahriar Farzam, Maryam Rastgarpour
Abstract:
Image denoising plays an extremely important role in digital image processing. Curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a method for image contrast enhancement of cone beam CT (CBCT) images based on fast discrete curvelet transforms (FDCT) that work through the Unequally Spaced Fast Fourier Transform (USFFT). These transforms return a table of curvelet coefficients indexed by a scale parameter, an orientation, and a spatial location. Accordingly, the coefficients obtained from FDCT-USFFT can be modified in order to enhance contrast in an image. Our proposed method first applies a two-dimensional mathematical transform, namely the FDCT through the unequally spaced fast Fourier transform, to the input image and then applies thresholding to the curvelet coefficients to enhance the CBCT images. Applying the unequally spaced fast Fourier transform leads to an accurate reconstruction of the image with high resolution. The experimental results indicate that the performance of the proposed method is superior to existing methods in terms of Peak Signal-to-Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).Keywords: curvelet transform, CBCT, image enhancement, image denoising
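The PSNR metric used for evaluation has a standard definition, 10·log10(MAX²/MSE); a minimal sketch follows (the pixel values are toy data, not CBCT images).

```python
import math

def psnr(ref, test, maxval=255.0):
    """Peak Signal-to-Noise Ratio between two equally sized images
    (given here as flattened pixel lists): 10*log10(MAX^2 / MSE)."""
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(maxval ** 2 / mse)

ref = [52, 60, 61, 249]    # toy reference pixels
noisy = [50, 62, 61, 251]  # toy degraded pixels
print(round(psnr(ref, noisy), 2))  # 43.36
```

Higher PSNR indicates a reconstruction closer to the reference, which is the sense in which the proposed curvelet thresholding is reported as superior.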
Procedia PDF Downloads 3001775 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study
Authors: Ana Serafimovic, Karthik Devarajan
Abstract:
Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of Kullback-Leibler divergence known as intrinsic information and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also aim to minimize symmetric divergence measures.Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence
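The multiplicative-update scheme underlying such algorithms can be illustrated with the classic Lee-Seung updates for the Frobenius norm, sketched below on a tiny rank-1 matrix. This is a baseline sketch only: the paper's algorithm instead minimizes the symmetric Kullback-Leibler divergence, which changes the update ratios but not the multiply-by-a-ratio pattern.

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, k, iters=500, eps=1e-9):
    """Frobenius-norm NMF via Lee-Seung multiplicative updates:
    H <- H * (W^T V)/(W^T W H), W <- W * (V H^T)/(W H H^T)."""
    random.seed(0)
    n, m = len(V), len(V[0])
    W = [[random.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[random.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(W, matmul(H, Ht))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H

V = [[1.0, 2.0], [2.0, 4.0]]  # exactly rank-1, so k=1 can fit it
W, H = nmf(V, 1)
WH = matmul(W, H)
err = sum((V[i][j] - WH[i][j]) ** 2 for i in range(2) for j in range(2))
print(err)
```

Because the updates multiply non-negative entries by non-negative ratios, W and H remain non-negative throughout, which is what enforces the additive, parts-based structure described above.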
Procedia PDF Downloads 2461774 Enhance Biogas Production by Enzymatic Pre-Treatment from Palm Oil Mill Effluent (POME)
Authors: M. S. Tajul Islam, Md. Zahangir Alam
Abstract:
To enhance biogas production through anaerobic digestion, the application of various types of pre-treatment methods has limitations in terms of sustainable environmental management. Many studies on pretreatments, especially chemical and physical processes, have been carried out to evaluate anaerobic digestion for enhanced biogas production. Among the pretreatment methods, acid and alkali pre-treatments have gained the most importance. Previous studies have shown that although acid and alkali pretreatment has a significant effect on the degradation of biomass, these methods have a negative environmental impact due to their hazardous nature, while enzymatic pre-treatment is environmentally friendly. One of the constraints on the use of enzymes in pretreatment for biogas production is their high cost, which current work seeks to reduce through fermentation of waste-based media. As such, palm oil mill effluent (POME), an abundant resource generated during palm oil processing at the mill, is being used as a potential fermentation medium for enzyme production. Such low-cost enzymes could be an alternative for the biogas pretreatment process. This review focuses on the direct application of enzymatic pre-treatment to POME for enhanced biogas production.Keywords: POME, enzymatic pre-treatment, biogas, lignocellulosic biomass, anaerobic digestion
Procedia PDF Downloads 5501773 Enhancing Intra-Organizational Supply Chain Relationships in Manufacturing Companies: A Case Study in Tigray, Ethiopia
Authors: Weldeabrha Kiros Kidanemaryam
Abstract:
This investigation examines the intra-organizational supply chain relationships of firms, emphasizing the strengths and achievements of internal processes and operations, which in turn influence external relationship management and overall organizational performance. The purpose of the study is to scrutinize the internal supply chain relationships within manufacturing companies located in Tigray. Qualitative and quantitative data analysis methods were employed, drawing on primary data sources (questionnaires and interviews) and secondary data sources (organizational reports and documents) with purposive sampling. A descriptive, cross-sectional research design was applied, which portrays the magnitude of the issues and problems by collecting the required data once from the sample respondents. A descriptive design suffices because the study variables involve no cause-and-effect relationship that would require another type of design; the issues are assessed and analyzed with a detailed description of the results after quantifying their extent from the data gathered from respondents. The collected data were analyzed using the Statistical Package for the Social Sciences (SPSS Version 20). The intra-organizational relationships of the companies are only moderately accomplished, which calls for improvement in the performance of each unit or department within the firms so as to ensure progress in the companies' effectiveness and efficiency.
Moreover, the manufacturing companies exhibit low industrial discipline and working culture, weak supervision of manpower, delayed delivery within company processes, unsatisfactory product quality, underutilization of capacity, and low productivity and profitability, which in turn diminish the performance of intra-organizational supply chain relationships and reduce the companies' organizational efficiency, effectiveness, and sustainability. Hence, the companies should emphasize building and managing intra-organizational supply chain relationships effectively, because nothing can be accomplished without successful and progressive relationships among internal units, functional areas, and individuals for the production and provision of the required, qualified products that meet customers' needs. The study contributes to improving practical applications and highlights policy measures and implications for manufacturing companies with regard to intra-organizational supply chain relationships.Keywords: supply chain, supply chain relationships, intra-organizational relationships, manufacturing companies
Procedia PDF Downloads 34