Search results for: noisy forensic speaker verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1108

628 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several regions, such as the foreground object and the background. It is a critical technique in both image processing and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Segmentation algorithms also vary with the input data and the application, and nearly all of them perform poorly in noisy environments. Much of the existing work uses the Markov Random Field (MRF), which is computationally intensive but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as simplifying the processing, interpretation, and analysis of images. This article reviews and summarizes image segmentation techniques and algorithms developed over the past years, including convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed, along with applications and potential future developments. The review concludes that no single technique is suitable for segmenting all types of images, but that hybrid techniques yield more accurate and efficient results.
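A minimal sketch of one of the reviewed families of techniques, global thresholding, on a small synthetic grayscale "image" (a nested list of intensities). The image, threshold value, and function name are illustrative assumptions, not taken from the reviewed article.

```python
def threshold_segment(image, threshold):
    """Return a binary mask: 1 for foreground (>= threshold), 0 for background."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

# 4x4 synthetic grayscale image: a bright 2x2 "object" on a dark background.
image = [
    [ 10,  12,  11,  13],
    [ 14, 200, 210,  12],
    [ 11, 205, 198,  10],
    [ 13,  12,  11,  14],
]

mask = threshold_segment(image, threshold=128)
# The mask separates the bright object from the dark background.
```

Thresholding is the simplest of the reviewed methods and, as the abstract notes, tends to fail on noisy images, which is one motivation for hybrid techniques.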

Keywords: clustering-based, convolutional network, edge-based, region-growing

Procedia PDF Downloads 98
627 Deictic Expressions in Selected Football Commentaries

Authors: Vera Ofori Akomah

Abstract:

There is no society without language, and in football, language serves as a tool for communication. The language of football and the meaning of its activities are largely revealed through the utterances of commentators. The linguistic subfield of pragmatics is concerned with the study of meaning: it shows that the interpretation of utterances depends not only on linguistic knowledge but also on knowledge about the context of the utterance and the status of those involved, such as the intent of the speaker and the place and time of the utterance. Pragmatic analysis takes several forms, one of which is deixis. In football commentating, commentators often use deictic expressions in building utterances. This study analyses the deixis contained in three selected football commentaries using Levinson's theory of deixis. The research is a qualitative study with content analysis as its method, since it focuses on deictic expressions in football commentaries. The data are utterances from English commentaries on the 2016 El Clásico match between Barcelona and Real Madrid, the 2018 FIFA World Cup match between Portugal and Spain, and the 2022 FIFA World Cup qualifier between Ghana and Nigeria. The results reveal five kinds of deixis: person deixis (divided into first, second, and third person), place deixis, time deixis, discourse deixis, and social deixis.

Keywords: pragmatics analysis, football commentary, deixis, types of deixis

Procedia PDF Downloads 28
626 Optimal Design of Substation Grounding Grid Based on Genetic Algorithm Technique

Authors: Ahmed Z. Gabr, Ahmed A. Helal, Hussein E. Said

Abstract:

With the incessant increase in power system capacity and voltage grade, the safety of the grounding grid becomes more and more prominent. In this paper, the design of a substation grounding grid by means of a genetic algorithm (GA) is presented. The approach aims to control the grounding cost of the power system by controlling the number of grounding rods and the conductor lengths under the same safety limitations. The proposed technique was used to design the substation grounding grid of the Khalda Petroleum Company "El-Qasr" power plant, and the design was simulated using CYMGRD software for verification. The resulting design fully complies with the requirements of the IEEE 80-2000 standard.
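A toy sketch of the GA idea described above: minimize grounding cost (conductor length plus rod count) subject to a maximum grid resistance. The cost and resistance formulas, constants, and GA settings below are illustrative assumptions, not the paper's actual model or the IEEE 80-2000 equations.

```python
import random

random.seed(0)

R_MAX = 2.0          # allowed grid resistance (ohm), assumed
RHO = 100.0          # soil resistivity factor, assumed

def resistance(length, rods):
    # Assumed toy model: more buried conductor and more rods lower resistance.
    return RHO / (length + 20.0 * rods)

def cost(length, rods):
    # Assumed unit costs: 1 per metre of conductor, 15 per rod.
    return 1.0 * length + 15.0 * rods

def fitness(ind):
    length, rods = ind
    c = cost(length, rods)
    if resistance(length, rods) > R_MAX:      # penalise unsafe designs
        c += 1e6
    return c

def crossover(a, b):
    # One-point crossover on the (length, rods) genome.
    return (a[0], b[1])

def mutate(ind):
    length, rods = ind
    return (max(10.0, length + random.uniform(-10, 10)),
            max(1, rods + random.choice([-1, 0, 1])))

pop = [(random.uniform(10, 200), random.randint(1, 10)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]                         # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = min(pop, key=fitness)
# best is a (conductor_length, rod_count) design meeting the resistance limit
# at low cost, mirroring the paper's cost-versus-safety trade-off.
```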

Keywords: genetic algorithm, optimum grounding grid design, power system analysis, power system protection, single layer model, substation

Procedia PDF Downloads 537
625 Zero Net Energy Communities and the Impacts to the Grid

Authors: Heidi von Korff

Abstract:

The electricity grid is changing in terms of flexibility. Distributed generation (DG) policy is being discussed and implemented worldwide, and developers and utilities are seeking a pathway towards Zero Net Energy (ZNE) communities and their interconnection to the distribution grid. This work uses the VISDOM platform to establish a method for managing and monitoring the energy consumption loads of ZNE communities as a capacity resource for the grid. Reductions in greenhouse gas emissions and energy security are primary policy drivers for incorporating high-performance energy standards and sustainability practices in residential households, such as a market transformation towards ZNE and nearly ZNE (nZNE) communities. This research investigates how load data impacts ZNE, examining whether there is a correlation with the daily load variations of a single ZNE home. Case studies include a ZNE community in California and a nearly ZNE, all-electric community in the Netherlands, both of which are in the measurement and verification (M&V) phase and connected to the grid for method simulations.

Keywords: zero net energy, distributed generation, renewable energy, zero net energy community

Procedia PDF Downloads 307
624 Confidence in Practice of Debate at Senior High School Student in Jakarta, Indonesia

Authors: Arista Mayang Sari Slamet

Abstract:

This study was conducted to observe the behaviors that indicate confidence in the practice of debate among science-program students in senior high school. It is a descriptive qualitative study that describes the forms of behavior corresponding to each of Santrock's ten indicators of confidence. Data were collected through interviews with Indonesian language teachers, direct observation, and documents. The study found one indicator that was not visible among the grade X students: the fourth item, 'sitting with others in social activities'. This is because the forum examined was a debating forum, in which such social activity cannot be observed. Two students, Dea and Audria (from the pro team), did not display confident behavior, which indicates that the head of the pro team dominated the debate. Since the time allotted for the debate was 45 minutes, not all students in either team could demonstrate their debating skills; each team was dominated by one student. The most common forms of confident behavior were expressing opinions, looking at the other speaker, and keeping eye contact, which indicates that looking at the interlocutor makes students more confident about their opinions. The least common indicator was directing or instructing the other person, which shows that the confidence displayed by the students was not of a leading kind.

Keywords: confidence, debate, senior high school, Jakarta

Procedia PDF Downloads 163
623 Boosting Crime Scene Investigations Capabilities through Crime Script Analysis

Authors: Benoit Leclerc

Abstract:

The concept of scripts and the role that crime scripts have played in criminology during the last decade are reviewed. Particularly illuminating is the potential use of scripts not only to understand and disrupt offender scripts (commonly referred to as crime scripts) but also to capture victim and guardian scripts to increase the likelihood of preventing crime. The concept of scripts is then applied to forensic science, another field that can benefit from script analysis. First, similar to guardian scripts, script analysis can illuminate the process of completing crime scene investigations for those who conduct them (crime scene investigators or other professionals involved), and as a result provide a range of intervention points to improve the success of these investigations. Second, script analysis can also provide valuable information on offenders' crime-commission processes for crime scene investigators and highlight a number of 'contact points' that could be targeted during investigations.

Keywords: crime scripts, crime scene investigation, script analysis, situational crime prevention

Procedia PDF Downloads 275
622 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices

Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner

Abstract:

Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool; however, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than signals extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device, i.e., a wearable device, such as a smartwatch, that is not used in a medical context. One of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this, we propose two approaches: a per-person classifier and a one-for-all classifier. The first builds a binary classifier that distinguishes one person from all others; the second builds a multi-class classifier that distinguishes a selected set of individuals from non-selected individuals. In preliminary results, the binary classifier obtained 90% accuracy on balanced data, and the multi-class approach reported a log loss of 0.05.
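A minimal sketch of the "per-person binary classifier" idea: one classifier that separates the enrolled person's ECG feature vectors from everyone else's. Real pipelines would extract features (e.g. R-R intervals, wave amplitudes) from raw ECG; here the synthetic 2-D feature vectors and the nearest-centroid decision rule are illustrative assumptions, not the paper's method.

```python
import random

random.seed(1)

def centroid(points):
    """Mean of a list of 2-D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def dist2(a, b):
    """Squared Euclidean distance between two 2-D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

# Synthetic enrollment data: the target person's features cluster near (1, 1),
# everyone else's near (-1, -1).
target = [(1 + random.gauss(0, 0.2), 1 + random.gauss(0, 0.2)) for _ in range(50)]
others = [(-1 + random.gauss(0, 0.2), -1 + random.gauss(0, 0.2)) for _ in range(50)]

c_target, c_others = centroid(target), centroid(others)

def is_target(x):
    """Binary decision: does feature vector x belong to the enrolled person?"""
    return dist2(x, c_target) < dist2(x, c_others)

# A new heartbeat near the enrolled cluster is accepted; an impostor's is rejected.
```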

Keywords: biometrics, electrocardiographic signals, machine learning, signal processing

Procedia PDF Downloads 142
621 Thermal Fracture Analysis of Fibrous Composites with Variable Fiber Spacing Using Jk-Integral

Authors: Farid Saeidi, Serkan Dag

Abstract:

In this study, fracture analysis of a fibrous composite laminate with variable fiber spacing is carried out using Jk-integral method. The laminate is assumed to be under thermal loading. Jk-integral is formulated by using the constitutive relations of plane orthotropic thermoelasticity. Developed domain independent form of the Jk-integral is then integrated into the general purpose finite element analysis software ANSYS. Numerical results are generated so as to assess the influence of variable fiber spacing on mode I and II stress intensity factors, energy release rate, and T-stress. For verification, some of the results are compared to those obtained using displacement correlation technique (DCT).

Keywords: Jk-integral, variable fiber spacing, thermoelasticity, T-stress, finite element method, fibrous composite

Procedia PDF Downloads 388
620 The Evolution of Amazon Alexa: From Voice Assistant to Smart Home Hub

Authors: Abrar Abuzaid, Maha Alaaeddine, Haya Alesayi

Abstract:

This project is centered on understanding the usage and impact of Alexa, Amazon's popular virtual assistant, in everyday life. Alexa, known for its integration into devices like the Amazon Echo, offers functionalities such as voice interaction, media control, providing real-time information, and managing smart home devices. Our primary focus is to conduct a straightforward survey aimed at uncovering how people use Alexa in their daily routines. We plan to reach out to a wide range of individuals to get a diverse perspective on how Alexa is being utilized for various tasks, the frequency and context of its use, and the overall user experience. The survey will explore the most common uses of Alexa, its impact on daily life, the features users find most beneficial, and the improvements they are looking for. This project is not just about collecting data but also about understanding the real-world applications of a technology like Alexa and how it fits into different lifestyles. By examining the responses, we aim to gain a practical understanding of Alexa's role in homes and possibly in workplaces. The project will provide insights into user satisfaction and areas where Alexa could be enhanced to meet the evolving needs of its users. It is a step towards connecting technology with everyday life, making it more accessible and user-friendly.

Keywords: Amazon Alexa, artificial intelligence, smart speaker, natural language processing

Procedia PDF Downloads 63
619 The 'Cornaro Family Tree' as a Tool for Identifying Cornaro Family Portraits

Authors: Rachel Healy

Abstract:

This paper builds on the speaker's recent identification of an early sixteenth-century painting in the National Gallery of Ireland as containing rare portraits of Giorgio Cornaro (brother of Caterina, Queen of Cyprus) and his son, Cardinal Francesco. Using an overlooked seventeenth-century painted Cornaro family tree from Palazzo Corner-Mocenigo as a tool for identifying sitters in disputed portraits of one of Renaissance Venice's wealthiest and most influential patrician families, it resolves similar long-standing confusion regarding the identities of sitters in related works by Titian, Raphael, and Bernini, such as the Cornaro Triple Portrait in the National Gallery of Art, Washington DC; Man with a Falcon in the Joslyn Art Museum, Omaha; Head of a Cardinal at Wilton House, Wiltshire; and the Cornaro Chapel in Santa Maria della Vittoria, Rome. In so doing, it casts new light on Titian's development as a portraitist and on the extent to which important paintings commissioned by the Cornaro survived the fires at two family palaces in Venice in the 1530s. It also showcases Raphael's associations with the Cornaro cardinal and presents new evidence relating to the likenesses Bernini fashioned for the Cornaro Chapel in 1647-52.

Keywords: Venice, portraits, Titian, genealogy, Bernini, family tree, Raphael, Venetian family, Cornaro, sixteenth-century Venice, portraiture

Procedia PDF Downloads 272
618 Seamless MATLAB® to Register-Transfer Level Design Methodology Using High-Level Synthesis

Authors: Petri Solanti, Russell Klein

Abstract:

Many designers are asking for an automated path from an abstract mathematical MATLAB model to a high-quality Register-Transfer Level (RTL) hardware description. Manual transformations of MATLAB or intermediate code are needed when the design abstraction changes. Design conversion is problematic because it is multidimensional: it requires many different design steps to translate the mathematical representation of the desired functionality into an efficient hardware description with the same behavior and configurability. Yet a manual model conversion is not an insurmountable task. Using currently available design tools and an appropriate design methodology, converting a MATLAB model to efficient hardware is a reasonable effort. This paper describes a simple and flexible design methodology that was developed together with several design teams.

Keywords: design methodology, high-level synthesis, MATLAB, verification

Procedia PDF Downloads 140
617 Optimizing SCADA/RTU Control System Alarms for Gas Wells

Authors: Mohammed Ali Faqeeh

Abstract:

A SCADA system alarms optimization process was recently introduced and applied in several stages. First, the MODBUS communication between the RTUs and SCADA was improved at the level of I/O point scanning intervals. Then, some technical issues related to manufacturing limitations were resolved. Afterward, the configured alarms database itself was addressed: meetings and workshops were held among all system stakeholders, resulting in an agreement to disable unnecessary (diagnostic) alarms. Moreover, the SCADA operator graphics were segregated so as to show only process-related alarms, while other graphics ensure the availability of field alarms for maintenance and engineering purposes. This overall system management and optimization has had a strongly positive impact on operations, maintenance, and engineering. It has reduced unneeded open tickets for maintenance crews, which in turn reduced the mileage driven. The practice has also improved operational reaction and response to emergency situations, as SCADA operators can stay vigilant to real alarms rather than being distracted by noisy ones. The alarms optimization process was executed using applicable in-house resources from the engineering, maintenance, and operations crews, with the enhancements performed in stages.

Keywords: SCADA, RTU Communication, alarm management system, SCADA alarms, Modbus, DNP protocol

Procedia PDF Downloads 166
616 A Photovoltaic Micro-Storage System for Residential Applications

Authors: Alia Al Nuaimi, Ayesha Al Aberi, Faiza Al Marzouqi, Shaikha Salem Ali Al Yahyaee, Ala Hussein

Abstract:

In this paper, a PV micro-storage system for residential applications is proposed. The term 'micro' refers to the size of the PV storage system, which is in the range of a few kilowatts, compared to the grid scale (~GW). In a typical load profile of a residential unit, two peak demand periods exist: one in the morning and the other in the evening. The morning peak can be partly covered directly by PV energy, while the evening peak cannot be covered by the PV alone. Therefore, an energy storage system that stores solar energy during the daytime and delivers this stored energy when the sun is absent is a must. A complete design procedure, including theoretical analysis followed by simulation verification and an economic feasibility evaluation, is presented in this paper.
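The store-during-the-day, discharge-in-the-evening logic described above can be sketched as a naive hourly battery dispatch. The load and PV profiles, battery capacity, and greedy charging rule below are illustrative assumptions, not the paper's design procedure.

```python
CAPACITY = 10.0   # battery capacity in kWh (assumed)

# Hourly load (kW) with morning and evening peaks, and PV output (kW)
# peaking at midday (both synthetic 24-hour profiles).
load = [0.5]*6 + [2.0, 2.5, 1.5] + [0.8]*8 + [2.5, 3.0, 2.5, 1.5] + [0.7]*3
pv   = [0.0]*7 + [0.5, 1.5, 2.5, 3.0, 3.2, 3.0, 2.5, 1.5, 0.5] + [0.0]*8

soc = 0.0          # battery state of charge (kWh)
grid = []          # power drawn from the grid each hour (kW)
for l, p in zip(load, pv):
    surplus = p - l
    if surplus > 0:                       # daytime: charge from excess PV
        charge = min(surplus, CAPACITY - soc)
        soc += charge
        grid.append(0.0)
    else:                                 # deficit: discharge first, then grid
        discharge = min(-surplus, soc)
        soc -= discharge
        grid.append(-surplus - discharge)

# The evening peak is served mostly from stored solar energy, so the peak
# power drawn from the grid is far below the peak load.
```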

Keywords: battery, energy storage, photovoltaic, peak shaving, smart grid

Procedia PDF Downloads 321
615 Using T-Splines to Model Point Clouds from Terrestrial Laser Scanner

Authors: G. Kermarrec, J. Hartmann

Abstract:

Spline surfaces are a major representation of freeform surfaces in the computer-aided graphics industry and were recently introduced in the field of geodesy for processing point clouds from terrestrial laser scanners (TLS). Surface fitting consists of approximating a trustworthy mathematical surface to a large 3D point cloud. Standard B-spline surfaces lack local refinement due to their tensor-product construction; the consequence is oscillating geometry, particularly in transitions from low- to high-curvature parts, for scattered point clouds with missing data. The recently introduced T-splines are a more economic alternative in terms of the number of parameters for handling point clouds with a huge number of observations: as long as the partition of unity is guaranteed, their computational complexity is low and they are flexible. T-splines are implemented in the commercial package Rhino, a 3D modeler widely used in computer-aided design to create and animate NURBS objects. We have applied T-spline surface fitting to terrestrial laser scanner point clouds from a bridge under load and from a sheet pile wall with noisy observations. We highlight their potential for modelling details with high trustworthiness, paving the way for further applications in deformation analysis.
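The least-squares surface-fitting idea above, reduced to a minimal 1-D sketch: fit a degree-1 B-spline (piecewise-linear "hat" basis) to noisy samples of a curve. T-splines generalize this with locally refinable 2-D bases; the knot layout, noise level, and test function below are illustrative assumptions, not the authors' bridge or sheet-pile data.

```python
import numpy as np

rng = np.random.default_rng(0)

knots = np.linspace(0.0, 1.0, 11)   # uniform knot vector (assumed)
h = knots[1] - knots[0]

def basis(u):
    """Degree-1 B-spline (hat) basis matrix, shape (len(u), len(knots))."""
    return np.maximum(0.0, 1.0 - np.abs(u[:, None] - knots[None, :]) / h)

# Scattered "scan" of a smooth curve with additive noise.
u = rng.uniform(0.0, 1.0, 400)
y = np.sin(2 * np.pi * u) + rng.normal(0.0, 0.05, u.size)

B = basis(u)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # least-squares control values

fit = basis(u) @ coef
rmse = float(np.sqrt(np.mean((fit - y) ** 2)))
# rmse stays close to the noise level: the spline recovers the smooth curve.
```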

Keywords: deformation analysis, surface modelling, terrestrial laser scanner, T-splines

Procedia PDF Downloads 141
614 Failure Criterion for Mixed Mode Fracture of Cracked Wood Specimens

Authors: Mahdi Fakoor, Seyed Mohammad Navid Ghoreishi

Abstract:

Investigation of the fracture of wood components can prevent catastrophic failures. The fracture process zone (FPZ) created in the vicinity of the crack tip has an important effect on the failure of cracked composite materials. In this paper, a failure criterion for the fracture investigation of cracked wood specimens under mixed mode I/II loading is presented. The criterion is based on the maximum strain energy release rate and on material nonlinearity in the vicinity of the crack tip due to the presence of microcracks. Verification against available experimental data shows that the proposed criterion agrees with the nature of wood fracture. To simplify the estimation of the nonlinear properties of the FPZ, a damage factor is also introduced for engineering and application purposes.

Keywords: fracture criterion, mixed mode loading, damage zone, micro cracks

Procedia PDF Downloads 299
613 Mathematical Model of Cancer Growth under the Influence of Radiation Therapy

Authors: Beata Jackowska-Zduniak

Abstract:

We formulate and analyze a mathematical model describing the dynamics of cancer growth under the influence of radiation therapy. The effect of this type of therapy is considered as an additional equation of the model. Numerical simulations show that the delay, which is added to the ordinary differential equations and represents the time needed for transformation from one type of cell to the other, affects the behavior of the system. The validation and verification of the proposed model are based on medical data. Analytical results are illustrated by numerical examples of the model dynamics. The model is able to reconstruct the dynamics of cancer treatment and may be used to determine the most effective treatment regimen based on the study of the behavior of individual treatment protocols.
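A minimal numerical sketch of the kind of model discussed above: logistic tumour growth with an extra radiation-therapy loss term, integrated with Euler's method. The growth and kill rates and the therapy schedule are illustrative assumptions, and the paper's delay term (the transformation time between cell types) is omitted here for brevity.

```python
def simulate(radiation_on, r=0.3, K=1.0, kill=0.6, n0=0.1, dt=0.01, t_end=50.0):
    """Integrate dN/dt = r*N*(1 - N/K) - kill*u(t)*N with forward Euler."""
    n, t = n0, 0.0
    while t < t_end:
        u = 1.0 if (radiation_on and t > 10.0) else 0.0  # therapy starts at t=10
        n += dt * (r * n * (1.0 - n / K) - kill * u * n)
        t += dt
    return n

untreated = simulate(radiation_on=False)   # grows towards the capacity K
treated = simulate(radiation_on=True)      # decays once therapy starts
```

Comparing the two runs reproduces the qualitative behavior the model is meant to capture: without therapy the tumour saturates at its carrying capacity, while a sufficiently strong therapy term drives it towards zero.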

Keywords: mathematical modeling, numerical simulation, ordinary differential equations, radiation therapy

Procedia PDF Downloads 409
612 Deep Reinforcement Learning Model Using Parameterised Quantum Circuits

Authors: Lokes Parvatha Kumaran S., Sakthi Jay Mahenthar C., Sathyaprakash P., Jayakumar V., Shobanadevi A.

Abstract:

With the evolution of technology, the need to solve complex computational problems like machine learning and deep learning has shot up, but even the most powerful classical supercomputers find it difficult to execute these tasks. With the recent development of quantum computing, researchers and tech giants strive for new quantum circuits for machine learning tasks, as present work on Quantum Machine Learning (QML) promises lower memory consumption and fewer model parameters. However, it is difficult to simulate classical deep learning models on existing quantum computing platforms due to the inflexibility of deep quantum circuits. It is therefore essential to design viable quantum algorithms for QML on noisy intermediate-scale quantum (NISQ) devices. The proposed work explores Variational Quantum Circuits (VQC) for deep reinforcement learning by remodeling the experience replay and target network into a VQC representation. In addition, to reduce the number of model parameters, quantum information encoding schemes are used to achieve better results than classical neural networks. VQCs are employed to approximate the deep Q-value function for decision-making and policy-selection reinforcement learning with experience replay and a target network.
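A minimal sketch of the variational-quantum-circuit idea: a single-qubit circuit RY(theta) acting on |0>, whose measured observable <Z> = cos(theta) plays the role of a trainable output (in the paper, an approximated Q-value). Training uses the parameter-shift rule for the gradient. The one-qubit circuit, target value, and learning rate are illustrative assumptions, far smaller than the circuits the paper describes.

```python
import math

def expectation(theta):
    """<Z> after RY(theta)|0>: the statevector is [cos(t/2), sin(t/2)]."""
    a, b = math.cos(theta / 2), math.sin(theta / 2)
    return a * a - b * b          # equals cos(theta)

def gradient(theta):
    """Parameter-shift rule: exact gradient from two circuit evaluations."""
    s = math.pi / 2
    return (expectation(theta + s) - expectation(theta - s)) / 2

# Train theta so the circuit outputs a target "Q-value" of -1.
theta, lr = 0.5, 0.4
for _ in range(200):
    loss_grad = 2 * (expectation(theta) - (-1.0)) * gradient(theta)
    theta -= lr * loss_grad
# theta converges towards pi, where <Z> reaches the target -1.
```

The same loop structure, with many more qubits and parameters, is what training a VQC-based Q-function amounts to on NISQ hardware.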

Keywords: quantum computing, quantum machine learning, variational quantum circuit, deep reinforcement learning, quantum information encoding scheme

Procedia PDF Downloads 135
611 A Game-Theory-Based Price-Optimization Algorithm for the Simulation of Markets Using Agent-Based Modelling

Authors: Juan Manuel Sanchez-Cartas, Gonzalo Leon

Abstract:

A price competition algorithm for agent-based models (ABMs), based on game theory principles, is proposed for the simulation of theoretical market models. The algorithm is applied to the classical Hotelling model and to a two-sided market model, showing that it leads to the optimal behavior predicted by the theoretical models. Moreover, when theoretical models fail to predict the equilibrium, the algorithm is still capable of reaching a feasible outcome. The results highlight that the algorithm can be implemented in other simulation models to guarantee rational users and endogenous optimal behaviors, and that, being theoretically grounded, it can also be applied as a verification tool.
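A minimal sketch of the best-response idea behind such an algorithm, applied to the classical Hotelling linear-city model with marginal cost c and transport cost t: each firm repeatedly plays the profit-maximizing price against its rival's last price, p_i = (p_j + c + t) / 2, which converges to the known equilibrium p* = c + t. The parameter values are illustrative, and this is a textbook simplification rather than the authors' full ABM algorithm.

```python
def best_response(p_rival, c, t):
    """Profit-maximizing price against the rival's current price
    (from demand x = 1/2 + (p_rival - p)/(2t) and profit (p - c)*x)."""
    return (p_rival + c + t) / 2

c, t = 1.0, 0.5            # marginal cost and transport cost (assumed)
p1, p2 = 5.0, 0.1          # arbitrary starting prices
for _ in range(100):       # simultaneous best-response dynamics
    p1, p2 = best_response(p2, c, t), best_response(p1, c, t)

# Both prices converge to the textbook equilibrium p* = c + t = 1.5.
```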

Keywords: agent-based models, algorithmic game theory, multi-sided markets, price optimization

Procedia PDF Downloads 458
610 Enhancing Patch Time Series Transformer with Wavelet Transform for Improved Stock Prediction

Authors: Cheng-yu Hsieh, Bo Zhang, Ahmed Hambaba

Abstract:

Stock market prediction has long been an area of interest for both expert analysts and investors, driven by its complexity and the noisy, volatile conditions it operates under. This research examines the efficacy of combining the Patch Time Series Transformer (PatchTST) with wavelet transforms, specifically focusing on Haar and Daubechies wavelets, in forecasting the adjusted closing price of the S&P 500 index for the following day. By comparing the performance of the augmented PatchTST models with traditional predictive models such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers, this study highlights significant enhancements in prediction accuracy. The integration of the Daubechies wavelet with PatchTST notably excels, surpassing other configurations and conventional models in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE). The success of the PatchTST model paired with Daubechies wavelet is attributed to its superior capability in extracting detailed signal information and eliminating irrelevant noise, thus proving to be an effective approach for financial time series forecasting.
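A minimal sketch of the single-level Haar discrete wavelet transform, the simpler of the two wavelets mentioned above: it splits a price series into a smooth approximation and a detail (noise-carrying) part, and reconstructs exactly. The series below is synthetic; real use would feed adjusted closing prices into a transform like this before the PatchTST model.

```python
import math

def haar_decompose(x):
    """One level of the Haar DWT (len(x) must be even)."""
    s = math.sqrt(2)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse of haar_decompose."""
    s = math.sqrt(2)
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) / s, (a - d) / s]
    return x

prices = [100.0, 101.0, 99.5, 102.0, 103.5, 103.0, 104.0, 105.5]
approx, detail = haar_decompose(prices)
restored = haar_reconstruct(approx, detail)
# restored equals prices up to floating-point error; zeroing small detail
# coefficients before reconstruction is a common denoising step.
```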

Keywords: deep learning, financial forecasting, stock market prediction, patch time series transformer, wavelet transform

Procedia PDF Downloads 55
609 Identification and Quantification of Lisinopril from Pure, Formulated and Urine Samples by Micellar Thin Layer Chromatography

Authors: Sudhanshu Sharma

Abstract:

Lisinopril, N2-[(S)-1-carboxy-3-phenylpropyl]-L-lysyl-L-proline dihydrate, is a lysine analog of enalaprilat, the active metabolite of enalapril. It is a long-acting, non-sulfhydryl angiotensin-converting enzyme (ACE) inhibitor used for the treatment of hypertension and congestive heart failure at a daily dosage of 10-80 mg. The pharmacological activity of lisinopril has been demonstrated in various experimental and clinical studies. Owing to its importance and widespread use, efforts have been made towards the development of simple and reliable analytical methods. According to our literature survey, lisinopril in pharmaceutical formulations has been determined by various analytical methodologies, including polarography, potentiometry, and spectrophotometry, but most of these methods are not well suited to the identification of lisinopril in clinical samples because of interference from the amino acids and amino-group-containing metabolites present in biological samples. This report attempts to develop a simple and reliable method for the on-plate identification and quantification of lisinopril in pharmaceutical formulations as well as in human urine samples, using silica gel H layers developed with a new mobile phase comprising micellar solutions of N-cetyl-N,N,N-trimethylammonium bromide (CTAB). Micellar solutions have found numerous practical applications in many areas of separation science. Micellar liquid chromatography (MLC) has gained immense popularity and wide applicability due to its operational simplicity, cost effectiveness, relative non-toxicity, low aggressiveness, and enhanced separation efficiency. The incorporation of aqueous micellar solutions as mobile phases was pioneered by Armstrong and Terrill, who accentuated the importance of TLC where the simultaneous separation of ionic or non-ionic species in a variety of matrices is required.
A peculiarity of micellar mobile phases (MMPs) is that they have no macroscopic analogues; as a result, typical separations can be achieved more easily with MMPs than with aqueous-organic mobile phases. MMPs have previously been employed successfully in critical TLC separations of aromatic hydrocarbons, nucleotides, vitamins K1 and K5, o-, m-, and p-aminophenol, amino acids, and penicillins. Human urine analysis for the identification of selected drugs and their metabolites has emerged as an important investigative tool in forensic drug analysis. Among the available chromatographic methods, only thin layer chromatography (TLC) enables a simple, fast, and effective separation of the complex mixtures present in various biological samples, and it is recommended as an approved test for forensic drug analysis by federal law. TLC has proved its applicability in the successful separation of bioactive amines, carbohydrates, enzymes, porphyrins and their precursors, alkaloids, and drugs from urine samples.

Keywords: lisinopril, surfactant, chromatography, micellar solutions

Procedia PDF Downloads 367
608 The Impact on the Network Deflectometry

Authors: Djamel–Eddine Yassine Boutiba

Abstract:

In this paper, we present the various deflectometer measurements leading to the dimensioning, through strengthening, of existing roadways. The road network in Algeria plays a major role in traffic flow in major strategic areas, especially in the northern fringe of the country. Heavy traffic passing through this northern fringe (between 25% and 30% heavy vehicles) causes substantial degradation of both the surface layer and the base layer. On-site work with the CTTP laboratory's equipment, such as the Lacroix deflectograph, allowed us to record a large number of localized deflection measurements on RN19A (Carrefour CW73-Ain-Merane), and the analysis of the results led us to opt for strengthening along the entire length of the project. The HWD (Heavy Weight Deflectometer) recordings, in turn, allowed us to study the behavior of the pavement on the banks. In addition, the Alizé III software was essential in verifying the dimensioned thickness increase.

Keywords: capacity, deflection, Lacroix deflectograph, degradation, HWD

Procedia PDF Downloads 285
607 Production of Biodiesel Using Brine Waste as a Heterogeneous Catalyst

Authors: Hilary Rutto, Linda Sibali

Abstract:

In these modern times, we constantly search for new and innovative technologies to lift the burden of our extreme energy demand. The overall purpose of biofuel production research is to find an alternative energy source to replace the normal use of fossil fuels as liquid petroleum products. This experiment looks at the basis of biodiesel production with regard to alternative catalysts. The key factors addressed during the experiments are temperature variation, catalyst addition to the overall reaction, the methanol-to-oil ratio, and the impact of agitation on the reaction. Brine samples sourced from nearby plants were evaluated and tested thoroughly, and their key characteristics were analysed to verify their use as a possible catalyst in biodiesel production. A one-factor-at-a-time experimental approach was used, and the recycle and reuse characteristics of the heterogeneous catalyst were evaluated.

Keywords: brine sludge, heterogeneous catalyst, biodiesel, one factor

Procedia PDF Downloads 172
606 Teaching Creative Thinking and Writing to Simultaneous Bilinguals: A Longitudinal Study of 6-7 Years Old English and Punjabi Language Learners

Authors: Hafiz Muhammad Fazalehaq

Abstract:

This paper documents the results of a longitudinal study of two bilingual children who speak English and Punjabi simultaneously. Their father is a native English speaker, whereas their mother speaks Punjabi; the mother can speak both languages, while the father speaks only English. At the age of six, these children had difficulty with creative thinking and, consequently, with creative writing. The first task for the researcher was therefore to entice the children to think creatively. Different methodologies and techniques were used to prompt creative thinking, which in turn leads to creative writing. The children were first exposed to numerous sources, including videos, photographs, texts, and audio recordings, to give them a taste of creative genres (stories in this case). They were encouraged to create their own stories, sometimes with photographs and sometimes with their favorite toys. At a second stage, they were asked to write about an event or incident. After that, they were motivated to create new stories and write them down. The length of their creative writing varied from a few sentences to two standard pages. After this six-month study, the researcher was able to develop a ten-step methodology for developing and enhancing the creative thinking and creative writing skills of the subjects under study. This ten-step methodology entices and motivates the learner to think creatively in order to produce a creative piece.

Keywords: bilinguals, creative thinking, creative writing, simultaneous bilingual

Procedia PDF Downloads 352
605 On the Interface of the Phonemes and the Orthography of Kānà

Authors: Akat Sordum Owen

Abstract:

This paper focuses on the interface between the phonemes and the orthography of Kānà, an endangered language spoken in the Khānà and Tàì Local Government Areas of Rivers State, Nigeria. Kānà is one of the four languages (the others being Gòkānà, Bāān Ògóì and Ẹ́lẹ́mẹ́) of Ogonoid (i.e. the Ogoni group of languages), located in the Cross River branch of the Benue-Congo phylum. A good number of scholars, including Ikoro (1996) and Vobnu (2001), agree on the phoneme inventory of the language but differ on the choice of letters for the orthography. Whereas many scholars of the language accept that it is well served by a Latin (English) alphabetic orthography with a close phoneme-grapheme relation, other scholars hold that the complex consonants in the phonemic chart should be treated as consonant clusters in the alphabet. This paper argues that consonant clusters occur at the syntactic (and morphological) levels with regard to certain items in order to produce the desired pronunciations and spellings. Each consonant in a cluster retains its identity and can be used with other letters to produce a different word. The data were obtained from scholarly writings on the language, from interviews, and from the author's intuition as a native speaker. It is hoped that this study will trigger further research into the orthography of Kānà and of other tonal languages with similar features, such as Igbo and Yoruba, in order to reanalyze the number of letters in the alphabets of those languages.

Keywords: Kānà, phonemes, orthography, letters

Procedia PDF Downloads 18
604 Numerical Simulation and Experimental Validation of the Tire-Road Separation in a Quarter-Car Model

Authors: Quy Dang Nguyen, Reza Nakhaie Jazar

Abstract:

The paper investigates the vibration dynamics of tire-road separation for a quarter-car model; the model is developed to be close to the real situation, in which the tire is able to separate from the ground plane. A set of piecewise linear mathematical models matching the in-contact and no-contact states is developed to serve as mother models for further investigations. The bounce dynamics are numerically simulated in the time domain and in phase portraits. The separation analysis can determine which values of the suspension parameters delay or avoid the no-contact phenomenon, improving ride comfort and eliminating potentially dangerous oscillations. Finally, model verification is carried out in the MSC ADAMS environment.
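The piecewise structure described above can be sketched with a single switch on the tire force: the tire spring can only push, so a non-positive computed compression selects the no-contact branch. The sketch below uses illustrative parameter values and a plain explicit-Euler integrator, neither of which is taken from the paper:

```python
import math

# Quarter-car parameters (illustrative values, not from the paper)
ms, mu = 300.0, 40.0       # sprung / unsprung masses [kg]
ks, cs = 20000.0, 1500.0   # suspension stiffness [N/m] and damping [N s/m]
kt = 200000.0              # tire stiffness [N/m]
g = 9.81

def tire_force(compression):
    """Piecewise tire law: the tire spring only pushes.
    compression <= 0 is the no-contact (separation) branch."""
    return kt * compression if compression > 0.0 else 0.0

def simulate(road, t_end=2.0, dt=1e-4):
    """Explicit-Euler integration of the piecewise quarter-car model,
    starting from the static equilibrium configuration."""
    xu = -(ms + mu) * g / kt          # static tire deflection
    xs = xu - ms * g / ks             # plus static suspension deflection
    vs = vu = 0.0
    t, separated = 0.0, False
    while t < t_end:
        comp = road(t) - xu           # total tire compression
        ft = tire_force(comp)
        if comp <= 0.0:
            separated = True          # tire has left the ground
        fs = ks * (xu - xs) + cs * (vu - vs)
        a_s = fs / ms - g
        a_u = (ft - fs) / mu - g
        xs, vs = xs + vs * dt, vs + a_s * dt
        xu, vu = xu + vu * dt, vu + a_u * dt
        t += dt
    return (xs, vs, xu, vu), separated

# flat road: the model should simply sit at equilibrium (in-contact branch)
state, separated = simulate(lambda t: 0.0, t_end=0.5)
```

Replacing the flat road with a sufficiently sharp bump excites the no-contact branch, which is the regime the separation analysis targets.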

Keywords: quarter-car vibrations, tire-road separation, separation analysis, separation dynamics, ride comfort, ADAMS validation

Procedia PDF Downloads 93
603 A Data-Mining Model for Protection of FACTS-Based Transmission Line

Authors: Ashok Kalagura

Abstract:

This paper presents a data-mining model for fault-zone identification in flexible AC transmission system (FACTS)-based transmission lines that include a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensembles of decision trees. The randomness in the ensemble of decision trees stacked inside the random-forest model yields an effective decision on fault-zone identification. Half-cycle post-fault current and voltage samples from fault inception form the input vector, with target output ‘1’ for faults after the TCSC/UPFC and ‘0’ for faults before the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, and provides a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.
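The classification scheme described above can be illustrated with a toy random-forest-style ensemble: bootstrap resampling plus one-level decision stumps on randomly chosen features, voting on the fault-zone label. Everything below is a hedged stand-in for the paper's model, with synthetic feature vectors in place of the half-cycle current/voltage samples:

```python
import random

def train_stump(X, y, feat):
    """Best threshold split on one feature (minimizing misclassifications)."""
    best = None
    for thr in sorted({row[feat] for row in X}):
        for left in (0, 1):
            err = sum((left if row[feat] <= thr else 1 - left) != t
                      for row, t in zip(X, y))
            if best is None or err < best[0]:
                best = (err, thr, left)
    return (feat, best[1], best[2])

def train_forest(X, y, n_trees=25, seed=0):
    """Toy random-forest-style ensemble: each 'tree' is a stump trained
    on a bootstrap sample and one randomly chosen feature."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]     # bootstrap sample
        Xb = [X[i] for i in idx]
        yb = [y[i] for i in idx]
        forest.append(train_stump(Xb, yb, rng.randrange(d)))
    return forest

def predict(forest, row):
    """Majority vote: 1 = fault after the compensator, 0 = fault before it."""
    votes = sum(left if row[f] <= thr else 1 - left for f, thr, left in forest)
    return 1 if 2 * votes >= len(forest) else 0

# synthetic "post-fault samples": faults after the device give larger features
rng = random.Random(1)
def fault_record(zone):
    base = 2.0 if zone == 1 else 0.5
    return [base + rng.gauss(0.0, 0.2) for _ in range(4)]

labels = [z for z in (0, 1) * 40]
data = [fault_record(z) for z in labels]
forest = train_forest(data, labels)
accuracy = sum(predict(forest, x) == t
               for x, t in zip(data, labels)) / len(labels)
```

A production model would use full decision trees over the actual half-cycle waveform samples, but the bootstrap-plus-vote structure is the same.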

Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC

Procedia PDF Downloads 424
602 A Kernel-Based Method for MicroRNA Precursor Identification

Authors: Bin Liu

Abstract:

MicroRNAs (miRNAs) are small non-coding RNA molecules that function in the transcriptional and post-transcriptional regulation of gene expression. Discriminating real pre-miRNAs from false ones (such as hairpin sequences with similar stem-loops) is necessary for understanding the role of miRNAs in the control of cell life and death. Because of their small size and sequence specificity, this discrimination cannot be based on sequence information alone; structure information about the miRNA precursor is required for satisfactory performance. K-mers are convenient and widely used features for modeling the properties of miRNAs and other biological sequences. However, k-mers suffer from an inherent limitation: if the parameter k is increased to incorporate long-range effects, certain k-mers appear rarely or not at all, so most k-mers are absent and a few appear only once. Statistical learning approaches using k-mers as features therefore become susceptible to noisy data once k becomes large. In this study, we propose a gapped k-mer approach to overcome these disadvantages and apply it to miRNA prediction. Combined with the structure status composition, a classifier called imiRNA-GSSC is proposed; it compares favorably with the original imiRNA-kmer and alternative approaches. Trained on human miRNA precursors, this predictor achieves an accuracy of 82.34% in predicting 4022 precursors from eleven species.
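The gapped k-mer idea is to slide a window of length k + g over the sequence and mask every choice of g positions as wildcards, so long-range pairings are counted without the sparsity of long contiguous k-mers. A minimal feature-extraction sketch (the toy input string is illustrative, not a real precursor, and this is not the authors' exact feature set):

```python
from itertools import combinations

def gapped_kmer_counts(seq, k=2, gaps=1):
    """Count gapped k-mers: for each window of length k + gaps, record
    the pattern with every choice of `gaps` positions masked as '_'.
    Many distinct (k+gaps)-mers collapse onto one gapped pattern, which
    keeps the feature vector dense even for larger spans."""
    span = k + gaps
    counts = {}
    for i in range(len(seq) - span + 1):
        window = seq[i:i + span]
        for masked in combinations(range(span), gaps):
            key = "".join("_" if j in masked else window[j]
                          for j in range(span))
            counts[key] = counts.get(key, 0) + 1
    return counts

# toy RNA fragment: 6 windows of span 3, 3 wildcard placements each
counts = gapped_kmer_counts("ACGUACGU", k=2, gaps=1)
```

These counts would then be concatenated with the structure status composition and fed to a support vector machine, per the abstract.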

Keywords: gapped k-mer, imiRNA-GSSC, microRNA precursor, support vector machine

Procedia PDF Downloads 163
601 An Axisymmetric Finite Element Method for Compressible Swirling Flow

Authors: Raphael Zanella, Todd A. Oliver, Karl W. Schulz

Abstract:

This work deals with the finite element approximation of axisymmetric compressible flows with swirl velocity. We are interested in problems where the flow, while weakly dependent on the azimuthal coordinate, may have a strong azimuthal velocity component. We describe the approximation of the compressible Navier-Stokes equations with H1-conformal spaces of axisymmetric functions. The weak formulation is implemented in a C++ solver with explicit time marching. The code is first verified with a convergence test on a manufactured solution. The verification is completed by comparing the numerical and analytical solutions in a Poiseuille flow case and a Taylor-Couette flow case. The code is finally applied to the problem of a swirling subsonic air flow in a plasma torch geometry.
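The convergence test on a manufactured solution mentioned above is conventionally reported as an observed order of accuracy, computed from the errors on two successively refined meshes and compared against the formal order of the discretization. A minimal sketch of that check, with illustrative error values:

```python
import math

def observed_order(e_coarse, e_fine, r=2.0):
    """Observed order of accuracy from discretization errors on two
    meshes whose spacing differs by refinement ratio r (r=2 for halving):
    if e ~ C*h^p, then p = log(e_coarse/e_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(r)

# errors behaving like C*h^2 on meshes with spacing h and h/2 (illustrative)
p = observed_order(4.0e-3, 1.0e-3)
```

For the H1-conformal elements described here one would expect the observed order to approach the formal order as the mesh is refined.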

Keywords: axisymmetric problem, compressible Navier-Stokes equations, continuous finite elements, swirling flow

Procedia PDF Downloads 176
600 Smartphone Video Source Identification Based on Sensor Pattern Noise

Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

An increasing number of mobile devices with integrated cameras means that most digital video now comes from these devices. These digital videos can be made anytime, anywhere, and for different purposes; they can also be shared on the Internet within a short period of time and may sometimes contain recordings of illegal acts. When such videos are used for forensic purposes, the need to reliably trace their origin becomes evident. This work proposes an algorithm to identify the brand and model of the mobile device that generated a video. Its procedure is as follows: after the relevant video information is obtained, a classification algorithm based on sensor pattern noise and the wavelet transform performs the identification. We also present experimental results that support the validity of the techniques used and show promising results.
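The sensor-pattern-noise pipeline can be sketched in three steps: extract a noise residual from each frame (frame minus its denoised version), average the residuals into a device fingerprint, and match a query video's residual to candidate fingerprints by normalized correlation. The sketch below substitutes a 3x3 box blur for the wavelet denoiser of the paper and uses tiny synthetic "frames"; it illustrates the structure of the method, not the authors' implementation:

```python
import random

def box_blur(img):
    """3x3 mean filter (a stand-in for the wavelet denoiser)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def noise_residual(img):
    """Residual = frame - denoised frame: approximates the sensor noise."""
    blurred = box_blur(img)
    return [[p - b for p, b in zip(rp, rb)]
            for rp, rb in zip(img, blurred)]

def fingerprint(frames):
    """Average residuals over frames: scene content averages out,
    the fixed per-pixel sensor pattern noise survives."""
    h, w = len(frames[0]), len(frames[0][0])
    acc = [[0.0] * w for _ in range(h)]
    for f in frames:
        r = noise_residual(f)
        for y in range(h):
            for x in range(w):
                acc[y][x] += r[y][x] / len(frames)
    return acc

def correlation(a, b):
    """Normalized correlation used to match a residual to a fingerprint."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    ma, mb = sum(fa) / len(fa), sum(fb) / len(fb)
    num = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    den = (sum((x - ma) ** 2 for x in fa)
           * sum((y - mb) ** 2 for y in fb)) ** 0.5
    return num / den if den else 0.0

# synthetic check: frames from "device A" all carry the same noise pattern
rng = random.Random(0)
h, w = 8, 8
pat_a = [[rng.gauss(0, 1.0) for _ in range(w)] for _ in range(h)]
pat_b = [[rng.gauss(0, 1.0) for _ in range(w)] for _ in range(h)]
frames_a = [[[10.0 * k + pat_a[y][x] for x in range(w)] for y in range(h)]
            for k in range(4)]
fp_a = fingerprint(frames_a)
query_a = [[5.0 + pat_a[y][x] for x in range(w)] for y in range(h)]
query_b = [[5.0 + pat_b[y][x] for x in range(w)] for y in range(h)]
match_a = correlation(fp_a, noise_residual(query_a))
match_b = correlation(fp_a, noise_residual(query_b))
```

A query carrying device A's pattern correlates far more strongly with A's fingerprint than a query from a different device, which is the decision rule behind PRNU-based source identification.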

Keywords: digital video, forensic analysis, key frame, mobile device, PRNU, sensor noise, source identification

Procedia PDF Downloads 429
599 The Mirage of Progress? A Longitudinal Study of Japanese Students’ L2 Oral Grammar

Authors: Robert Long, Hiroaki Watanabe

Abstract:

This longitudinal study examines the grammatical errors in Japanese university students’ dialogues with a native speaker over an academic year. The L2 interactions of 15 Japanese speakers were taken from the JUSFC2018 corpus (April/May 2018) and the JUSFC2019 corpus (January/February 2019). The corpora are based on a self-introduction monologue and a three-question dialogue; this study examines the grammatical accuracy found in the dialogues. The research questions asked whether there was a significant difference in grammatical accuracy between the first interview session in 2018 and the second one the following year, specifically regarding errors in clauses per 100 words, global and local errors, and errors related to parts of speech. The investigation also asked which forms showed the least improvement or had worsened. Descriptive statistics showed that error-free clauses per 100 words decreased slightly, while clauses with errors per 100 words increased by one clause. Global errors showed a significant decline, while local errors increased from 97 to 158. For errors related to parts of speech, a t-test confirmed a significant difference between the two speech corpora, with errors occurring more frequently in the 2019 corpus. These data highlight the difficulty students have in editing their own output.
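Accuracy measures of the kind reported above are typically normalized per 100 words so that sessions of different lengths are comparable. A minimal sketch of two such measures, on illustrative clause-level counts that are not data from the corpora:

```python
def accuracy_metrics(clauses):
    """clauses: list of (word_count, error_count) pairs, one per clause.
    Returns error-free clauses per 100 words and errors per 100 words,
    two of the length-normalized accuracy measures used in L2 research."""
    total_words = sum(w for w, _ in clauses)
    scale = 100.0 / total_words
    error_free = sum(1 for _, e in clauses if e == 0)
    total_errors = sum(e for _, e in clauses)
    return error_free * scale, total_errors * scale

# illustrative clause-level counts for one short dialogue transcript
sample = [(12, 0), (8, 2), (10, 0), (9, 1)]
efc_per_100, errors_per_100 = accuracy_metrics(sample)
```

Comparing these figures across the 2018 and 2019 sessions is what the descriptive statistics in the abstract summarize.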

Keywords: clause analysis, global vs. local errors, grammatical accuracy, L2 output, longitudinal study

Procedia PDF Downloads 133