Search results for: decoding sequential search algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5565

3465 Remote Assessment and Change Detection of Green LAI of Cotton Crop Using Different Vegetation Indices

Authors: Ganesh B. Shinde, Vijaya B. Musande

Abstract:

Timely identification of cotton crop has significant implications for food security, the economy, and the environment. Accurate detection of cotton-growing regions from remote sensing imagery using supervised learning is therefore a challenging and important problem. Classifiers applied directly to the raw image play a major role, but their results are not satisfactory. To improve effectiveness, a variety of vegetation indices have been proposed in the literature; the remaining challenge is to identify which index best supports cotton crop identification. Accordingly, fuzzy c-means clustering is combined with a neural network trained by the Levenberg-Marquardt algorithm for cotton crop classification. To evaluate the proposed method, five LISS-III satellite images were taken, and experiments were conducted with six vegetation indices: Simple Ratio, Normalized Difference Vegetation Index, Enhanced Vegetation Index, Green Atmospherically Resistant Vegetation Index, Wide-Dynamic Range Vegetation Index, and Green Chlorophyll Index. Green Leaf Area Index was also considered for investigation. The results show that the Green Atmospherically Resistant Vegetation Index outperformed all other indices, reaching an average accuracy of 95.21%.
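
As a rough illustration of two of the indices compared above, a minimal Python sketch of NDVI and GARI computed from per-band reflectances (the gamma weighting in GARI is an assumption; the abstract does not give the exact formulations used):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red)

def gari(nir, green, blue, red, gamma=1.7):
    """Green Atmospherically Resistant Vegetation Index; gamma weights the
    atmospheric (blue - red) correction term (the 1.7 default is an assumption)."""
    corrected_green = green - gamma * (blue - red)
    return (nir - corrected_green) / (nir + corrected_green)
```

Applied per pixel to the red, green, blue, and near-infrared reflectance bands, both indices yield values near 1 over dense green vegetation.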

Keywords: Fuzzy C-Means clustering (FCM), neural network, Levenberg-Marquardt (LM) algorithm, vegetation indices

Procedia PDF Downloads 310
3464 Introducing α-Oxoester (COBz) as a Protecting Group for Carbohydrates

Authors: Atul Kumar, Veeranjaneyulu Gannedi, Qazi Naveed Ahmed

Abstract:

Oligosaccharides, which are essential to all cellular organisms, play vital roles in cell recognition and signaling and are involved in a broad range of biological processes. The chemical synthesis of carbohydrates is a powerful tool for providing homogeneous glycans. In carbohydrate synthesis, the major concern is the orthogonal protection of hydroxyl groups so that they can be unmasked independently. Classical protecting groups include benzyl ethers (Bn), which are normally cleaved through hydrogenolysis or metal reduction, and acetate (Ac), benzoate (Bz), or pivaloate esters, which are removed by base-promoted hydrolysis. In the present work, a series of α-Oxoester (COBz) protected saccharides with base-sensitivity profiles divergent from those of benzoyl (Bz) and acetyl (Ac) were designed, and KHSO₅/CH₃COCl in methanol was identified as an easy, mild, selective, and efficient reagent for their removal in the context of carbohydrate synthesis. Careful monitoring of the latter reagent made it possible to establish both sequential and simultaneous deprotection of COBz, Bz, and Ac. The salient feature of this work is the ease with which different acceptors can be generated from the designed monosaccharides. In summary, we demonstrated α-Oxoester (COBz) as a new protecting group for carbohydrates; application of this group to the synthesis of the glycosylphosphatidylinositol (GPI) anchor is in progress.

Keywords: α-Oxoester, oligosaccharides, new protecting group, acceptor synthesis, glycosylation

Procedia PDF Downloads 142
3463 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees

Authors: Alexandru-Ion Marinescu

Abstract:

A plethora of methods in the scientific literature tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, and loan duration, and must output a binary response variable (i.e., “GOOD” or “BAD”) stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence in financial institution databases, with the majority classified as “GOOD” clients (clients that respect the loan return calendar) alongside a small percentage of “BAD” clients. But it is the “BAD” clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted losses for loan providers. We add to this context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism, LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the name implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal.
A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare our approach with state-of-the-art methods such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. We benchmark against a total of 8 data sets, among them the well-known Australian credit and German credit data sets, and the performance indicators are: percentage correctly classified, area under the curve, partial Gini index, H-measure, Brier score, and Kolmogorov-Smirnov statistic. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
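
The tree representation and the pre-order flattening used by the genetic operators can be sketched in Python (the paper itself uses C# LINQ expression trees; the tuple encoding and protected division below are illustrative assumptions):

```python
import operator

# binary operators; division is "protected" against zero, a common GP convention
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul,
       '/': lambda a, b: a / b if b else 1.0}

def flatten(node, out=None):
    """Pre-order traversal of (op, left, right) tuples into a flat list,
    mirroring the linearisation the mutation/crossover operators work on."""
    if out is None:
        out = []
    if isinstance(node, tuple):
        out.append(node[0])
        flatten(node[1], out)
        flatten(node[2], out)
    else:
        out.append(node)
    return out

def evaluate(node, variables):
    """Evaluate a tree; leaves are either variable names or numeric constants."""
    if isinstance(node, tuple):
        return OPS[node[0]](evaluate(node[1], variables),
                            evaluate(node[2], variables))
    return variables.get(node, node)
```

Flattening makes each genome a linear sequence, so cut-and-splice crossover and point mutation can operate on subtree boundaries of the serialized form.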

Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution

Procedia PDF Downloads 112
3462 Lip Localization Technique for Myanmar Consonants Recognition Based on Lip Movements

Authors: Thein Thein, Kalyar Myo San

Abstract:

Lip reading is one of several supportive technologies for hearing-impaired people, the elderly, and non-native speakers. For normal-hearing people in noisy environments, or in conditions where the audio signal is unavailable, lip reading techniques can increase understanding of spoken language. Hearing-impaired people have long used lip reading as an important tool to find out what was said without hearing the voice. Visual speech information is therefore important and has become an active research area. Using visual information from lip movements can improve the accuracy and robustness of a speech recognition system, and the need for lip reading systems is ever increasing for every language. However, recognizing lip movement is a difficult task because the region of interest (ROI) is nonlinear and noisy. Therefore, this paper proposes a method to detect the lip shape accurately and to localize lip movement for automatic lip tracking, using a combination of Otsu's global thresholding technique and the Moore neighborhood tracing algorithm. The proposed method achieves accurate lip localization and tracking, which is useful for speech recognition. In this work, experiments are carried out on automatically localizing the lip shape for Myanmar consonants using only visual information from lip movements, which is useful for visual speech processing of the Myanmar language.
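
The Otsu step of the proposed pipeline can be sketched in Python as a histogram scan that maximizes between-class variance (a generic textbook formulation; the paper's exact preprocessing is not given in the abstract):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's global threshold: pick t maximizing between-class variance
    w_b * w_f * (mean_b - mean_f)^2 over the grayscale histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]                 # background weight up to t
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Pixels darker than the returned threshold can then be treated as the lip region candidate, and a boundary tracer such as the Moore neighborhood algorithm follows the contour of that binary mask.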

Keywords: lip reading, lip localization, lip tracking, Moore neighborhood tracing algorithm

Procedia PDF Downloads 348
3461 Effectiveness of Reinforcement Learning (RL) for Autonomous Energy Management Solutions

Authors: Tesfaye Mengistu

Abstract:

This thesis investigates the effectiveness of reinforcement learning (RL) for autonomous energy management solutions. The study explores the potential of model-free RL approaches, such as Monte Carlo RL and Q-learning, to improve energy management by autonomously adjusting strategies to maximize efficiency. The research investigates the implementation of RL algorithms for optimizing energy consumption in a single-agent environment. The focus is on developing a framework for the implementation of RL algorithms, highlighting the importance of RL for enabling autonomous systems to adapt quickly to changing conditions and make decisions based on previous experience. Moreover, the paper proposes RL as a novel energy management solution to address nations' CO2 emission goals. Reinforcement learning algorithms are well suited to problems with sequential decision-making patterns and can provide accurate and immediate outputs that ease planning and decision-making. This research provides insights into the challenges and opportunities of using RL for energy management solutions and recommends further studies to explore its full potential. In conclusion, the study shows how RL can improve the efficiency of energy management systems and supports RL as a promising approach for developing autonomous energy management solutions in residential buildings.
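
The tabular Q-learning update at the heart of the model-free approaches mentioned above can be sketched on a hypothetical toy environment (a four-state corridor standing in for the unspecified energy-management environment; all constants are illustrative):

```python
import random

random.seed(0)
N_STATES, N_ACTIONS = 4, 2            # toy corridor: action 0 = left, 1 = right
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.2     # learning rate, discount, exploration

def step(s, a):
    """Deterministic toy environment: reward 1 only on reaching the goal state."""
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    return s2, (1.0 if s2 == N_STATES - 1 else 0.0), s2 == N_STATES - 1

for _ in range(500):                  # episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        a = random.randrange(N_ACTIONS) if random.random() < eps \
            else max(range(N_ACTIONS), key=lambda act: Q[s][act])
        s2, r, done = step(s, a)
        # Q-learning temporal-difference update
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

greedy_policy = [max(range(N_ACTIONS), key=lambda act: Q[s][act])
                 for s in range(N_STATES)]
```

After training, the greedy policy moves right in every non-terminal state, i.e. it has learned the shortest path to the reward.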

Keywords: artificial intelligence, reinforcement learning, Monte Carlo, energy management, CO2 emission

Procedia PDF Downloads 76
3460 Critical Evaluation of Occupational Health and Safety Challenges Facing the Construction Sector in the UK and Developing Anglophone West African Countries, Particularly the Gambia

Authors: Bintou Jobe

Abstract:

The construction sector, both in the United Kingdom (UK) and in developing Anglophone West African countries, specifically The Gambia, faces significant health and safety challenges. While the UK has established legislation and regulations to support occupational health and safety (OHS) in the industry, the same level of support is lacking in developing countries. The significance of this review is to assess the extent and effectiveness of OHS legislation and regulatory reform in the construction industry, with a focus on understanding the challenges faced by both the UK and developing Anglophone West African countries. It aims to highlight the benefits of implementing an OHS management system, specifically ISO 45001. The study uses a literature review approach, synthesizing publications from the past decade and identifying common themes and best practices related to occupational health and safety in the construction industry. After research questions were developed and addressed, findings were analysed and compared, and conclusions and recommendations were drawn. This comprehensive review of the literature allows a detailed understanding of the challenges faced by the industry in both contexts. The findings indicate that while the UK has established robust health and safety legislation, many UK construction companies have not fully met the standards outlined in ISO 45001. The challenges faced by the UK include poor data management, inadequate communication of best practices, insufficient training, and a lack of safety culture, mirroring those observed in developing Anglophone countries. Nevertheless, compliance with OHS management systems has been shown to yield benefits, including injury prevention and centralized health and safety documentation. In conclusion, efforts to improve the effectiveness of OHS legislation in developing Anglophone West African countries should consider the positive impact experienced in the UK.
The implementation of ISO 45001 can serve as a benchmark standard and potentially inform recommendations for developing countries. The selection criteria for the literature included search keywords and phrases such as occupational health and safety challenges, The Gambia, developing countries, management systems, ISO 45001, and impact and effectiveness of OHS legislation. The literature was sourced from Google Scholar, the UK Health and Safety Executive website, and Google Advanced Search.

Keywords: ISO 45001, developing countries, occupational health and safety, UK

Procedia PDF Downloads 92
3459 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function

Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros

Abstract:

The biodiversity strategy of the European Union for 2030 identifies climate change as one of the key drivers of biodiversity loss and considers green infrastructure one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at guaranteeing the provision of a wide range of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet few tools are available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services, but they usually aggregate several maps of ecosystem service potential without considering possible trade-offs. This can lead to excluding areas with a high potential for providing some ecosystem services simply because those services have many trade-offs with others. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting green infrastructure multifunctional buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered are clustered into groups, so that ecosystem services that create trade-offs with each other are kept in separate groups. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for each group are then combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually.
The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
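
The simulated annealing core of the delimitation procedure can be sketched generically in Python; the one-dimensional toy objective below stands in for the raster-based objective, which the abstract does not specify (for maximizing a provision potential one would simply negate it):

```python
import math
import random

random.seed(1)

def simulated_annealing(objective, neighbor, x0, t0=1.0, cooling=0.995, steps=5000):
    """Minimise `objective`: always accept improvements, accept worse moves
    with probability exp(-delta / T), and cool T geometrically."""
    x, fx, t = x0, objective(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = objective(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# toy objective with several local minima (global minimum near x = -1.3)
f = lambda x: x * x + 10.0 * math.sin(x)
move = lambda x: x + random.uniform(-0.5, 0.5)
xbest, fbest = simulated_annealing(f, move, x0=5.0)
```

The temperature schedule lets early iterations cross barriers between local minima, which is what motivates annealing over plain hill climbing for multimodal objectives like the combined potential map.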

Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change

Procedia PDF Downloads 168
3458 Task Value and Research Culture of Southern Luzon State University

Authors: Antonio V. Romana, Rizaide A. Salayo, Maria Lavinia E. Fetalino

Abstract:

This study assessed the subjective task value and research culture of SLSU faculty using a sequential explanatory mixed-methods design. In the quantitative phase, questionnaires on research culture and task value were used; in the qualitative phase, the data were coded and thematized to interpret the outcome of the focus group discussion. Results showed that among the dimensions of subjective task value, intrinsic value ranked highest while utility value ranked lowest; all subjective task values were rated "Agree." The FGD revealed that faculty members valued research and wanted to be involved in it. However, the limited number of faculty researchers, heavy teaching workloads, inadequate information on the research process, lack of self-confidence, and low research incentives hindered their writing and engagement with research. Thus, a policy brief was developed. It is recommended that the institution conduct a series of research seminar workshops for faculty members, plan regular research idea exchange activities, and revisit the university's research thrusts and agenda to align them with faculty specialization and expertise. In addition, the university may lessen workloads and hire additional faculty members so that educators can focus on their research work. Finally, cash incentives may still be considered, given that faculty members have varied experiences in doing research tasks.

Keywords: task value, interest value, attainment value, utility value, research culture

Procedia PDF Downloads 61
3457 The Effect of Traffic on Harmful Metals and Metalloids in the Street Dust and Surface Soil from Urban Areas of Tehran, Iran: Levels, Distribution and Chemical Partitioning Based on Single and Sequential Extraction Procedures

Authors: Hossein Arfaeinia, Ahmad Jonidi Jafari, Sina Dobaradaran, Sadegh Niazi, Mojtaba Ehsanifar, Amir Zahedi

Abstract:

Street dust and surface soil samples were collected from very heavy, heavy, medium, and low traffic areas and from a natural site in Tehran, Iran. The samples were analyzed for selected physical-chemical features and for the total content and chemical speciation of selected metals and metalloids (Zn, Al, Sr, Pb, Cu, Cr, Cd, Co, Ni, and V) to study the effect of traffic on their mobility and accumulation in the environment. The pH, electrical conductivity (EC), carbonate, and organic carbon (OC) values were similar in soil and dust samples from similar traffic areas. Traffic increases EC in dust/soil matrices but has no effect on metal and metalloid concentrations in soil samples. Increases in metal and metalloid levels with traffic were, however, found in dust samples. Moreover, traffic increases the percentage of the acid-soluble fraction and of the Fe and Mn oxide-associated fractions of Pb and Zn. The mobilization of Cu, Zn, Pb, and Cr in dust samples was easier than in soil. The speciation of metals and metalloids except Cd is mainly affected by physicochemical features in soil, whereas in dust samples speciation is affected by the total metal and metalloid content (except for chromium and nickel).

Keywords: street dust, surface soil, traffic, metals, metalloids, chemical speciation

Procedia PDF Downloads 250
3456 3D Objects Indexing Using Spherical Harmonic for Optimum Measurement Similarity

Authors: S. Hellam, Y. Oulahrir, F. El Mounchid, A. Sadiq, S. Mbarki

Abstract:

In this paper, we propose a method for three-dimensional (3D) model indexing based on a new descriptor built from spherical harmonics. The purpose of the method is to minimize both the processing time over the database of object models and the time needed to search for objects similar to a query object. We first define the new descriptor using a new division of the 3D object into a sphere, and then define a new distance that is used to search for similar objects in the database.

Keywords: 3D indexation, spherical harmonic, similarity of 3D objects, measurement similarity

Procedia PDF Downloads 427
3455 The Influence of Wasta on Organizational Practices in Kuwait

Authors: Abrar Al-Enzi

Abstract:

Despite being used every day in the Arab world, Wasta, a type of social capital, has received little attention from scholars, even in the Middle East. In simple words, Wasta means granting deserved or undeserved privileges to others through personal contacts. This paper suggests that Wasta is an important determinant of how some employees get recruited, and that employees turn to Wasta for privileges and favors in organizations. Wasta is said to accelerate career advancement and other work practices for employees, whether or not they deserve it or are suitable for it. The overall goal of this paper is to examine how Wasta influences human resource management practices by reviewing the history of Wasta, the importance of using it, and how it affects employees as well as organizations in terms of recruitment and work practices. Accordingly, the question addressed is: does Wasta influence human resource management, knowledge sharing, and innovation in Kuwait, and does it in turn affect employees' commitment within organizations? A mixed-method sequential exploratory research design will be used to explore the topic through initial exploratory interviews, paper-based and online surveys (quantitative method), and semi-structured interviews (qualitative method). This choice is motivated by the fact that qualitative and quantitative methods complement each other when combined, providing a clearer picture of the topic.

Keywords: human resource management practices, Kuwait, social capital, Wasta

Procedia PDF Downloads 200
3454 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an iterative collection of individual decision trees that result in a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified sampling for training the algorithm was used, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real-time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
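
The 70-30 stratified sampling used for training can be sketched in plain Python (a generic illustration; beyond the move away from Python for deployment, the paper's actual tooling is not stated):

```python
import random
from collections import defaultdict

def stratified_split(labels, train_frac=0.7, seed=42):
    """Return (train, test) index lists that preserve each segment's
    proportion, as in a 70-30 stratified train/test split."""
    rng = random.Random(seed)
    by_segment = defaultdict(list)
    for idx, seg in enumerate(labels):
        by_segment[seg].append(idx)
    train, test = [], []
    for seg, idxs in by_segment.items():
        rng.shuffle(idxs)                      # randomize within each segment
        cut = round(len(idxs) * train_frac)    # 70% of this segment to train
        train.extend(idxs[:cut])
        test.extend(idxs[cut:])
    return train, test
```

Stratifying by segment keeps rare segments represented in both splits, which matters when accuracy is reported per segment.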

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 88
3453 Simulation of Wet Scrubbers for Flue Gas Desulfurization

Authors: Anders Schou Simonsen, Kim Sorensen, Thomas Condra

Abstract:

Wet scrubbers are used for flue gas desulfurization by injecting water directly into the flue gas stream from a set of sprayers. The water droplets flow freely inside the scrubber, and flow down along the scrubber walls as a thin wall film, while reacting with the gas phase to remove SO₂. This complex multiphase phenomenon can be divided into three main contributions: the continuous gas phase, the liquid droplet phase, and the liquid wall film phase. This study proposes a complete model in which all three main contributions are taken into account and resolved, using OpenFOAM for the continuous gas phase and MATLAB for the liquid droplet and wall film phases. The 3D continuous gas phase is composed of five species: CO₂, H₂O, O₂, SO₂, and N₂, which are resolved along with momentum, energy, and turbulence. Source terms are present for four species, energy, and momentum, which affect the steady-state solution. The liquid droplet phase experiences breakup, collisions, dynamics, internal chemistry, evaporation and condensation, species mass transfer, energy transfer, and wall film interactions. Numerous sub-models have been implemented and coupled to realise the above-mentioned phenomena. The liquid wall film experiences impingement, acceleration, atomization, separation, internal chemistry, evaporation and condensation, species mass transfer, and energy transfer, which have all been resolved using numerous sub-models as well. The continuous gas phase has been coupled with the liquid phases through source terms, where the two software packages are coupled using a link structure. The complete CFD model has been verified using 16 experimental tests from an existing scrubber installation, where a gradient-based pattern search optimization algorithm has been used to tune numerous model parameters to match the experimental results.
The CFD model needed to be fast for evaluation in order to apply this optimization routine, where approximately 1000 simulations were needed. The results show that the complex multiphase phenomena governing wet scrubbers can be resolved in a single model. The optimization routine was able to tune the model to accurately predict the performance of an existing installation. Furthermore, the study shows that a coupling between OpenFOAM and MATLAB is realizable, where the data and source term exchange increases the computational requirements by approximately 5%. This allows for exploiting the benefits of both software programs.
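
The parameter-tuning step can be illustrated with a minimal derivative-free pattern search in Python (a generic coordinate-poll variant; the paper's actual objective and parameter set are not given in the abstract):

```python
def pattern_search(objective, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free coordinate pattern search: poll +/- step along each
    axis, move to any improving point, otherwise shrink the step size."""
    x = list(x0)
    fx = objective(x)
    iterations = 0
    while step > tol and iterations < max_iter:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = objective(y)
                if fy < fx:                  # accept any improving poll point
                    x, fx, improved = y, fy, True
        if not improved:
            step *= shrink                   # refine the mesh and retry
        iterations += 1
    return x, fx
```

In the paper's setting the objective would measure the mismatch between the 16 experimental tests and the corresponding CFD predictions, which is why each of the roughly 1000 evaluations had to be fast.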

Keywords: desulfurization, discrete phase, scrubber, wall film

Procedia PDF Downloads 253
3452 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

The execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on the WCET analysis assumes sequential code running on single-core platforms. However, as computation is steadily moving towards using a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible suggestions to these challenges and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving the WCET estimates using the said methods and provides useful recommendations for further research in this area.

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 315
3451 Emergency Multidisciplinary Continuing Care Case Management

Authors: Mekroud Amel

Abstract:

Emergency departments are known for their workload, the variety of pathologies seen, and the difficulty of managing a continuous influx of patients. This study examines the role of our department in the management of patients presenting with two or three mild-to-moderate organ failures involving several disciplines at once, and the effect of this management on the skills and efficiency of our team. Borderline cases spanning two, three, or more disciplines, with instability of a vital function, were successfully managed in the emergency room; the therapeutic procedures adopted, the consequences for the quality and level of care delivered by our team, and the logistical and educational consequences are described. The consequences for the emergency teams were positive; only in rare situations were they negative.
The clinical situations involved combinations of haemodynamic distress (right, left, or global heart failure, tamponade, low cardiac output with acute pulmonary oedema, and/or shock) with respiratory distress (more or less profound hypoxaemia and disorders of gas exchange related to bacterial or viral lung infection, pleurisy, pneumothorax, or bronchoconstrictive crisis), with neurological disorders (such as recent stroke or comatose state), with metabolic disorders (such as hyperkalaemia, renal insufficiency, severe ionic disorders, and accidents with vitamin K antagonists), and with or without septate effusion of one or more serous membranes, with or without tamponade. This is a retrospective, monocentric, descriptive study covering the period from 05.01.2022 to 10.31.2022. The purpose of the work is to search for a statistically significant link between the type of moderate-to-severe multivisceral pathology managed in the emergency room and the efficiency of the healthcare team, its level of care, and the optional care offered to patients. A chi-squared test was used to assess the link between the resolution of serious multidisciplinary cases in the emergency room and the effectiveness of the team in managing complicated cases. Managing the most difficult clinical cases for the organ specialties gave the general practitioner emergency teams valuable perspective and improved their efficiency in the face of the emergencies received.

Keywords: emergency care teams, management of patients with dysfunction of more than one organ, learning curve, quality of care

Procedia PDF Downloads 76
3450 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be observed through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and climate change monitoring. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it using standard data pre-processing techniques, feeding the processed data into the proposed algorithm, and analyzing the result. Algorithms used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNNs, ANNs, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, on the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
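
The atrous (dilated) convolution at the heart of DeepLabv3 can be illustrated in one dimension: kernel taps are spaced `rate` samples apart, enlarging the receptive field without adding parameters (a didactic sketch, not the DeepLabv3 implementation):

```python
def atrous_conv1d(signal, kernel, rate):
    """1-D atrous (dilated) cross-correlation, 'valid' padding: kernel taps
    are spaced `rate` samples apart, so a k-tap kernel spans (k-1)*rate + 1
    input samples while still using only k weights."""
    k = len(kernel)
    span = (k - 1) * rate + 1
    return [sum(kernel[j] * signal[i + j * rate] for j in range(k))
            for i in range(len(signal) - span + 1)]
```

With rate 1 this reduces to an ordinary convolution; larger rates are what lets DeepLabv3 aggregate multi-scale context in its parallel (ASPP-style) or cascaded branches.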

Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50

Procedia PDF Downloads 135
3449 Open-Loop Vector Control of Induction Motor with Space Vector Pulse Width Modulation Technique

Authors: Karchung, S. Ruangsinchaiwanich

Abstract:

This paper presents an open-loop vector control method for an induction motor with the space vector pulse width modulation (SVPWM) technique. Normally, closed-loop speed control is preferred and is believed to be more accurate. However, it requires a position sensor to track the rotor position, which is undesirable in certain applications. This paper exhibits the performance of a three-phase induction motor with the simplest control algorithm, using neither a position sensor nor an estimation block for sensorless rotor-position estimation. The motor stator currents are measured and transformed to the synchronously rotating (d-q-axis) frame using the Clarke and Park transformations. The actual control happens in this frame, where the measured currents are compared with the reference currents. The error signal is fed to a conventional PI controller, and the corrected d-q voltage is generated. The controller outputs are transformed back to three-phase voltages and fed to the SVPWM block, which generates the PWM signal for the voltage source inverter. The open-loop vector control model, together with the SVPWM algorithm, is modeled in MATLAB/Simulink and experimentally validated on a TMS320F28335 DSP board.
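The Clarke and Park transformations mentioned above are pure coordinate rotations and can be sketched compactly. A minimal amplitude-invariant version (one common convention; sign and scaling conventions vary between texts):

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: three-phase (a, b, c)
    currents to the stationary two-axis (alpha, beta) frame."""
    alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = (2.0 / 3.0) * (math.sqrt(3) / 2.0) * (ib - ic)
    return alpha, beta

def park(alpha, beta, theta):
    """Park transform: rotate (alpha, beta) by the electrical angle
    theta into the synchronously rotating (d, q) frame."""
    d = alpha * math.cos(theta) + beta * math.sin(theta)
    q = -alpha * math.sin(theta) + beta * math.cos(theta)
    return d, q

# A balanced three-phase current set at electrical angle theta maps
# to constant d-q values, which is what makes PI control practical.
theta = 0.7
ia = math.cos(theta)
ib = math.cos(theta - 2 * math.pi / 3)
ic = math.cos(theta + 2 * math.pi / 3)
alpha, beta = clarke(ia, ib, ic)
d, q = park(alpha, beta, theta)
```

Here d comes out as 1 and q as 0 (to floating-point precision): in the rotating frame the sinusoidal phase currents become DC quantities, so a conventional PI controller can regulate them.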

Keywords: electric drive, induction motor, open-loop vector control, space vector pulse width modulation technique

Procedia PDF Downloads 143
3448 Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation

Authors: Youngsun Moon, Yeong-Ju Go, Jong-Soo Choi

Abstract:

Most drones that currently fly surveillance/reconnaissance missions are equipped with optical equipment, but a microphone array can also be used to estimate the location of an acoustic source, providing additional information when optical equipment is unavailable. The purpose of this study is to estimate the Direction of Arrival (DOA) of the acoustic source from the drone, based on Time Difference of Arrival (TDOA) estimation. The problem is that the target acoustic source cannot be measured cleanly because of the drone's own noise. Our approach is to separate the drone noise from the target acoustic source using Blind Source Separation (BSS) based on Independent Component Analysis (ICA). ICA can be applied under the assumption that the drone noise and the target source are statistically independent and that each signal is non-Gaussian. To maximize the non-Gaussianity of each signal, we use negentropy and kurtosis measures from probability theory. As a result, TDOA and DOA estimation of the target source are improved in the noisy environment. We simulated the performance of the DOA algorithm with the BSS algorithm applied, and validated the simulation through experiments in an anechoic wind tunnel.
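Once BSS has isolated the target signal, the TDOA-to-DOA step is simple geometry: for a far-field source and a two-microphone pair spaced d apart, sin(theta) = c·tau/d. A minimal sketch with an integer-lag cross-correlation TDOA estimate (the sampling rate and spacing below are illustrative values, not parameters from the paper):

```python
import math

def doa_from_tdoa(tau, mic_spacing, speed_of_sound=343.0):
    """Far-field DOA (radians from broadside) for a two-microphone
    pair: sin(theta) = c * tau / d. Clamped to [-1, 1] to absorb
    small estimation errors in tau."""
    s = max(-1.0, min(1.0, speed_of_sound * tau / mic_spacing))
    return math.asin(s)

def tdoa_by_cross_correlation(x, y, fs):
    """Integer-lag TDOA estimate: the lag (in seconds) that maximizes
    the cross-correlation of the two microphone signals."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-(n - 1), n):
        val = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                val += x[i] * y[j]
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag / fs

# A pulse arriving 2 samples later at mic y (illustrative values).
fs, d = 48000.0, 0.1
x = [0, 0, 1, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 1, 0, 0, 0]
tau = tdoa_by_cross_correlation(x, y, fs)
print(math.degrees(doa_from_tdoa(tau, d)))
```

With noisy, unseparated signals the correlation peak is smeared by the drone noise, which is exactly why the BSS preprocessing improves the DOA estimate.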

Keywords: aeroacoustics, acoustic source detection, time difference of arrival, direction of arrival, blind source separation, independent component analysis, drone

Procedia PDF Downloads 155
3447 The Relationship between Body Positioning and Badminton Smash Quality

Authors: Gongbing Shan, Shiming Li, Zhao Zhang, Bingjun Wan

Abstract:

Badminton originated in ancient civilizations in Europe and Asia more than 2000 years ago. Presently, it is played almost everywhere, with an estimated 220 million people playing regularly, from professionals to recreational players; it is the second most played sport in the world after soccer. In Asia, the popularity of badminton surpasses that of soccer. Unfortunately, scientific research on badminton skills is hardly proportional to the sport's popularity: a search of the literature shows that the body of biomechanical investigations is relatively small. One of the dominant skills in badminton is the forehand overhead smash, which accounts for about one fifth of attacks during games. Empirical evidence shows that one has to adjust the body position relative to the incoming shuttlecock to produce a powerful and accurate smash. Positioning is therefore a fundamental aspect influencing smash quality, yet the literature shows a dearth of studies on it. The goals of this study were to determine the influence of positioning and training experience on smash quality, in order to discover information that could help learners acquire the skill. Using a 10-camera 3D motion capture system (VICON MX, 200 frames/s) and a 15-segment full-body biomechanical model, 14 skilled and 15 novice players were measured and analyzed. The results reveal that body positioning directly influences the quality of a smash, especially the shuttlecock release angle and the clearance height (passing over the net) of offensive players. The results also suggest that, to train proper positioning, one could adopt a self-selected comfortable position facing a statically hung shuttlecock and then step one foot back, a practical reference marker for learning. This perceptual marker could be applied in guiding the learning and training of beginners. As one gains experience through repetitive training, improved limb coordination would increase smash quality further. The researchers hope that the findings will benefit practitioners in developing effective training programs for beginners.

Keywords: 3D motion analysis, biomechanical modeling, shuttlecock release speed, shuttlecock release angle, clearance height

Procedia PDF Downloads 491
3446 Convolutional Neural Networks Architecture Analysis for Image Captioning

Authors: Jun Seung Woo, Shin Dong Ho

Abstract:

Image captioning models with attention mechanisms have improved significantly over previous models but are still unsatisfactory at recognizing images. We perform an extensive search over seven interesting Convolutional Neural Network (CNN) architectures to analyze the behavior of different models for image captioning. We compared the seven CNN architectures, across batch sizes, on a public benchmark: the MS-COCO dataset. In our experimental results, DenseNet and InceptionV3 achieved about 14% loss and about 160 s of training time per epoch, the most satisfactory results among the seven architectures after training for 50 epochs on a GPU.

Keywords: deep learning, image captioning, CNN architectures, densenet, inceptionV3

Procedia PDF Downloads 125
3445 Polyampholytic Resins: Advances in Ion Exchanging Properties

Authors: N. P. G. N. Chandrasekara, R. M. Pashley

Abstract:

Ion exchange (IEX) resins are commonly available as cationic or anionic resins, but not as polyampholytic resins. This is probably because sequential acid and base washing cannot completely regenerate polyampholytic resins, whose anionic and cationic groups are chemically attached in close proximity. The 'Sirotherm' process, developed by the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Melbourne, Australia, was originally based on a physical mixture of weakly basic (WB) and weakly acidic (WA) ion-exchange resin beads. These resins were regenerated thermally: salt sorbed at ambient temperature is released at higher temperatures, since the sorption capacity decreases significantly with increasing temperature. A new process for the efficient regeneration of mixed-bead resins using ammonium bicarbonate with heat was studied recently, and this chemical/thermal regeneration technique is capable of completely regenerating polyampholytic resins. Even so, the low IEX capacities of polyampholytic resins restrict their commercial applications. Recently, we have established another novel process for increasing the IEX capacity of a typical polyampholytic resin. In this paper, we discuss the chemical/thermal regeneration of a polyampholytic (WA/WB) resin and a novel process for enhancing its ion exchange capacity by increasing its internal pore area. We also show how effective this method is for complete recycled regeneration, with the potential to substantially reduce chemical waste.

Keywords: capacity, ion exchange, polyampholytic resin, regeneration

Procedia PDF Downloads 375
3444 Obtaining High-Dimensional Configuration Space for Robotic Systems Operating in a Common Environment

Authors: U. Yerlikaya, R. T. Balkan

Abstract:

In this research, a method is developed to obtain a high-dimensional configuration space for path planning problems. In typical cases, path planning problems are solved directly in the 3-dimensional (3D) workspace. However, this method is inefficient for robots with various geometrical and mechanical restrictions. To overcome these difficulties, path planning may be formalized and solved in a new space called the configuration space. The number of dimensions of the configuration space equals the number of degrees of freedom of the system of interest. The method can be applied in two ways. In the first, the point clouds of all the bodies of the system and their interactions are used. The second is performed via the clearance function of simulation software, where the minimum distances between the surfaces of bodies are measured simultaneously. A double-turret system is considered within the scope of this study, and its 4-D configuration space is obtained in these two ways. As a result, the difference between the two methods is around 1%, depending on the density of the point cloud, and this disparity steadily decreases as the point cloud density increases. At the end of the study, to verify the 4-D configuration space obtained, the 4-D path planning problem was realized as 2-D + 2-D and a sample path planning was carried out using the A* algorithm. The accuracy of the configuration space was then verified using the obtained paths on the simulation model of the double-turret system.
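The A* search used for the 2-D + 2-D planning step can be sketched on a small occupancy grid. A minimal version, assuming 4-connectivity and a Manhattan-distance heuristic (the paper's own grid resolution and connectivity are not specified here):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle).
    Manhattan distance is admissible here, so the returned
    path has optimal length."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:          # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:               # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_set,
                                   (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

In the paper's setting, the "grid" is a discretized slice of the 4-D configuration space, with obstacles marked wherever the turret bodies would collide.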

Keywords: A* algorithm, autonomous turrets, high-dimensional C-space, manifold C-space, point clouds

Procedia PDF Downloads 135
3443 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages sharing a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), limiting computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages when building a robust multilingual system. Complex architectural choices based on self-attention networks have been made to improve parallelization, thereby reducing training time. In this work, we propose Reed, a simple system based on 1D convolutions which uses very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times of at least 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 115
3442 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability

Authors: Chin-Chia Jane

Abstract:

In a transportation network, travel time refers to the transmission time from source node to destination node, whereas reliability refers to the probability of a successful connection from source node to destination node. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time stays under the travel time limitation. This work is pioneering: whereas existing literature evaluates travel time reliability via a single optimal path, the proposed QoS measures the performance of the whole network system. To compute the QoS of a transportation network, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc is assigned a travel time weight of 0. Each intermediate node is replaced by two nodes u and v and an arc directed from u to v; the newly generated nodes u and v are perfect nodes. The new direct arc has three weights: travel time, capacity, and operation probability. Then the universal set of state vectors is recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left. The decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing the probabilities of these reliable subsets. Computational experiments are conducted on a benchmark network which has 11 nodes and 21 arcs. Five travel time limitations and five demand requirements are set to compute the QoS value. For comparison, we test the exhaustive complete enumeration method. Computational results reveal that the proposed algorithm is much more efficient than complete enumeration. In summary, a transportation network is analyzed with an extended flow network model in which each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables. The quality of service of the transportation network is an integration of customer demands, travel time, and the probability of connection. We present a decomposition algorithm to compute the QoS efficiently; experiments on a prototype network show that it is superior to existing complete enumeration methods.
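The complete-enumeration baseline the paper compares against can be sketched directly: enumerate every binary arc state, run max flow on the operating arcs, and sum the probabilities of states meeting the demand. This toy version ignores the travel-time limitation and node failures for brevity, so it only illustrates the reliability half of the QoS measure:

```python
from itertools import product

def max_flow(arcs, s, t):
    """Edmonds-Karp max flow on arcs [(u, v, capacity), ...]."""
    cap = {}
    for u, v, c in arcs:
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)
    flow = 0
    while True:
        parent, queue = {s: None}, [s]
        while queue and t not in parent:   # BFS for augmenting path
            u = queue.pop(0)
            for (a, b), c in cap.items():
                if a == u and c > 0 and b not in parent:
                    parent[b] = u
                    queue.append(b)
        if t not in parent:
            return flow
        path, v = [], t
        while parent[v] is not None:       # walk back to the source
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[e] for e in path)
        for u, v in path:                  # augment along the path
            cap[(u, v)] -= aug
            cap[(v, u)] += aug
        flow += aug

def qos(arcs, probs, s, t, demand):
    """P(max flow >= demand), each arc independently operating
    with its given probability (complete state enumeration)."""
    total = 0.0
    for state in product([0, 1], repeat=len(arcs)):
        p, up = 1.0, []
        for bit, arc, pr in zip(state, arcs, probs):
            p *= pr if bit else (1 - pr)
            if bit:
                up.append(arc)
        if max_flow(up, s, t) >= demand:
            total += p
    return total

# Two parallel unit-capacity arcs 0->1, each up with probability 0.9;
# demand 1 is met unless both arcs fail.
arcs = [(0, 1, 1), (0, 1, 1)]
print(round(qos(arcs, [0.9, 0.9], 0, 1, 1), 6))  # → 0.99
```

The cost of this baseline is exponential in the number of arcs, which is exactly what the paper's disjoint-subset decomposition avoids.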

Keywords: quality of service, reliability, transportation network, travel time

Procedia PDF Downloads 216
3441 Thermal Conductivity and Diffusivity of Alternative Refrigerants as Retrofit for Freon 12

Authors: Mutalubi Aremu Akintunde, John Isa

Abstract:

Because of the negative impact of chlorofluorocarbon (CFC) refrigerants on the atmosphere, radical changes and measures were put in place to replace them. This has led to a search for alternative refrigerants over the past decades. This paper presents the thermal conductivity, diffusivity, and performance of two alternative refrigerants as replacements for R12, a versatile refrigerant that transformed the refrigeration industry for decades but is among the offending refrigerants. The new refrigerants were coded RA1 (50% R600a / 50% R134a) and RA2 (70% R600a / 30% R134a). The diffusivities for RA1 and RA2 were estimated to be 2.76384 × 10⁻⁸ m²/s and 2.74386 × 10⁻⁸ m²/s respectively, while that of R12 under the same experimental conditions is 2.43772 × 10⁻⁸ m²/s. The performances of the two refrigerants in a refrigerator initially designed for R12 were very close to that of R12. Other thermodynamic parameters showed that R12 can be replaced with either RA1 or RA2.
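Thermal diffusivity ties together the two properties the paper reports: it is the conductivity divided by the volumetric heat capacity, alpha = k / (rho * cp). A one-line sketch with illustrative liquid-refrigerant-like values (not the paper's measurements):

```python
def thermal_diffusivity(k, rho, cp):
    """alpha = k / (rho * cp), in m^2/s, where k is thermal
    conductivity (W/m.K), rho is density (kg/m^3), and cp is
    specific heat capacity (J/kg.K)."""
    return k / (rho * cp)

# Illustrative values only, not measurements from the paper:
# k = 0.08 W/m.K, rho = 1200 kg/m^3, cp = 1400 J/kg.K.
alpha = thermal_diffusivity(0.08, 1200.0, 1400.0)
print(f"{alpha:.3e} m^2/s")  # → 4.762e-08 m^2/s
```

The result lands in the same 10⁻⁸ m²/s range as the values reported for RA1, RA2, and R12, which is typical for liquid refrigerants.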

Keywords: alternative refrigerants, conductivity, diffusivity, performance, refrigerants

Procedia PDF Downloads 158
3440 Influence of Emotional Intelligence on Educational Supervision and Leadership Style in Saudi Arabia

Authors: Jawaher Bakheet Almudarra

Abstract:

An educational supervisor assists teachers in developing their competence and skills in teaching, in solving educational problems, and in improving teaching methods to suit the educational process. Supervisors evaluate their teachers and write reports based on their assessments. In 1957, the Saudi Ministry of Education instituted educational supervision to facilitate effective management of schools; however, there have been concerns that educational supervision has not been effective in executing its mandate. Studies have indicated that it has been marred by poor and autocratic leadership practices such as stringent inspection, commanding, and judging. There is therefore a need to consider ways in which school outcomes can be enhanced by improving educational supervision practices. Emotional intelligence is a relatively new concept that can be integrated into the Saudi education system but has yet to be examined in depth and embraced, particularly in the realm of educational leadership. Its recognition and adoption may improve leadership practices among educational supervisors. This study employed a qualitative interpretive approach focused on decoding, describing, and interpreting the connection between emotional intelligence and leadership, taking into account social constructions including consciousness, language, and shared meanings. Data collection took place in the Office of Educational Supervisors in Riyadh and involved 4 educational supervisors and 20 teachers, both male and female. It encompassed three methods: qualitative emotional intelligence self-assessment questionnaires, reflective semi-structured interviews, and open workshops. The questionnaires explore whether the educational supervisors understand the meaning of emotional intelligence and its significance in enhancing the quality of the education system in Saudi Arabia. Subsequently, reflective semi-structured interviews were carried out with the educational supervisors to explore the connection between their leadership styles and the way they conceptualise their emotionality. The open workshops included discussions on the emotional aspects of educational supervisors' practices and how supervisors make use of the emotional intelligence discourse in their leadership and supervisory relationships.

Keywords: directors of educational supervision, emotional intelligence, educational leadership, education management

Procedia PDF Downloads 420
3439 Design of Multi-Loop Controller for Minimization of Energy Consumption in the Distillation Column

Authors: Vinayambika S. Bhat, S. Shanmuga Priya, I. Thirunavukkarasu, Shreeranga Bhat

Abstract:

An attempt has been made to design a decoupling controller for multi-input multi-output systems with dead time. The decoupler is designed for a 3×3 chemical process plant transfer function with dead time. A Quantitative Feedback Theory (QFT) based controller has also been designed for the 2×2 distillation column transfer function. The developed control techniques were simulated using MATLAB/Simulink. The stability of the process was also analyzed, together with the effect of various perturbations. Time-domain specifications such as settling time, overshoot, and oscillations were analyzed to prove the efficiency of the decoupler method. Load disturbance rejection performance was also tested. The QFT control technique was synthesized based on the stability and performance specifications, in the presence of uncertainty in the time constant of the plant transfer function, through a sequential loop shaping technique. Further, the energy efficiency of the distillation column was improved by proper tuning of the controller. Distillation accounts for about 3% of the world's total energy consumption, so a suitable control technique is very important from an economic point of view. Real-time implementation of the process is underway in our laboratory.

Keywords: distillation, energy, MIMO process, time delay, robust stability

Procedia PDF Downloads 407
3438 Optimization of Sequential Thermophilic Bio-Hydrogen/Methane Production from Mono-Ethylene Glycol via Anaerobic Digestion: Impact of Inoculum to Substrate Ratio and N/P Ratio

Authors: Ahmed Elreedy, Ahmed Tawfik

Abstract:

This investigation aims to assess the effect of the inoculum to substrate ratio (ISR) and the nitrogen to phosphorus balance on simultaneous biohydrogen and methane production from anaerobic decomposition of mono-ethylene glycol (MEG). Different ISRs were applied in the range between 2.65 and 13.23 gVSS/gCOD, whereas the tested N/P ratios were varied from 4.6 to 8.5, both under thermophilic conditions (55°C). The maximum obtained methane and hydrogen yields (MY and HY) of 151.86±10.8 and 22.27±1.1 mL/gCODinitial were recorded at ISRs of 5.29 and 3.78 gVSS/gCOD, respectively. The ammonification process, in terms of net ammonia produced, was found to be ISR- and COD/N-ratio-dependent, reaching its peak value of 515.5±31.05 mgNH4-N/L at an ISR of 13.23 gVSS/gCOD and a COD/N ratio of 11.56. The optimum HY was enhanced by more than 1.45-fold as the N/P ratio declined from 8.5 to 4.6, whereas the MY improved 1.6-fold as the N/P ratio increased from 4.6 to 5.5, with no significant impact at an N/P ratio of 8.5. The results obtained revealed that methane production was strongly influenced by initial ammonia compared to initial phosphate. Likewise, the generation of ammonia decreased markedly from 535.25±41.5 to 238.33±17.6 mgNH4-N/L as the N/P ratio increased from 4.6 to 8.5. A kinetic study using the modified Gompertz equation was successfully fitted to the experimental data (R² > 0.9761).
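The modified Gompertz equation used for the kinetic fit is commonly written for cumulative gas production as H(t) = P·exp(−exp(Rm·e/P·(λ − t) + 1)), with P the yield potential, Rm the maximum production rate, and λ the lag time. A sketch with illustrative parameters (not the fitted values from this study):

```python
import math

def modified_gompertz(t, P, Rm, lam):
    """Cumulative gas yield H(t) = P * exp(-exp(Rm*e/P * (lam - t) + 1)),
    with P the yield potential (e.g. mL/gCOD), Rm the maximum
    production rate, and lam the lag time."""
    return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1.0))

# Illustrative parameters only (not fitted values from the paper):
P, Rm, lam = 150.0, 10.0, 5.0
for t in (0, 10, 30, 100):
    print(t, round(modified_gompertz(t, P, Rm, lam), 2))
```

The curve is sigmoidal: nearly flat during the lag phase, rising at a maximum slope of Rm, and saturating at P, which is why the three fitted parameters map directly onto the measured yield data.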

Keywords: mono-ethylene glycol, biohydrogen and methane, inoculum to substrate ratio, nitrogen to phosphorous balance, ammonification

Procedia PDF Downloads 376
3437 Supercomputer Simulation of Magnetic Multilayers Films

Authors: Vitalii Yu. Kapitan, Aleksandr V. Perzhu, Konstantin V. Nefedev

Abstract:

The necessity of studying magnetic multilayer structures is explained by the prospects of their practical application as a technological base for new storage media. Magnetic multilayer films have many unique features that contribute to increasing the density of information recording and the speed of storage devices. Multilayer structures are structures of alternating magnetic and nonmagnetic layers. Within the framework of the classical Heisenberg model, lattice spin systems with direct short- and long-range exchange interactions were investigated by Monte Carlo methods. Thermodynamic characteristics of multilayer structures, such as the temperature behavior of magnetization, energy, and heat capacity, were investigated, as were the magnetization reversal processes of multilayer structures in external magnetic fields. The developed software is written in Rust, a promising new language developed by Mozilla and positioned as an alternative to C and C++. For the Monte Carlo simulation, the Metropolis algorithm (with a parallel MPI implementation) and the Wang-Landau algorithm were used. We plan to study magnetic multilayer films with asymmetric Dzyaloshinskii-Moriya (DM) interaction, interface effects, and skyrmion textures. This work was supported by the state task of the Ministry of Education and Science of Russia, project #3.7383.2017/8.9.
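The Metropolis sampling of a classical Heisenberg system can be sketched in a few lines. A minimal single-spin-update version for a 1D chain with nearest-neighbor exchange (the authors' code is in Rust and treats full multilayer lattices; this Python sketch only illustrates the algorithm):

```python
import math, random

def random_unit_vector(rng):
    """Uniform random point on the unit sphere (Marsaglia method)."""
    while True:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        s = x * x + y * y
        if s < 1:
            r = 2 * math.sqrt(1 - s)
            return (x * r, y * r, 1 - 2 * s)

def energy(spins, J=1.0):
    """Nearest-neighbor Heisenberg energy E = -J * sum_i S_i . S_{i+1}
    on a periodic chain."""
    n = len(spins)
    return -J * sum(sum(a * b for a, b in zip(spins[i], spins[(i + 1) % n]))
                    for i in range(n))

def metropolis_sweep(spins, temperature, rng, J=1.0):
    """One Metropolis sweep: propose a fresh random direction for each
    spin and accept with probability min(1, exp(-dE/T))."""
    n = len(spins)
    for i in range(n):
        old, new = spins[i], random_unit_vector(rng)
        de = 0.0
        for j in ((i - 1) % n, (i + 1) % n):
            dot_old = sum(a * b for a, b in zip(old, spins[j]))
            dot_new = sum(a * b for a, b in zip(new, spins[j]))
            de += -J * (dot_new - dot_old)
        if de <= 0 or rng.random() < math.exp(-de / temperature):
            spins[i] = new

rng = random.Random(1)
spins = [random_unit_vector(rng) for _ in range(32)]
for _ in range(500):
    metropolis_sweep(spins, temperature=0.1, rng=rng)
# At low temperature the chain orders ferromagnetically, so the
# energy per spin approaches -J.
print(energy(spins) / len(spins))
```

Multilayer geometries, long-range exchange, and the Wang-Landau flat-histogram variant extend this same accept/reject kernel; they change the energy function and the sampling weights, not the basic update.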

Keywords: The Monte Carlo methods, Heisenberg model, multilayer structures, magnetic skyrmion

Procedia PDF Downloads 163
3436 Hard Disk Failure Predictions in Supercomputing System Based on CNN-LSTM and Oversampling Technique

Authors: Yingkun Huang, Li Guo, Zekang Lan, Kai Tian

Abstract:

Hard disk drive (HDD) failures in an exascale supercomputing system may lead to service interruption, invalidate previous calculations, and cause permanent data loss. Therefore, initiating corrective actions before hard drive failures materialize is critical to the continued operation of jobs. In this paper, a highly accurate analysis model based on CNN-LSTM and an oversampling technique is proposed, which can correctly predict the necessity of a disk replacement even ten days in advance. Learning-based methods generally perform poorly on training datasets with long-tailed distributions, and fault prediction is a classic instance of this owing to the scarcity of failure data. To overcome this, a new oversampling technique was employed to augment the data, and an improved CNN-LSTM with a shortcut connection was built to learn more effective features. The shortcut transmits the results of the previous CNN layer and, after weighted fusion with the output of the next layer, is used as the input of the LSTM model. Finally, a detailed empirical comparison of six prediction methods is presented and discussed on a public evaluation dataset. The experiments indicate that the proposed method predicts disk failure with 0.91 precision, 0.91 recall, 0.91 F-measure, and 0.90 MCC for a 10-day prediction horizon. Thus, the proposed algorithm is an efficient approach for predicting HDD failure in supercomputing.
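The class-imbalance problem the abstract describes, far fewer failure records than healthy ones, is typically addressed by oversampling the minority class before training. A generic random-oversampling sketch (a stand-in for the paper's own technique, which is not specified in this abstract):

```python
import random

def oversample_minority(samples, labels, minority_label,
                        target_ratio=1.0, seed=0):
    """Random oversampling: duplicate minority-class samples (with
    replacement) until the minority/majority count reaches
    target_ratio. A generic stand-in for the paper's technique."""
    rng = random.Random(seed)
    minority = [s for s, y in zip(samples, labels) if y == minority_label]
    majority_n = sum(1 for y in labels if y != minority_label)
    need = int(target_ratio * majority_n) - len(minority)
    extra = [rng.choice(minority) for _ in range(max(0, need))]
    new_samples = list(samples) + extra
    new_labels = list(labels) + [minority_label] * len(extra)
    return new_samples, new_labels

# 2 failure records among 8 healthy ones -> balanced to 1:1.
X = [[i] for i in range(10)]
y = [1, 1] + [0] * 8
Xb, yb = oversample_minority(X, y, minority_label=1)
print(sum(yb), len(yb))  # → 8 16
```

More sophisticated variants (e.g. synthesizing new minority samples rather than duplicating) follow the same interface: augment only the failure class, then train the CNN-LSTM on the rebalanced set.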

Keywords: HDD replacement, failure, CNN-LSTM, oversampling, prediction

Procedia PDF Downloads 75