Search results for: mapping algorithm
2237 An Assessment of the Writing Skills of Reflective Essay of Grade 10 Students in Selected Secondary Schools in Valenzuela City
Authors: Reynald Contreras, Shaina Marie Bho, Kate Roan Dela Cruz, Marvin Dela Cruz
Abstract:
This study was conducted to determine the skill level of Grade 10 students in writing a reflective essay in selected secondary schools of Valenzuela City. The research used descriptive, mixed qualitative-quantitative methods to systematically and accurately describe the students' level of writing skills, and a convenience sampling technique to select forty (40) Grade 10 students from each of Polo, Wawang Pulo, and Arkong Bato National High Schools, for a total of one hundred and twenty (120) students. The written reflective essays were assessed using modified rubrics developed from Ruth Culham's 6+1 writing traits. According to the findings of the study, students at Polo and Wawang Pulo National High Schools have low levels of writing skill that need to be developed, or are not proficient, while Arkong Bato National High School achieved a high degree of writing proficiency. Based on the study's findings, the researchers devised a suggested curriculum mapping for intervention activities that would aid in developing and cultivating the writing skills of Grade 10 students.
Keywords: writing skills, reflective essay, intervention activity, 6+1 writing traits, modified rubrics
Procedia PDF Downloads 122
2236 DC/DC Boost Converter Applied to Photovoltaic Pumping System Application
Authors: S. Abdourraziq, M. A. Abdourraziq
Abstract:
One of the most important applications of solar energy systems is water pumping, often used for irrigation or to supply water in the countryside or on private farms. However, cost and efficiency are still a concern, especially under the continual variation of solar radiation and temperature throughout the day. Improving the efficiency of the system components is therefore one of the main ways of reducing the cost. In this paper, we present a detailed definition of each element of a PV pumping system and review the different MPPT algorithms used in the literature. Our system consists of a PV panel, a boost converter, a motor-pump set, and a storage tank.
Keywords: PV cell, converter, MPPT, MPP, PV pumping system
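The abstract surveys MPPT algorithms without showing one; as a hedged illustration, here is a minimal sketch of the widely used perturb-and-observe (P&O) method. The power-voltage curve, step size, and starting voltage below are made-up assumptions, not the paper's panel model:

```python
# Hypothetical sketch of the perturb-and-observe (P&O) MPPT algorithm,
# one of the methods commonly surveyed for PV systems. The toy P-V curve
# has a single maximum near v = 17 V; a real panel model would replace it.

def pv_power(v):
    # Illustrative power-voltage curve (watts), peak power 60 W at 17 V.
    return max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

def perturb_and_observe(v0, step=0.2, iterations=200):
    """Climb the P-V curve: keep perturbing in the direction that raised power."""
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(iterations):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:              # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p                    # ends oscillating around the MPP
```

The characteristic steady-state oscillation around the maximum power point is visible in the final voltage, which hovers within one or two step sizes of the peak.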
Procedia PDF Downloads 158
2235 Assessment of Efficiency of Underwater Undulatory Swimming Strategies Using a Two-Dimensional CFD Method
Authors: Dorian Audot, Isobel Margaret Thompson, Dominic Hudson, Joseph Banks, Martin Warner
Abstract:
In competitive swimming, after dives and turns, athletes perform underwater undulatory swimming (UUS), copying marine mammals' method of locomotion. The body, performing this wave-like motion, accelerates the fluid downstream in its vicinity, generating propulsion with minimal resistance. Through this technique, swimmers can maintain greater speeds than in surface swimming and take advantage of the overspeed granted by the dive (or push-off). Almost all previous work has considered UUS performed at maximum effort. Critical parameters to maximize UUS speed are frequently discussed; however, this does not apply to most races. In only 3 of the 16 individual competitive swimming events are athletes likely to attempt UUS at the greatest possible speed, without regard to the cost of locomotion. In the other cases, athletes will want to control the speed of their underwater swimming, attempting to maximize speed whilst considering energy expenditure appropriate to the duration of the event. Hence, there is a need to understand how swimmers adapt their underwater strategies to optimize speed within the allocated energetic cost. This paper develops a consistent methodology that enables different sets of UUS kinematics to be investigated. These may have different propulsive efficiencies and force generation mechanisms (e.g., force distribution along the body and force magnitude). The developed methodology therefore needs to: (i) provide an understanding of the UUS propulsive mechanisms at different speeds; (ii) investigate the key performance parameters when UUS is not performed solely for maximizing speed; (iii) consistently determine the propulsive efficiency of a UUS technique. The methodology is separated into two distinct parts: kinematic data acquisition and computational fluid dynamics (CFD) analysis.
For the kinematic acquisition, the positions of several joints along the body and their sequencing were obtained either by video digitization or by underwater motion capture (Qualisys system). During data acquisition, the swimmers were asked to perform UUS at a constant depth in a prone position (facing the bottom of the pool) at different speeds: maximum effort, 100m pace, 200m pace and 400m pace. The kinematic data were input to a CFD algorithm employing a two-dimensional Large Eddy Simulation (LES). The algorithm adopted was specifically developed to perform quick unsteady simulations of deforming bodies and is therefore suitable for swimmers performing UUS. Despite its approximations, the algorithm is applied such that simulations are performed with the inflow velocity updated at every time step. It also enables calculation of the resistive forces (total and per segment) and the power input of the modeled swimmer. Validation of the methodology is achieved by comparing the data obtained from the computations with the original data (e.g., sustained swimming speed). This method is applied to the different kinematic datasets and provides data on swimmers' natural responses to pacing instructions. The results show how kinematics affect force generation mechanisms and hence how the propulsive efficiency of UUS varies for different race strategies.
Keywords: CFD, efficiency, human swimming, hydrodynamics, underwater undulatory swimming
Procedia PDF Downloads 219
2234 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning can be applied to fine-tune a large language model (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER
Procedia PDF Downloads 14
2233 Cognitive Semantics Study of Conceptual and Metonymical Expressions in Johnson's Speeches about COVID-19
Authors: Hussain Hameed Mayuuf
Abstract:
The study is an attempt to investigate the conceptual metonymies used in political discourse about COVID-19. It analyzes how the conceptual metonymies in Johnson's speeches about coronavirus are constructed. The study aims to identify how metonymies are relevant to understanding the messages in Boris Johnson's speeches, to find out how conceptual blending theory can help people understand the messages in political speech about COVID-19, and, lastly, to point out which kinds of integration networks are common in political speech. The study is based on the hypotheses that conceptual blending theory is a powerful tool for investigating the intended messages in Johnson's speeches and that different processes of blending networks and conceptual mapping enable listeners to identify the messages in political speech. The study presents a qualitative and quantitative analysis of four speeches about COVID-19 delivered by Boris Johnson. The selected data have been tackled from the cognitive-semantic perspective by adopting Conceptual Blending Theory (CBT) as the model for the analysis. It concludes that CBT is applicable to the analysis of metonymies in political discourse: its mechanisms enable listeners to analyze and understand these speeches, and the listener can identify the hidden messages in Johnson's discourse about COVID-19 by using different conceptual networks. Finally, it is concluded that double-scope networks are the most common type of blending of metonymies in political speech.
Keywords: cognitive, semantics, conceptual, metonymical, Covid-19
Procedia PDF Downloads 128
2232 Electrodermal Activity Measurement Using Constant Current AC Source
Authors: Cristian Chacha, David Asiain, Jesús Ponce de León, José Ramón Beltrán
Abstract:
This work explores and characterizes the behavior of the AFE AD5941 in impedance measurement using an embedded algorithm with a constant current AC source. The main aim of this research is to improve the accuracy of impedance measurements for application in EDA-focused wearable devices. Through comprehensive study and characterization, it has been observed that employing a measurement sequence with a constant current source produces results with increased dispersion but higher accuracy. As a result, this approach leads to a more accurate system for impedance measurement.
Keywords: EDA, constant current AC source, wearable, precision, accuracy, impedance
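As a hedged, library-free illustration of the measurement principle (not the AD5941 firmware), the sketch below recovers an impedance magnitude by synchronously demodulating the voltage produced by a constant-amplitude excitation current. The amplitudes, frequency, sample rate, and the purely resistive load model are all made-up assumptions:

```python
import math

# Illustrative sketch of constant-current AC impedance measurement:
# inject i(t) = I0*sin(wt), measure v(t) across the unknown impedance,
# and lock-in demodulate to get |Z| = |V| / |I0|. Toy signal model only.

def measure_impedance(z_mag, z_phase, i_amp=10e-6, freq=100.0,
                      fs=100_000.0, n=10_000):
    w = 2 * math.pi * freq
    x = y = 0.0
    for k in range(n):
        t = k / fs
        v = z_mag * i_amp * math.sin(w * t + z_phase)  # "measured" voltage
        x += v * math.sin(w * t)                       # in-phase reference
        y += v * math.cos(w * t)                       # quadrature reference
    x, y = 2 * x / n, 2 * y / n                        # demodulated V components
    return math.hypot(x, y) / i_amp                    # |Z| = |V| / |I|
```

Averaging over an integer number of excitation cycles (ten cycles here) is what makes the demodulated components clean; a real AFE performs the same accumulation in hardware.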
Procedia PDF Downloads 107
2231 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting data differ with business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a dataset constructed for each time point that contains all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed in response to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select those forms of genetic algorithm construction that have the greatest possibility of extracting the best solution. For freight delivery management, genetic algorithm schemes are an effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for the multi-objective analysis, which evaluates collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
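The data-validation step above can be sketched concretely. Below is a minimal Grubbs test in pure Python; the critical value normally comes from a t-distribution table for the chosen sample size and confidence level, so here it is passed in as a parameter (the 2.1 used in the usage note is an assumed illustrative value, not the paper's threshold):

```python
import statistics

# Sketch of the (two-sided) Grubbs outlier test used for data cleaning:
# G = max|x_i - mean| / s, compared against a tabulated critical value.

def grubbs_statistic(data):
    mean = statistics.mean(data)
    sd = statistics.stdev(data)          # sample standard deviation
    return max(abs(x - mean) for x in data) / sd

def has_outlier(data, critical_value):
    """True when the most extreme value exceeds the critical bound."""
    return grubbs_statistic(data) > critical_value
```

For example, `has_outlier([10.1, 10.3, 9.9, 10.0, 10.2, 10.1, 25.0], 2.1)` flags the 25.0 reading, while the same series without it passes. The test removes one suspect value per pass, so in practice it is applied iteratively until no outlier remains.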
Procedia PDF Downloads 180
2230 Optimization Process for Ride Quality of a Nonlinear Suspension Model Based on Newton-Euler’s Augmented Formulation
Authors: Mohamed Belhorma, Aboubakar S. Bouchikhi, Belkacem Bounab
Abstract:
This paper addresses the modeling of a double A-arm suspension: a three-dimensional nonlinear model has been developed using the multibody systems formalism. A dynamical study of the responses of the different components was carried out, particularly for the wheel assembly. To validate those results, the system was constructed and simulated in RecurDyn, a professional multibody dynamics simulation software package. The model has then been used as the objective function in an optimization algorithm for ride quality improvement.
Keywords: double A-Arm suspension, multibody systems, ride quality optimization, dynamic simulation
Procedia PDF Downloads 138
2229 Research Methods and Design Strategies to Improve Resilience in Coastal and Estuary Cities
Authors: Irene Perez Lopez
Abstract:
Delta and estuary cities are spaces in constant evolution, incessantly altered by the ever-changing actions of water. Strategies that incorporate comprehensive and integrated approaches to planning and design with water will play a powerful role in defining new types of flood defense. These strategies will encourage more resilient and active urban environments, allowing for new spatial and functional programs. This abstract presents ongoing research in Newcastle, the first urbanized delta in New South Wales (Australia) and the region's second-biggest catchment and estuary. The research methodology is organized in three phases: 1) a projective cartography that analyses maps and data across the region's recorded history, identifying past and present constraints and predicting future conditions; the cartography helps identify worst-case scenarios, revealing the implications of land reclamation that has not considered the confronting evolution of climate change and its conflicts with inhabitation; 2) the cartographic studies identify the areas under threat and form the basis for further interdisciplinary research, complemented by community consultation, to reduce flood risk and increase urban resilience and livability; 3) a speculative or prospective phase of design with water to generate evidence-based guidelines that strengthen the urban resilience of shorelines and flood-prone areas.
Keywords: coastal defense, design, urban resilience, mapping
Procedia PDF Downloads 132
2228 Tool Wear of Aluminum/Chromium/Tungsten Based Coated Cemented Carbide Tools in Cutting Sintered Steel
Authors: Tadahiro Wada, Hiroyuki Hanyu
Abstract:
In this study, to clarify the effectiveness of aluminum/chromium/tungsten-based coated tools for cutting sintered steel, tool wear was experimentally investigated. The sintered steel was turned with (Al60,Cr25,W15)N-, (Al60,Cr25,W15)(C,N)- and (Al64,Cr28,W8)(C,N)-coated cemented carbide tools, coated by the physical vapor deposition (PVD) method. Moreover, the tool wear of the aluminum/chromium/tungsten-based coated tools was compared with that of an (Al,Cr)N-coated tool. Furthermore, to clarify the wear mechanism of the aluminum/chromium/tungsten coating films in cutting sintered steel, scanning electron microscope observation and energy dispersive X-ray spectroscopy mapping analysis were conducted on the worn surface. The following results were obtained: (1) The wear progress of the (Al64,Cr28,W8)(C,N)-coated tool was the slowest among the five coated tools. (2) Adding carbon (C) to the aluminum/chromium/tungsten-based coating film was effective in improving the wear resistance. (3) The main wear mechanism of the (Al60,Cr25,W15)N-, (Al60,Cr25,W15)(C,N)- and (Al64,Cr28,W8)(C,N)-coating films was abrasive wear.
Keywords: cutting, physical vapor deposition coating method, tool wear, tool wear mechanism, (Al, Cr, W)N-coating film, (Al, Cr, W)(C, N)-coating film, sintered steel
Procedia PDF Downloads 381
2227 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem
Authors: C. E. Nugraheni, L. Abednego
Abstract:
This paper is concerned with the minimization of mean tardiness and flow time in a real single machine production scheduling problem. Two variants of genetic algorithms as meta-heuristics, combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated from real-world data from a company. Encouraging results are reported.
Keywords: hyper-heuristics, evolutionary algorithms, production scheduling, meta-heuristic
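The two objectives named above are cheap to evaluate for any candidate sequence, which is what a genetic algorithm's fitness function would do repeatedly. A hedged sketch, with made-up job data and the classic earliest-due-date (EDD) order as the baseline a meta-heuristic would try to improve on:

```python
# Sketch of the objective functions: mean tardiness and mean flow time of
# a job sequence on a single machine. The four jobs below are illustrative.

def evaluate(sequence, processing, due):
    t = 0.0
    flow, tardiness = [], []
    for job in sequence:
        t += processing[job]                 # completion time of this job
        flow.append(t)
        tardiness.append(max(0.0, t - due[job]))
    n = len(sequence)
    return sum(tardiness) / n, sum(flow) / n

processing = {"A": 4, "B": 2, "C": 6, "D": 3}
due = {"A": 5, "B": 3, "C": 14, "D": 9}

# Earliest-due-date (EDD) dispatching rule as a simple baseline sequence:
edd = sorted(processing, key=lambda j: due[j])
```

A genetic algorithm would encode sequences as permutations and use `evaluate` as the fitness; the hyper-heuristic layer then chooses among low-level rules such as EDD or shortest-processing-time.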
Procedia PDF Downloads 381
2226 Design and Implementation of a Hardened Cryptographic Coprocessor with 128-bit RISC-V Core
Authors: Yashas Bedre Raghavendra, Pim Vullers
Abstract:
This study presents the design and implementation of an abstract cryptographic coprocessor, leveraging AMBA (Advanced Microcontroller Bus Architecture) protocols - APB (Advanced Peripheral Bus) and AHB (Advanced High-performance Bus) - to enable seamless integration with the main CPU (central processing unit) and enhance the coprocessor’s algorithm flexibility. The primary objective is to create a versatile coprocessor that can execute various cryptographic algorithms, including ECC (elliptic-curve cryptography), RSA (Rivest–Shamir–Adleman), and AES (Advanced Encryption Standard), while providing a robust and secure solution for modern secure embedded systems. To achieve this goal, the coprocessor is equipped with a tightly coupled memory (TCM) for rapid data access during cryptographic operations. The TCM is placed within the coprocessor, ensuring quick retrieval of critical data and optimizing overall performance. Additionally, the program memory is positioned outside the coprocessor, allowing for easy updates and reconfiguration, which enhances adaptability to future algorithm implementations. Direct links are employed instead of DMA (direct memory access) for data transfer, ensuring faster communication and reducing complexity. The AMBA-based communication architecture facilitates seamless interaction between the coprocessor and the main CPU, streamlining data flow and ensuring efficient utilization of system resources. The abstract nature of the coprocessor allows for easy integration of new cryptographic algorithms in the future. As the security landscape continues to evolve, the coprocessor can adapt and incorporate emerging algorithms, making it a future-proof solution for cryptographic processing. Furthermore, this study explores the addition of custom instructions to the RISC-V ISE (Instruction Set Extension) to enhance cryptographic operations.
By incorporating custom instructions specifically tailored for cryptographic algorithms, the coprocessor achieves higher efficiency and reduced cycles per instruction (CPI) compared to traditional instruction sets. The adoption of the RISC-V 128-bit architecture significantly reduces the total number of instructions required for complex cryptographic tasks, leading to faster execution times and improved overall performance. Comparisons are made with 32-bit and 64-bit architectures, highlighting the advantages of the 128-bit architecture in terms of reduced instruction count and CPI. In conclusion, the abstract cryptographic coprocessor presented in this study offers significant advantages in terms of algorithm flexibility, security, and integration with the main CPU. By leveraging AMBA protocols and employing direct links for data transfer, the coprocessor achieves high-performance cryptographic operations without compromising system efficiency. With its TCM and external program memory, the coprocessor is capable of securely executing a wide range of cryptographic algorithms. This versatility and adaptability, coupled with the benefits of custom instructions and the 128-bit architecture, make it an invaluable asset for secure embedded systems, meeting the demands of modern cryptographic applications.
Keywords: abstract cryptographic coprocessor, AMBA protocols, ECC, RSA, AES, tightly coupled memory, secure embedded systems, RISC-V ISE, custom instructions, instruction count, cycles per instruction
Procedia PDF Downloads 70
2225 Ama de Casa: Gender Division of Labor the Response to Environmental and Economic Constraints, Ecuador
Authors: Tyrus C. Torres, Michael Harris
Abstract:
In a coastal town of Ecuador, the role of women is commonly defined as that of an ama de casa: a woman who works in the house, raises children, and contributes to the community. This project, under the guidance of Dr. Michael Harris of Florida Atlantic University, seeks to understand how the role of the ama de casa provides a secure environment for men and women, how it coexists with economic and environmental constraints, and how the origins of this environment were formed. The coastal community aspects of familia (family), trabajo (work), relación (relationships), machismo (masculinity), feminista (femininity), and the culture of Ecuador define the ways of life in a coastal setting. This ethnographic research project included the following methodologies: environment mapping, interviews, surveys, participant observation, direct and indirect observation, and integration into daily life. Immersion in daily life and building relationships with the local people allowed the documentation of the intricacies of both the cultural and social spheres. The findings of this research offer insight into how culture, economics, and environment can form female and male agency. Our investigation shows that fishermen, laborers, ama de casas, and even students use their occupational routes to create social agency in the face of economic and environmental constraints in Ecuador.
Keywords: Ecuador, ethnography, gender division of labor, gender roles
Procedia PDF Downloads 242
2224 Development of an Implicit Coupled Partitioned Model for the Prediction of the Behavior of a Flexible Slender Shaped Membrane in Interaction with Free Surface Flow under the Influence of a Moving Flotsam
Authors: Mahtab Makaremi Masouleh, Günter Wozniak
Abstract:
This research is part of an interdisciplinary project promoting the design of a light, temporarily installable textile defence system against floods. When river water levels increase abruptly, especially in winter, one can expect massive extra load on a textile protective structure in terms of impact from floating debris and even tree trunks. Estimating this impulsive force on such structures is of great importance, as it can ensure the reliability of the design in critical cases. This fact motivates the numerical analysis of a fluid-structure interaction application comprising a flexible slender-shaped membrane and free-surface water flow, where an accelerated heavy piece of flotsam approaches the membrane. In this context, the analysis of both the behavior of the flexible membrane and its interaction with the moving flotsam is conducted with the finite-element-based explicit and implicit solvers of Abaqus, available as products of the SIMULIA software suite. On the other hand, how free-surface water flow behaves in response to moving structures has been investigated using the finite volume solver of Star-CCM+ from Siemens PLM Software. An automatic communication tool (CSE, the SIMULIA Co-Simulation Engine) and the implementation of an effective partitioned strategy in the form of an implicit coupling algorithm make it possible for the partitioned domains to be interconnected powerfully. The applied procedure ensures stability and convergence in the solution of these complicated problems, albeit at high computational cost; a further complexity of this study stems from the mesh criterion in the fluid domain, where the two structures approach each other. This contribution presents the approaches for establishing a convergent numerical solution and compares the results with experimental findings.
Keywords: co-simulation, flexible thin structure, fluid-structure interaction, implicit coupling algorithm, moving flotsam
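The implicit partitioned strategy described above can be illustrated in miniature: within one coupling step, the two field solvers are called in turn and the interface state is relaxed until it stops changing. The scalar "solvers" and all coefficients below are toy assumptions standing in for the Abaqus and Star-CCM+ domains, not the project's models:

```python
# Conceptual sketch of one implicitly coupled time step in a partitioned
# FSI scheme: sub-iterate fluid -> structure with under-relaxation until
# the interface displacement converges. Both solvers are toy scalar models.

def fluid_solver(displacement):
    # Toy model: interface load decreases as the membrane deflects.
    return 100.0 - 25.0 * displacement

def structure_solver(load):
    # Toy model: linear-elastic response of the membrane.
    return load / 50.0

def coupled_step(d0=0.0, relax=0.5, tol=1e-10, max_iter=100):
    d = d0
    for i in range(max_iter):
        d_new = structure_solver(fluid_solver(d))
        if abs(d_new - d) < tol:           # interface state stopped changing
            return d, i
        d = d + relax * (d_new - d)        # under-relaxed update for stability
    return d, max_iter
```

For these toy models the fixed point is d = 4/3; the under-relaxation factor plays the same stabilizing role that it does in a real strongly coupled FSI loop, where the "added mass" of the water would otherwise make plain fixed-point iteration diverge.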
Procedia PDF Downloads 389
2223 Subjective Evaluation of Mathematical Morphology Edge Detection on Computed Tomography (CT) Images
Authors: Emhimed Saffor
Abstract:
In this paper, the problem of edge detection in digital images is considered. Three methods of edge detection based on a mathematical morphology algorithm were applied to two sets (brain and chest) of CT images: a 3x3 filter for the first method, a 5x5 filter for the second method and a 7x7 filter for the third method, under the MATLAB programming environment. The results of the above-mentioned methods were subjectively evaluated. The results show that these methods are efficient and suitable for medical images, and they can be used for various other applications.
Keywords: CT images, Matlab, medical images, edge detection
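A common morphology-based edge detector of the kind evaluated above is the morphological gradient: dilation minus erosion with a small structuring element. As a hedged sketch (the paper works in MATLAB on CT data; this is a minimal pure-Python analogue on a toy 2D array, with a flat square structuring element):

```python
# Minimal sketch of morphological-gradient edge detection:
# edges = dilation(img) - erosion(img) with a k x k structuring element,
# k = 3 matching the smallest filter size evaluated in the paper.

def _neighborhood(img, r, c, k=3):
    h, w, half = len(img), len(img[0]), k // 2
    return [img[i][j]
            for i in range(max(0, r - half), min(h, r + half + 1))
            for j in range(max(0, c - half), min(w, c + half + 1))]

def morph_gradient(img, k=3):
    # max over the window = grayscale dilation; min = grayscale erosion.
    return [[max(_neighborhood(img, r, c, k)) - min(_neighborhood(img, r, c, k))
             for c in range(len(img[0]))] for r in range(len(img))]
```

On a binary image this yields 1 exactly along region boundaries and 0 in flat areas; larger k (5x5, 7x7) thickens the detected edges, which is the trade-off the subjective evaluation compares.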
Procedia PDF Downloads 338
2222 Optimal Placement of the Unified Power Controller to Improve the Power System Restoration
Authors: Mohammad Reza Esmaili
Abstract:
One of the most important parts of the restoration process of a power network is the synchronization of its subsystems. In this situation, the biggest concern of the system operators will be the reduction of the standing phase angle (SPA) between the endpoints of the two islands. In this regard, the system operators perform various actions and maneuvers so that the synchronization of the subsystems is successfully carried out and the system finally reaches acceptable stability. The most common of these actions include load control, generation control and, in some cases, changing the network topology. Although these maneuvers are simple and common, due to the weak network and extreme load changes, the restoration will be slow. One of the best ways to control the SPA is to use FACTS devices. By applying a soft control signal, these tools can reduce the SPA between two subsystems with more speed and accuracy, and the synchronization process can be done in less time. The unified power flow controller (UPFC), a series-parallel compensator that changes the transmission line power and properly adjusts the phase angle, is the option proposed in this research. With the optimal placement of a UPFC in a power system, in addition to improving the normal conditions of the system, it is expected to be effective in reducing the SPA during power system restoration. Therefore, the presented paper provides an optimal structure to coordinate the three problems of improving the division of subsystems, reducing the SPA, and optimal power flow, with the aim of determining the optimal location of the UPFC and the optimal subsystems. The proposed objective functions in this paper include maximizing the quality of the subsystems, reducing the SPA at the endpoints of the subsystems, and reducing the losses of the power system.
Since contradictions may arise in the simultaneous optimization of the proposed objective functions, the structure of the proposed optimization problem is introduced as a non-linear multi-objective problem, and the Pareto optimization method is used to solve it. The technique proposed to implement the optimization process is the water cycle algorithm (WCA). To evaluate the proposed method, the IEEE 39-bus power system is used.
Keywords: UPFC, SPA, water cycle algorithm, multi-objective problem, pareto
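The Pareto screening at the heart of the multi-objective step can be sketched independently of the water cycle algorithm: a candidate placement is kept only if no other candidate is at least as good in every objective and strictly better in one. The three-objective tuples below are illustrative numbers, not system data:

```python
# Sketch of Pareto (non-dominated) filtering for conflicting objectives.
# Each solution is a tuple of objective values; all objectives minimized,
# e.g. (subsystem-quality penalty, SPA at endpoints, network losses).

def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(other, s)
                       for other in solutions if other is not s)]
```

The optimizer's population is filtered this way each generation; the final front is the set of trade-off placements from which the operator picks one.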
Procedia PDF Downloads 66
2221 ANAC-id - Facial Recognition to Detect Fraud
Authors: Giovanna Borges Bottino, Luis Felipe Freitas do Nascimento Alves Teixeira
Abstract:
This article presents a case study of the National Civil Aviation Agency (ANAC) in Brazil: ANAC-id. ANAC-id is an artificial intelligence algorithm developed for image analysis that recognizes standard images of an unobstructed, upright face without sunglasses, allowing it to identify potential inconsistencies. It combines the YOLO architecture with three Python libraries - face recognition, face comparison, and deep face - providing robust analysis with a high level of accuracy.
Keywords: artificial intelligence, deepface, face compare, face recognition, YOLO, computer vision
Procedia PDF Downloads 156
2220 Mapping the Poor in Ghana: A Geospatial Multidimensional Poverty Index Approach
Authors: Bernard Kumi-Boateng, Joseph Edem Vigbedor, Irene Asante Sakyi
Abstract:
Globally, and especially in developing nations, governments persistently prioritize poverty alleviation and eradication as key objectives. Numerous international organizations also acknowledge the urgent need to reduce poverty levels over the next decade, making poverty reduction a critical global issue. During the past three decades, the government of Ghana has developed and implemented several development policy frameworks as part of its poverty reduction programmes. Statistics on poverty play a key role in reducing and alleviating it; however, in many developing countries such as Ghana, such statistics do not exist, which leaves poverty alleviation interventions scattered and untargeted. A major problem at present is therefore reaching the poor to address their specific needs, and producing poverty maps to assist policy makers responds to this challenge. This research sought to use GIS to map poverty-endemic areas by displaying the spatial dimensions of poverty and to identify the poverty pockets across the country, adopting a multidimensional (non-monetary) poverty index approach. Ten indicators, categorized under three dimensions, were used. Results of the study showed that, across Ghana, a considerable percentage of households are deprived in several non-monetary poverty indicators. Analysis of these indicators revealed wide disparities by region; in general, the proportion of deprived households in the three northern regions differs widely from that of their counterparts in southern Ghana.
Keywords: GIS, multidimensional poverty index, indicator, dimension, poverty
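The index construction behind a study like this typically follows the Alkire-Foster counting method: each household gets a 0/1 deprivation status per indicator, the indicators are weighted, and a household is multidimensionally poor when its weighted score reaches a cutoff. A hedged sketch with made-up weights, cutoff, and households (not the study's ten indicators or data):

```python
# Sketch of an Alkire-Foster style multidimensional poverty index:
# MPI = H x A, where H is the headcount ratio of the multidimensionally
# poor and A is their average weighted deprivation score.

def mpi(households, weights, cutoff=1 / 3):
    scores = [sum(w * h[ind] for ind, w in weights.items()) for h in households]
    poor = [s for s in scores if s >= cutoff]
    headcount = len(poor) / len(households)            # incidence (H)
    intensity = sum(poor) / len(poor) if poor else 0.0  # intensity (A)
    return headcount * intensity

# Illustrative equal weights over four assumed indicators:
WEIGHTS = {"nutrition": 0.25, "schooling": 0.25, "water": 0.25, "electricity": 0.25}
HOUSEHOLDS = [
    {"nutrition": 1, "schooling": 1, "water": 1, "electricity": 1},  # score 1.00
    {"nutrition": 1, "schooling": 0, "water": 0, "electricity": 0},  # score 0.25
    {"nutrition": 1, "schooling": 1, "water": 0, "electricity": 0},  # score 0.50
    {"nutrition": 0, "schooling": 0, "water": 0, "electricity": 0},  # score 0.00
]
```

Computing the index per district and joining the values onto administrative boundaries in a GIS is what produces the poverty map the abstract describes.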
Procedia PDF Downloads 17
2219 Relevant LMA Features for Human Motion Recognition
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
Keywords: discriminative LMA features, features reduction, human motion recognition, random forest
Procedia PDF Downloads 195
2218 A Fast Calculation Approach for Position Identification in a Distance Space
Authors: Dawei Cai, Yuya Tokuda
Abstract:
The market for location-based services (LBS) is expanding, and the acquisition of physical location is the fundamental basis for LBS. GPS, the de facto standard for outdoor localization, does not work well in indoor environments due to the blocking of signals by walls and ceilings. Many techniques have been developed to acquire highly accurate localization in an indoor environment. A triangulation approach is often used to identify the location, but heavy, complex computation is necessary to calculate the position from the distances between the object and several source points. This computation is also time- and power-consuming, which is unfavorable for a mobile device that needs a long working life on battery. To provide a low-power-consumption approach for mobile devices, this paper presents a fast calculation approach that identifies the location of the object without solving simultaneous quadratic equations online. In our approach, we divide location identification into two parts: one offline and the other online. In the offline mode, we run a mapping process that maps the location area to a distance space and find a simple formula that can be used online to identify the location of the object with very light computation. The approach offers a good tradeoff between accuracy and computational cost, so it can be used in smartphones and other mobile devices that need a long working time. To show the performance, simulation results are also provided in the paper.
Keywords: indoor localization, location based service, triangulation, fast calculation, mobile device
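One standard way to realize this offline/online split (our own illustration, not necessarily the paper's exact formula): squaring the distance equations |x - a_i|² = d_i² and subtracting the first one cancels the quadratic terms, leaving a linear system whose pseudoinverse can be precomputed offline, so the online fix is a single small matrix-vector product.

```python
# Offline: precompute a pseudoinverse from the anchor geometry.
# Online: one cheap matrix-vector product per position fix.
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

# --- offline step (done once) ---
A = 2.0 * (anchors[1:] - anchors[0])
A_pinv = np.linalg.pinv(A)
norms = (anchors ** 2).sum(axis=1)

# --- online step (very light computation) ---
def locate(d):
    """Position from measured distances d[i] to each anchor."""
    b = d[0] ** 2 - d[1:] ** 2 + norms[1:] - norms[0]
    return A_pinv @ b

true_pos = np.array([3.0, 4.0])
d = np.linalg.norm(anchors - true_pos, axis=1)
print(locate(d))  # recovers the true position for noise-free distances
```

With noisy distance measurements the same formula gives the least-squares position estimate, still at the cost of one matrix-vector multiply.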
Procedia PDF Downloads 174
2217 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection
Authors: S. Delgado, C. Cerrada, R. S. Gómez
Abstract:
This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges in voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times. These repeated voxels incur costly memory operations that add no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing the triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in Graphics Library Shader Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency, but also its ability to produce accurate voxelizations that are tunnel-free under 26-connectivity. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces.
Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
Keywords: voxelization, GPU acceleration, computer graphics, compute shaders
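The equidistant scan-line idea can be illustrated in 2D. This is a simplified CPU-side analog of the GLSL compute shader described above, not the authors' GPU code; the real method traverses 3D voxel grids, and Gap Detection is omitted here.

```python
# 2D analog of equidistant scan-line traversal: each interior cell of a
# triangle is marked exactly once, so there are no duplicates to discard.
import math

def scanline_cells(tri, step=1.0):
    """Return the unit cells covered by horizontal scan-lines through tri."""
    edges = [(tri[i], tri[(i + 1) % 3]) for i in range(3)]
    ymin = min(p[1] for p in tri)
    ymax = max(p[1] for p in tri)
    cells = set()
    y = ymin + step / 2.0                        # equidistant scan-lines
    while y < ymax:
        xs = []
        for (x1, y1), (x2, y2) in edges:
            if (y1 <= y < y2) or (y2 <= y < y1):  # edge crosses this line
                t = (y - y1) / (y2 - y1)
                xs.append(x1 + t * (x2 - x1))
        if len(xs) >= 2:
            lo, hi = min(xs), max(xs)
            for ix in range(math.ceil(lo), math.floor(hi) + 1):
                cells.add((ix, math.floor(y)))    # each cell added once
        y += step
    return cells

cells = scanline_cells([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)])
print(len(cells))
```

Because the interior is swept line by line, no cell is ever visited twice, which is the property that removes the redundant memory operations on the GPU.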
Procedia PDF Downloads 72
2216 A Time-Reducible Approach to Compute Determinant |I-X|
Authors: Wang Xingbo
Abstract:
Computing determinants of the form |I-X| is fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to compute the determinant |I-X|. The approach is derived from Newton's identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail for the approach. Compared with classical approaches, the new approach is shown to be superior, and its running time naturally decreases as the computation of the eigenvalues of the square matrix becomes more efficient.
Keywords: algorithm, determinant, computation, eigenvalue, time complexity
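The headline fact can be checked numerically (our own illustration, not the paper's Newton's-identities derivation): once the eigenvalues λᵢ of X are known, |I-X| is just the product of (1-λᵢ), so the cost is dominated by the eigenvalue computation.

```python
# Verify that det(I - X) equals the product of (1 - eigenvalue) over the
# eigenvalues of X, for a random square matrix.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 5))

eigvals = np.linalg.eigvals(X)
det_via_eigs = np.prod(1.0 - eigvals).real   # conjugate pairs -> real product

det_direct = np.linalg.det(np.eye(5) - X)
print(det_via_eigs, det_direct)   # the two values agree
```

The same identity gives |I - X^k| and similar determinants for free once the spectrum of X is available, which is why the authors anchor the complexity to eigenvalue computation.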
Procedia PDF Downloads 415
2215 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System
Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa
Abstract:
Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still room for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and a Gaussian Mixture Model (GMM) together with the Expectation Maximization (EM) algorithm for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Investigating the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM parameters estimated in the EM step also improved the convergence rate and system performance. In addition, a relative index is used as a confidence measure when the GMM and VQ identifications contradict each other.
Simulation results on the voxforge.org speech database, carried out in MATLAB, highlight the efficacy of the proposed method compared to earlier work.
Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)
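The GMM speaker-modelling stage can be sketched with scikit-learn (MFCC extraction, VAD, and the VQ/GMM fusion are omitted; the 2-D points below are stand-ins for per-speaker MFCC frames, and this is a generic sketch rather than the authors' MATLAB system). GaussianMixture's default k-means initialization plays the role the abstract assigns to LBG-style clustering before EM.

```python
# One GMM per enrolled speaker; identification picks the model that
# assigns the highest average log-likelihood to the test frames.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
frames_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(200, 2))  # "speaker A"
frames_b = rng.normal(loc=[5.0, 5.0], scale=0.3, size=(200, 2))  # "speaker B"

gmm_a = GaussianMixture(n_components=2, random_state=0).fit(frames_a)
gmm_b = GaussianMixture(n_components=2, random_state=0).fit(frames_b)

test_frames = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))
scores = [gmm_a.score(test_frames), gmm_b.score(test_frames)]
print("identified speaker:", "A" if scores[0] > scores[1] else "B")
```

In a closed-set system this argmax over per-speaker model scores is the classification rule; the MCS in the paper additionally cross-checks it against the VQ result.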
Procedia PDF Downloads 309
2214 Challenges in Early Diagnosis of Enlarged Vestibular Aqueduct (EVA) in Pediatric Population: A Single Case Report
Authors: Asha Manoharan, Sooraj A. O, Anju K. G
Abstract:
Enlarged vestibular aqueduct (EVA) refers to the presence of congenital sensorineural hearing loss with an enlarged vestibular aqueduct. The audiological symptoms of EVA are fluctuating and progressive in nature, and the diagnosis can be confirmed only with radiological evaluation. It is therefore difficult to differentiate EVA from conditions like Meniere's disease, semicircular canal dehiscence, etc., based on audiological findings alone. EVA in adults is easy to identify due to distinct vestibular symptoms. In children, EVA can remain either unidentified or misdiagnosed until the vestibular symptoms are evident. Motor developmental delay, especially involving changes of body alignment, has been reported in the pediatric population with EVA. Radiological evaluation should therefore be made mandatory for young children with fluctuating hearing loss who present with motor developmental delay. This single case study of a baby with Enlarged Vestibular Aqueduct (EVA) primarily aimed to address the following: a) challenges while diagnosing young patients with EVA and fluctuating hearing loss, b) the importance of radiological evaluation in audiological diagnosis in the pediatric population, c) the need to closely monitor hearing, hearing aid performance, and cochlear implant mapping for potential fluctuations in such populations, d) the importance of reviewing developmental and language milestones in very young children with fluctuating hearing loss.
Keywords: enlarged vestibular aqueduct (EVA), motor delay, radiological evaluation, fluctuating hearing loss, cochlear implant
Procedia PDF Downloads 167
2213 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component of mobile construction vehicles; most off-road earth-moving machinery requires large side-mounted hydraulic reservoirs. Reservoir geometry is highly non-uniform, as designers shape it to utilize the space available under the vehicle. Apart from virtual simulation, there is no way to determine how well the oil utilizes the reservoir volume or to validate the design. Computational fluid dynamics (CFD) helps to predict reservoir space utilization through vortex mapping, path-line plots, and dwell time prediction, to make sure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for effective reservoir design is 15 seconds. This paper describes a hydraulic reservoir simulation carried out using the CFD tool AcuSolve with an automated meshing strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., its dwell time fell below 15 seconds, because the oil entry and exit ports were very close together. CFD was used to redefine the port locations for the reservoir so that oil dwell time increases. CFD analysis also informed a baffle design for effective space utilization. The final design proposed through CFD analysis was used for physical validation on the machine.
Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
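The 15-second criterion corresponds to a simple residence-time estimate that can be checked before any CFD run (the numbers below are illustrative assumptions, not values from the paper): the mean dwell time is roughly the oil volume divided by the volumetric flow rate.

```python
# Back-of-envelope mean residence time of oil in the reservoir,
# compared against the 15 s dwell-time acceptance criterion.
def mean_dwell_time_s(oil_volume_l, flow_rate_l_per_s):
    """Mean residence time in seconds = volume / volumetric flow rate."""
    return oil_volume_l / flow_rate_l_per_s

dwell = mean_dwell_time_s(oil_volume_l=60.0, flow_rate_l_per_s=3.0)
print(dwell, dwell >= 15.0)   # 20.0 s, meets the 15 s criterion
```

The CFD simulation refines this global estimate into a local one: short-circuiting between closely spaced ports can make the effective dwell time far shorter than volume/flow suggests, which is exactly what the baseline design exhibited.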
Procedia PDF Downloads 152
2212 Retrospective Reconstruction of Time Series Data for Integrated Waste Management
Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy
Abstract:
The development, operation and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability of every region. The features of such systems have great influence on all of the components of sustainability. To optimize these processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a particular IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require a complex approach to model the outcomes and future risks. This approach should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix, containing all the factors affecting the system in focus together with all their interconnections. The other is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop time series by content analysis.
Keywords: content analysis, factors, integrated waste management system, time series
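An FCM simulation using the two inputs the abstract names can be sketched in a few lines (the factor names and weights below are made up for illustration, not the authors' IWMS model): the state vector is repeatedly propagated through the connection matrix and squashed into [0, 1] with a sigmoid until it stabilizes.

```python
# Minimal Fuzzy Cognitive Map iteration: W[i][j] is the influence of
# factor i on factor j; activations converge to a fixed point.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# factors: e.g. [public awareness, collection efficiency, recycling rate]
W = np.array([[0.0, 0.6, 0.3],
              [0.0, 0.0, 0.5],
              [0.2, 0.0, 0.0]])

state = np.array([0.8, 0.3, 0.1])      # initial activation of each factor
for _ in range(100):
    new_state = sigmoid(W.T @ state)   # sum of incoming influences per factor
    if np.max(np.abs(new_state - state)) < 1e-9:
        break
    state = new_state

print(np.round(state, 3))              # converged factor activations
```

The retrospectively reconstructed time series then serves to calibrate the weights in W so the simulated trajectories match the observed history.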
Procedia PDF Downloads 326
2211 Frontier Dynamic Tracking in the Field of Urban Plant and Habitat Research: Data Visualization and Analysis Based on Journal Literature
Authors: Shao Qi
Abstract:
The article uses the CiteSpace knowledge graph analysis tool to sort and visualize the journal literature on urban plants and habitats in the Web of Science and China National Knowledge Infrastructure databases. Based on a comprehensive interpretation of the visualization results from the various data sources and a knowledge-mapping description of the intrinsic relationships between high-frequency keywords, the research hotspots, processes, and evolution trends in this field are analyzed. Relevant case studies of the hotspot topics are also conducted to explore means of landscape intervention and synthesize theoretical understanding. The results show that (1) from 1999 to 2022, the research direction of urban plants and habitats gradually shifted from plant and animal extinction and biological invasion toward human urban habitat creation, ecological restoration, and ecosystem services; (2) the results of keyword emergence and keyword growth trend analysis show that habitat creation research has exhibited rapid, stable growth since 2017, while ecological restoration has received sustained attention since 2004. Future research on urban plants and habitats in China may focus on habitat creation and ecological restoration.
Keywords: research trends, visual analysis, habitat creation, ecological restoration
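The keyword mapping that underlies CiteSpace-style visualizations boils down to counting co-occurrences. A toy version, with illustrative keyword lists rather than the study's actual corpus:

```python
# Count how often pairs of keywords appear together in the same paper;
# these pair counts become the edge weights of the knowledge map.
from collections import Counter
from itertools import combinations

papers = [
    ["habitat creation", "ecological restoration", "urban plants"],
    ["ecological restoration", "ecosystem services"],
    ["habitat creation", "urban plants"],
]

links = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        links[(a, b)] += 1

print(links.most_common(2))
```

High-frequency pairs form the densely connected clusters that a tool like CiteSpace renders as research hotspots.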
Procedia PDF Downloads 61
2210 Let’s Make Waves – Changing the Landscape for the Solent’s Film Industry
Authors: Roy Hanney
Abstract:
This research study aims to develop an evidential basis to inform strategic development of the film industry in the Solent (south central) region of the UK. The density of the creative industries around the region is driving the growth of jobs. Yet film production, in particular, appears to struggle with field configuration, lacks ecological cohesion, and suffers from underdeveloped ecosystems when compared to other areas bordering the region. Though the sector is thriving, a lack of coordinated leadership results in the continued reproduction of an ill-configured, constricted, and socio-economically filtered workforce, one that struggles to seize strategic opportunities arising as a consequence of the ongoing investment in UK film production around the west of London. Taking a participatory approach, the study seeks to avoid the universalism of place marketing and focus on the situatedness of the region and its specific cultural, social, and economic contexts. The staging of a series of high-profile networking events provided a much-needed field-configuring activity and enabled the capture of the voices of those currently working in the sector. It also provided the opportunity for an exploratory network mapping of the regional creative industries as a value exchange ecosystem. It is understood that a focus on production is not in itself a solution to the challenges faced in the region. There is a need to address issues of access as a counterbalance to skewed representation among the creative workforce; thus, the study also aims to report on opportunities for embedding diversity and inclusion in any strategic solutions.
Keywords: creative, industries, ecosystem, ecology
Procedia PDF Downloads 99
2209 Arabic Handwriting Recognition Using Local Approach
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Optical character recognition (OCR) plays a major role today. It can solve many serious problems and simplify human activities. OCR research dates back to the 1970s, and many solutions have been proposed since, but unfortunately most supported only Latin scripts. This work proposes a system for off-line Arabic handwriting recognition. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we address the normalization problems we encountered. After comparing Arabic handwritten characters and segmentation methods, we introduce our contribution: a segmentation algorithm.
Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM
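The SVM classification phase can be sketched with scikit-learn. This is a hedged stand-in: the bundled digits dataset replaces the segmented Arabic characters/PAWs of the paper, and the RBF kernel choice is our own illustrative assumption.

```python
# Train an SVM classifier on small character images and report held-out
# accuracy, mirroring the classification stage after segmentation.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()   # 8x8 grayscale character images, flattened
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"accuracy: {acc:.3f}")
```

In the actual system, each segment produced by the structural segmentation step would be normalized and featurized before being fed to such a classifier.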
Procedia PDF Downloads 71
2208 Delivery of Patient-Directed Wound Care Via Mobile Application-Based Qualitative Analysis
Authors: Amulya Srivatsa, Gayatri Prakash, Deeksha Sarda, Varshni Nandakumar, Duncan Salmon
Abstract:
Chronic wounds are difficult for patients to manage at home due to their unpredictable healing process. These wounds are associated with increased morbidity and negatively affect physical and mental health. The solution is a mobile application with an algorithm-based checklist that determines the state of the wound based on factors that vary from person to person. Once this information is gathered, the application recommends a plan of care to the user and the subsequent steps to be taken. The mobile application allows users to perform a digital scan of the wound to extract quantitative information regarding wound width, length, and depth, which is then uploaded to the EHR to notify the patient’s provider. This scan utilizes a photo taken by the user, who is prompted appropriately. Furthermore, users enter demographic information and answer multiple-choice and drop-down menus describing the wound state. The proposed solution can save patients unnecessary trips to the hospital for chronic wound care. The next iteration of the application can incorporate AI into the digital scan, refining the quantitative measurements of wound width, length, and depth that are shared with the patient’s provider to allow for more efficient treatment. Ultimately, this product can provide immediate and economical medical advice for patients who suffer from chronic wounds. Research Objectives: The application should be capable of qualitative analysis of a wound and recommend a plan of care to the user. Additionally, the results of the wound analysis should automatically upload to the patient’s EMR. Research Methodologies: The app has two components: the first is a checklist with tabs for varying factors that assists users in the assessment of their skin. Subsequently, the algorithm creates an at-home regimen for patients to follow to manage their wounds. Research Contributions: The app aims to return autonomy to the patient and reduce the number of visits to a physician for chronic wound care. The app also serves to educate patients on how best to care for their wounds.
Keywords: wound, app, qualitative, analysis, home, chronic
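The checklist-to-care-plan mapping can be sketched as a small set of rules (the fields, thresholds, and advice strings below are hypothetical illustrations of the general pattern, not clinical guidance and not the authors' actual algorithm):

```python
# Map checklist answers to an at-home care plan via simple rules;
# a production version would be clinically validated.
def care_plan(answers):
    plan = []
    if answers.get("drainage") == "heavy":
        plan.append("change dressing daily")
    if answers.get("redness_spreading"):
        plan.append("contact provider: possible infection")
    if answers.get("pain_score", 0) >= 7:
        plan.append("contact provider about pain management")
    if not plan:
        plan.append("continue current regimen")
    return plan

print(care_plan({"drainage": "heavy", "redness_spreading": False, "pain_score": 3}))
```

Keeping the rules in one declarative place like this makes the regimen logic auditable and easy to extend as new checklist factors are added.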
Procedia PDF Downloads 67