Search results for: image registration techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8965

6235 Adopting the Community Health Workers Master List Registry for Community Health Workforce in Kenya

Authors: Gikunda Aloise, Mjema Saida, Barasa Herbert, Wanyungu John, Kimani Maureen

Abstract:

Background: The community health workforce (CHW) consists of health care providers at the community level (Level 1) and serves as a bridge between the community and the formal healthcare system. This human resource has enormous potential to extend healthcare services and to ensure that vulnerable, marginalized, and hard-to-reach populations have access to quality healthcare services at the community and primary health facility levels. However, these cadres are neither recognized, remunerated, nor, in most instances, registered in a master list. Management and supervision of CHWs is not easy if their individual demographics, training capacity and incentives are not well documented through a centralized registry. Description: In February 2022, Amref supported the Kenya Ministry of Health in developing a community health workforce database called the Community Health Workers Master List Registry (CHWML), which is hosted in the Kenya Health Information System (KHIS) tracker. The CHW registration exercise was carried out through a sensitization meeting conducted by the County Community Health Focal Person for the Sub-County Community Health Focal Persons and Community Health Assistants, who uploaded information on individual demographics, training undertaken and incentives received by CHVs. Care was taken to ensure compliance with Kenyan laws on the availability and use of personal data as prescribed by the Data Protection Act, 2019 (DPA). Results and lessons learnt: By June 2022, 80,825 CHWs had been registered in the system: 78,174 (96%) CHVs and 2,636 (4%) CHAs. 25,235 (31%) are male, 55,505 (68%) are female and 85 (1%) are transgender. 39,979 (49%) had secondary education and 2,500 (3%) had no formal education. Only 27,641 (34%) received a monthly stipend. 68,436 CHVs (85%) had undergone basic training. However, there is a need to validate the data to align with the current situation in the counties. Conclusions/Next steps: The use of the CHWML will unlock opportunities for building more resilient and sustainable health systems and will inform financial planning, resource allocation, capacity development, and quality service delivery. The MOH will update the CHWML guidelines in adherence to the Data Protection Act, which will inform standard procedures for maintaining and updating the registry and for integrating the community health workforce registry with the HRH system.

Keywords: community health registry, community health volunteers (CHVs), community health workers masters list (CHWML), data protection act

Procedia PDF Downloads 110
6234 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques in ground motion prediction, such as Artificial Neural Networks, Random Forest, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes of 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for selecting this database is the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, as well as usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
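
As a rough illustration of the comparison described above, the sketch below fits a linear baseline and a Random Forest to synthetic magnitude-distance-site data; it is not the authors' model, and the feature set, coefficients and noise level are assumptions chosen only to show the workflow.

```python
# Illustrative sketch (not the authors' model): compare a linear baseline with a
# Random Forest for predicting log-PGA from magnitude, distance and a site proxy.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
mag = rng.uniform(3.0, 5.8, n)        # moment magnitude
dist = rng.uniform(4.0, 500.0, n)     # hypocentral distance, km
vs30 = rng.uniform(200.0, 800.0, n)   # site stiffness proxy, m/s

# Synthetic "truth": magnitude scaling, geometric spreading, site term, noise
log_pga = 0.9 * mag - 1.4 * np.log10(dist) - 0.3 * np.log10(vs30) + rng.normal(0, 0.3, n)

X = np.column_stack([mag, np.log10(dist), np.log10(vs30)])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_pga, test_size=0.25, random_state=0)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.3f}")
```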

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 113
6233 The Effect of Incorporating Animal Assisted Interventions with Trauma Focused Cognitive Behavioral Therapy

Authors: Kayla Renteria

Abstract:

This study explored the role animal-assisted psychotherapy (AAP) can play in treating post-traumatic stress disorder (PTSD) when incorporated into trauma-focused cognitive behavioral therapy (TF-CBT). A review of the literature was performed to show how incorporating AAP could benefit TF-CBT, since this treatment model often presents difficulties, such as client motivation and avoidance of the exposure element of the intervention. In addition, the fluidity of treatment goals during complex trauma cases was explored, as this issue arose in the case study. This study follows the course of treatment of a 12-year-old female presenting with symptoms of PTSD. Treatment consisted of traditional components of the TF-CBT model, with the added elements of AAP to address typical treatment obstacles in TF-CBT. A registered therapy dog worked with the subject in all sessions throughout her treatment. The therapy dog was incorporated into components such as relaxation and coping techniques, narrative therapy techniques, and psychoeducation on the cognitive triangle. Throughout the study, the client’s situation and clinical needs required the therapist to switch goals to focus on current safety and stability. The therapy dog provided support and neurophysiological benefits to the client through AAP during this shift in treatment. The client was assessed quantitatively using the Child PTSD Symptom Scale Self Report for DSM-5 (CPSS-SR-5) before and after therapy and qualitatively through a feedback form given after treatment. The participant showed improvement in CPSS-SR-5 scores, and she reported that the incorporation of the therapy animal improved her therapy. The results of this study show how the use of AAP provided the client with a solid, consistent relationship with the therapy dog that supported her through processing various types of trauma. Implications of the results of treatment and for future research are discussed.

Keywords: animal-assisted therapy, trauma-focused cognitive behavioral therapy, PTSD in children, trauma treatment

Procedia PDF Downloads 198
6232 Secret Security Smart Lock Using Artificial Intelligence Hybrid Algorithm

Authors: Vahid Bayrami Rad

Abstract:

Ever since humans adopted a collective way of life and began to urbanize, security has been considered one of the most important challenges of life. To protect property, locks have always been a practical tool. With the advancement of technology, locks have evolved from mechanical to electric. Surveillance and security systems are among the most widely used application areas of artificial intelligence, and the technologies used in smart anti-theft door handles are currently one of the most promising fields for applying it. Artificial intelligence can learn, calculate, interpret and process data with the help of algorithms and mathematical models and make smart decisions. In this work, an Arduino board is used to process the data.

Keywords: arduino board, artificial intelligence, image processing, solenoid lock

Procedia PDF Downloads 57
6231 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
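
As a loose illustration of the synthetic-to-real idea, the sketch below generates a barcode-like pattern and applies photometric and geometric augmentations that mimic varied shooting conditions; it is not the authors' pipeline, and the specific transforms and parameter ranges are assumptions.

```python
# Illustrative sketch (not the paper's pipeline): synthesize a barcode-like pattern
# and apply augmentations imitating real shooting conditions (rotation, blur,
# brightness and contrast changes).
import numpy as np
from PIL import Image, ImageEnhance, ImageFilter

rng = np.random.default_rng(1)

def synthetic_barcode(width=400, height=120):
    bars = rng.integers(0, 2, size=width // 4)          # random bar pattern
    column = np.repeat(bars, 4) * 255                    # 4-px wide bars
    img = np.tile(column[np.newaxis, :], (height, 1)).astype(np.uint8)
    return Image.fromarray(255 - img)                    # dark bars on white

def augment(img: Image.Image) -> Image.Image:
    img = img.rotate(rng.uniform(-8, 8), expand=True, fillcolor=255)
    img = img.filter(ImageFilter.GaussianBlur(radius=rng.uniform(0.0, 1.5)))
    img = ImageEnhance.Brightness(img).enhance(rng.uniform(0.6, 1.3))
    img = ImageEnhance.Contrast(img).enhance(rng.uniform(0.7, 1.2))
    return img

samples = [augment(synthetic_barcode()) for _ in range(8)]
samples[0].save("augmented_barcode_0.png")
```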

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 147
6230 Information Technology in Assessing Risks and Threats in the Transition of the Brand to the Digital Environment

Authors: Spanova Yerkezhan, Amantay Ayan, Alimzhanova Laura

Abstract:

This article discusses the concept of rebranding and its relationship to cybersecurity. Rebranding is the process of changing the appearance and image of a company or organization in order to appeal to new customers or change the perception of the company. It can be a powerful tool for businesses looking to renew their reputation or expand into new markets. However, because companies in today's digital age increasingly rely on technology and the internet to conduct business, rebranding can also present significant cybersecurity risks. This is because a rebranding effort can create new vulnerabilities for companies, particularly in terms of their online presence. This article explores the potential hazards associated with rebranding and provides recommendations for mitigating those risks. It also highlights the importance of considering cybersecurity in the rebranding process and how it can be integrated into the overall strategy for a successful and secure rebranding.

Keywords: rebranding, cybersecurity, cyberattack, logo, vulnerability

Procedia PDF Downloads 149
6229 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach

Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik

Abstract:

Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with the processing, analysis and interpretation of retroreflective marker data. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: In this study, the participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons over a 20 s interval. The vibration stimuli caused the human postural system to settle into a new pseudo-steady state. Vibration frequencies were 20, 60 and 80 Hz. The participants' body segments - head, shoulders, hips, knees, ankles and little fingers - were marked with 12 retroreflective markers. Marker positions were scanned by the six-camera system BTS SMART DX. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The Method of Developed Statokinesigram Trajectory was used to process the measured data. Regression analysis of developed statokinesigram trajectory (DST) data and retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for marker trajectories were identified for all body segments. Head marker trajectories reached the maximal value and ankle marker trajectories the minimal value of the scaling coefficient. Hip, knee and ankle markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in head and shoulder marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall response of the human postural system to vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
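
To make the regression step concrete, the sketch below estimates a scaling coefficient λ between a stand-in DST signal and two stand-in marker trajectories by least squares; the signals are synthetic and only illustrate the calculation, not the measured data.

```python
# Illustrative sketch (synthetic signals, not the measured data): least-squares
# estimate of the scaling coefficient lambda between a stand-in DST and two
# stand-in marker trajectories (DMT), with a zero intercept.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 60, 6000)                                 # 60 s recorded at 100 Hz
dst = np.cumsum(np.abs(rng.normal(0.0, 1.0, t.size)))        # stand-in DST
dmt_head = 1.8 * dst + rng.normal(0.0, 5.0, t.size)          # head marker DMT
dmt_ankle = 0.4 * dst + rng.normal(0.0, 5.0, t.size)         # ankle marker DMT

for name, dmt in [("head", dmt_head), ("ankle", dmt_ankle)]:
    lam = float(dst @ dmt / (dst @ dst))                     # regression slope through origin
    print(f"lambda({name}) = {lam:.2f}")
```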

Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data

Procedia PDF Downloads 331
6228 Satisfaction of International Tourists during Their Visit to Bangkok, Thailand

Authors: Bovornluck Kuosuwan, Kevin Wongleedee

Abstract:

The purpose of this research was to study the level of satisfaction of international tourists in five important areas: satisfaction with visited tourist destinations, satisfaction with tourist images, satisfaction with value for money, satisfaction with service quality, and satisfaction compared with their expectations. A probability random sample of 200 inbound tourists was utilized. A questionnaire was used to collect the data, and short in-depth interviews were also used to obtain their opinions about the positive and negative evaluations of their experience travelling in Thailand. The findings revealed that the majority of respondents had a medium level of satisfaction. When examined in detail, the areas of satisfaction can be ranked from highest to lowest according to the mean average as follows: visiting tourist destinations, expectations, service quality, tourist image, and value for money.

Keywords: inbound tourists, satisfaction, Thailand, international tourists

Procedia PDF Downloads 312
6227 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory

Authors: Roy. H. A. Lindelauf

Abstract:

Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domains; think, for instance, of resource-limited interdiction actions that maximally delay the completion times of a weapons project. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and there are players, each of whom is capable of executing a subset of the tasks. Additionally, task inter-dependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
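
As a toy illustration of the multi-glove setting mentioned above, the sketch below computes Shapley values for a three-player glove game by enumerating orderings; the player sets and the reading of the values as monitoring priorities are assumptions made purely for illustration.

```python
# Illustrative sketch: Shapley values for a small glove game as one possible metric
# for allocating monitoring effort across agents. Players 1 and 2 hold left gloves,
# player 3 holds a right glove; a coalition's value is its number of matched pairs.
from itertools import permutations

players = [1, 2, 3]
left, right = {1, 2}, {3}

def value(coalition):
    return min(len(coalition & left), len(coalition & right))

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    seen = set()
    for p in order:
        shapley[p] += value(seen | {p}) - value(seen)   # marginal contribution
        seen.add(p)
for p in shapley:
    shapley[p] /= len(orders)

print(shapley)   # the scarce right-glove holder receives the largest share
```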

Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques

Procedia PDF Downloads 127
6226 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression

Authors: J. S. Saini, P. P. K. Sandhu

Abstract:

The multi-hop nature of wireless mesh networks and the rapid growth of throughput demands result in multi-channel and multi-radio structures in mesh networks, but the main problem of co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a broad collection of networking technologies and techniques that guarantee the ability of a network to make available desired services with predictable results. QoS can be applied at a network interface, to the performance of a specific server or router, or to specific applications. Due to interference among various transmissions, QoS routing in multi-hop wireless networks is a formidable task, since in a multi-channel wireless network two transmissions using the same channel may interfere with each other. This paper considers the Destination-Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel–Ziv–Welch (LZW) based lossless data compression and intra-cluster data aggregation to enhance the communication between the source and the destination. The use of clustering makes it possible to aggregate multiple packets and to locate a single route using the clusters, improving intra-cluster data aggregation. The use of LZW-based lossless data compression reduces the data packet size and hence consumes less energy, thus increasing the network QoS. The MATLAB tool has been used to evaluate the effectiveness of the proposed technique. The comparative analysis has shown that the proposed technique outperforms the existing techniques.
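
Since the scheme relies on LZW as its compression step, a minimal LZW encoder is sketched below purely for illustration; the paper's own implementation is in MATLAB and is not reproduced here.

```python
# Illustrative sketch of LZW lossless compression as assumed to run before
# intra-cluster aggregation (not the authors' MATLAB code).
def lzw_compress(data: bytes) -> list[int]:
    """Return the list of dictionary codes for the input byte string."""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            codes.append(dictionary[w])
            dictionary[wc] = next_code        # grow the dictionary with the new phrase
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

payload = b"sensor-reading,sensor-reading,sensor-reading"
codes = lzw_compress(payload)
print(len(payload), "bytes ->", len(codes), "codes")
```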

Keywords: WMNs, QoS, flooding, collision avoidance, LZW, congestion control

Procedia PDF Downloads 323
6225 E-Learning Recommender System Based on Collaborative Filtering and Ontology

Authors: John Tarus, Zhendong Niu, Bakhti Khadidja

Abstract:

In recent years, e-learning recommender systems have attracted great attention as a solution to the problem of information overload in e-learning environments and for providing relevant recommendations to online learners. E-learning recommenders continue to play an increasing educational role in aiding learners to find appropriate learning materials to support the achievement of their learning goals. Although general recommender systems have recorded significant success in solving the problem of information overload in e-commerce domains and providing accurate recommendations, e-learning recommender systems, on the other hand, still face issues arising from differences in learner characteristics such as learning style, skill level and study level. Conventional recommendation techniques such as collaborative filtering and content-based filtering deal with only two types of entities, namely users and items with their ratings. These conventional recommender systems do not take the learner characteristics into account in their recommendation process. Therefore, conventional recommendation techniques cannot make accurate and personalized recommendations in an e-learning environment. In this paper, we propose a recommendation technique combining collaborative filtering and ontology to recommend personalized learning materials to online learners. The ontology is used to incorporate the learner characteristics into the recommendation process alongside the ratings, while collaborative filtering predicts ratings and generates recommendations. Furthermore, ontological knowledge is used by the recommender system at the initial stages, in the absence of ratings, to alleviate the cold-start problem. Evaluation results show that our proposed recommendation technique outperforms collaborative filtering on its own in terms of personalization and recommendation accuracy.
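
The sketch below illustrates one way such a hybrid could be wired together: a user-based collaborative filter whose neighbour similarity is blended with an ontology-derived similarity of learner characteristics. The matrices, the blending weight and the fallback behaviour are assumptions for illustration, not the authors' system.

```python
# Illustrative sketch: user-based collaborative filtering whose similarity is blended
# with an ontology-derived similarity of learner characteristics.
import numpy as np

ratings = np.array([              # learners x learning materials, 0 = unrated
    [5, 4, 0, 1],
    [4, 0, 4, 1],
    [1, 1, 5, 4],
])
learner_profile_sim = np.array([  # assumed ontology-based learner similarity
    [1.0, 0.8, 0.2],
    [0.8, 1.0, 0.3],
    [0.2, 0.3, 1.0],
])

def cosine(u, v):
    mask = (u > 0) & (v > 0)      # only co-rated items
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] / (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

alpha = 0.6                       # weight of rating-based similarity
n_users = ratings.shape[0]
sim = np.zeros((n_users, n_users))
for i in range(n_users):
    for j in range(n_users):
        sim[i, j] = alpha * cosine(ratings[i], ratings[j]) + (1 - alpha) * learner_profile_sim[i, j]

def predict(user, item):
    neighbours = [v for v in range(n_users) if v != user and ratings[v, item] > 0]
    if not neighbours:
        return 0.0                # a cold start would fall back to ontology knowledge only
    w = np.array([sim[user, v] for v in neighbours])
    r = np.array([ratings[v, item] for v in neighbours])
    return float(w @ r / w.sum())

print(predict(0, 2))              # predicted rating of material 2 for learner 0
```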

Keywords: collaborative filtering, e-learning, ontology, recommender system

Procedia PDF Downloads 355
6224 Management of Cultural Heritage: Bologna Gates

Authors: Alfonso Ippolito, Cristiana Bartolomei

Abstract:

A growing demand is felt today for realistic 3D models enabling the cognition and popularization of historical-artistic heritage. The evaluation and preservation of cultural heritage is inextricably connected with the innovative processes of gaining, managing, and using knowledge. The development and perfecting of techniques for acquiring and elaborating photorealistic 3D models have made them pivotal elements for popularizing information about objects on the scale of architectonic structures.

Keywords: cultural heritage, databases, non-contact survey, 2D-3D models

Procedia PDF Downloads 408
6223 Sustainability in Retaining Wall Construction with Geosynthetics

Authors: Sateesh Kumar Pisini, Swetha Priya Darshini, Sanjay Kumar Shukla

Abstract:

This paper presents a research study on sustainability in the construction of retaining walls using geosynthetics. Sustainable construction is a way for the building and infrastructure industry to move towards achieving sustainable development, taking into account environmental, socioeconomic and cultural issues. Geotechnical engineering, being very resource intensive, warrants an environmental sustainability study, but a quantitative framework for assessing the sustainability of geotechnical practices, particularly at the planning and design stages, does not exist. In geotechnical projects, the major economic issues to be addressed are the design and construction of stable slopes and retaining structures within space constraints. In this paper, quantitative indicators for assessing the environmental sustainability of a retaining wall with geosynthetics are compared with those of a conventional concrete retaining wall through life cycle assessment (LCA). Geosynthetics can make a real difference in sustainable construction techniques and contribute to development, in developing countries in particular. Their imaginative application can result in considerable cost savings over the use of conventional designs and materials. The acceptance of geosynthetics in reinforced retaining wall construction has been triggered by a number of factors, including aesthetics, reliability, simple construction techniques, good seismic performance, and the ability to tolerate large deformations without structural distress. A retaining wall reinforced with geosynthetics is a cost-effective and eco-friendly solution compared with traditional concrete retaining wall construction. This paper presents an analysis of the theme of sustainability applied to the design and construction of a traditional concrete retaining wall and presents a cost-effective and environmentally sound solution using geosynthetics.

Keywords: sustainability, retaining wall, geosynthetics, life cycle assessment

Procedia PDF Downloads 2043
6222 Methylene Blue Removal Using NiO nanoparticles-Sand Adsorption Packed Bed

Authors: Nedal N. Marei, Nashaat Nassar

Abstract:

Many treatment techniques have been used to remove soluble pollutants from wastewater, such as dyes and metal ions, which can be found in large amounts in the used water of the textile and tannery industries. The effluents from these industries are complex, containing a wide variety of dyes and other contaminants, such as dispersants, acids, bases, salts, detergents, humectants, oxidants, and others. These techniques can be divided into physical, chemical, and biological methods. Adsorption has been developed as an efficient method for the removal of heavy metals from contaminated water and soil. It is now recognized as an effective method for the removal of both organic and inorganic pollutants from wastewaters. Nanosized materials are new functional materials, which offer a high surface area and have emerged as effective adsorbents. Nano alumina is one of the most important ceramic materials, widely used as an electrical insulator, presenting exceptionally high resistance to chemical agents, as well as giving excellent performance as a catalyst for many chemical reactions, in microelectronics, membrane applications, and water and wastewater treatment. In this study, methylene blue (MB) dye has been used as a model dye of textile wastewater in order to prepare a synthetic MB wastewater. NiO nanoparticles were added in small percentages to the sand packed-bed adsorption columns to remove the MB from the synthetic textile wastewater. Moreover, different parameters have been evaluated: the flow rate of the synthetic wastewater, the pH, the height of the bed, and the percentage of NiO relative to sand in the packed material. Different mathematical models were employed to find the model that best describes the experimental data and to help analyze the mechanism of MB adsorption. This study will provide a good understanding of dye adsorption using metal oxide nanoparticles in a classical sand bed.
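
The abstract does not name the breakthrough models that were fitted, so the sketch below uses the Thomas model, a common choice for packed-bed adsorption columns, fitted to synthetic C/C0 data with scipy; the model choice, parameter values and data are assumptions for illustration only.

```python
# Illustrative sketch: fitting the Thomas model (an assumed choice; the abstract
# does not name the models used) to synthetic breakthrough data C/C0 versus time.
import numpy as np
from scipy.optimize import curve_fit

def thomas(t, k_th, q0, m=5.0, Q=0.01, c0=50.0):
    # C/C0 = 1 / (1 + exp(k_th*q0*m/Q - k_th*c0*t)); m = bed mass (g), Q = flow (L/min)
    return 1.0 / (1.0 + np.exp(k_th * q0 * m / Q - k_th * c0 * t))

t = np.linspace(0.0, 300.0, 60)                               # time, min
rng = np.random.default_rng(2)
observed = np.clip(thomas(t, 4e-4, 8.0) + rng.normal(0, 0.02, t.size), 0.0, 1.0)

(k_fit, q_fit), _ = curve_fit(thomas, t, observed, p0=[1e-4, 5.0])
print(f"k_Thomas = {k_fit:.2e} L/(mg*min), q0 = {q_fit:.2f} mg/g")
```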

Keywords: adsorption, column, nanoparticles, methylene blue

Procedia PDF Downloads 249
6221 Analyzing Use of Figurativeness, Visual Elements, Allegory, Scenic Imagery as Support System in Punjabi Contemporary Theatre for Escaping Censorship

Authors: Shazia Anwer

Abstract:

This paper discusses an unusual form of resistance in theatre against the censorship board in Pakistan. The atypical approach to dramaturgy has created massive space for performers and audiences to integrate and communicate. Social and religious absolutes create suffocation in Pakistani society; strict control over all fine and performing arts has made art political, and contemporary dramatists have started an amalgamated theatre to avoid censorship. Contemporary Punjabi theatre techniques are directly dependent on human cognition. The idea of indirect thought processing is not unique but dependent on spectators. The paper provides an account of these techniques and their specific use for conveying particular messages to audiences. For the dramaturge of today, theatre space is an expression representing a linguistic formulation that includes qualities of experimental and non-traditional use of classical theatrical space in the context of fulfilling the concept of open theatre. The paper explains the transformation of the theatrical experience into an event where the actor and the audience co-exist and co-experience the dramatic event. The denial of the existence of the fourth wall made two-way communication possible. The paper also elaborates on how previously marginalized genres such as naach, jugat and miras are extensively included to counter the censorship board. Figurativeness, visual elements, allegory and scenic imagery are the basic support system for contemporary Punjabi theatre. The body of the actor is used as a source of non-verbal communication and as an escape from traditional theatrical space, which by every means has every element that could be controlled and reprimanded by the controlling authority.

Keywords: communication, Punjabi theatre, figurativeness, censorship

Procedia PDF Downloads 124
6220 Cutting Plane Methods for Integer Programming: NAZ Cut and Its Variations

Authors: A. Bari

Abstract:

Integer programming is a branch of mathematical programming in operations research in which some or all of the variables are required to be integer valued. Various cuts have been used to solve these problems. We have also developed cuts, known as the NAZ cut and the A-T cut, to solve integer programming problems. These cuts are used to reduce the feasible region and then reach the optimal solution in a minimum number of steps.
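
The abstract does not spell out how the NAZ or A-T cuts are constructed, so the sketch below only illustrates the generic cutting-plane loop they belong to, using a hand-derived Chvatal-Gomory rounding cut on a two-variable example: solve the LP relaxation, add a valid inequality that removes the fractional optimum, and re-solve.

```python
# Illustrative sketch of the generic cutting-plane idea (not the NAZ or A-T cut).
from scipy.optimize import linprog

# maximize x + y  subject to  2x + 2y <= 3, x, y >= 0, x, y integer
c = [-1.0, -1.0]                        # linprog minimizes, so negate the objective
A = [[2.0, 2.0]]
b = [3.0]

relaxed = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print("LP relaxation optimum:", relaxed.x, "objective:", -relaxed.fun)   # fractional

# Chvatal-Gomory cut: (2x + 2y <= 3) / 2  ->  x + y <= 1.5  ->  x + y <= 1
A_cut = A + [[1.0, 1.0]]
b_cut = b + [1.0]
tightened = linprog(c, A_ub=A_cut, b_ub=b_cut, bounds=[(0, None), (0, None)])
print("after one cut:", tightened.x, "objective:", -tightened.fun)       # integral
```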

Keywords: integer programming, NAZ cut, A-T cut, cutting plane method

Procedia PDF Downloads 349
6219 The Effect of Tip Parameters on Vibration Modes of Atomic Force Microscope Cantilever

Authors: Mehdi Shekarzadeh, Pejman Taghipour Birgani

Abstract:

In this paper, the effect of the mass and height of the tip on the flexural vibration modes of an atomic force microscope (AFM) rectangular cantilever is analyzed. A closed-form expression for the sensitivity of the vibration modes is derived using the relationship between the resonant frequency and the contact stiffness of the cantilever and the sample. Each mode has a different sensitivity to variations in surface stiffness. This sensitivity directly controls the image resolution. It is found that an AFM cantilever is more sensitive when the mass of the tip is lower, and that the first mode is the most sensitive mode. Also, the effect of changes in tip height on the flexural sensitivity is negligible.
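
The sketch below reproduces the qualitative trend with a much cruder model than the paper's closed-form modal analysis: a single-degree-of-freedom lumped approximation in which the resonance frequency depends on cantilever stiffness, contact stiffness and tip mass, and the sensitivity is taken as the finite-difference derivative of frequency with respect to contact stiffness. All numerical values are assumptions for illustration.

```python
# Illustrative lumped-parameter sketch (not the paper's beam model): a heavier tip
# lowers the sensitivity of the resonance frequency to contact stiffness.
import numpy as np

k_c = 40.0            # effective cantilever stiffness, N/m (assumed)
m_eff = 1.0e-11       # effective cantilever mass, kg (assumed)

def frequency(k_s, m_tip):
    return np.sqrt((k_c + k_s) / (m_eff + m_tip)) / (2 * np.pi)

def sensitivity(k_s, m_tip, dk=1e-3):
    return (frequency(k_s + dk, m_tip) - frequency(k_s, m_tip)) / dk

k_s = 100.0           # sample contact stiffness, N/m (assumed)
for m_tip in (0.0, 0.5e-11, 1.0e-11):
    print(f"tip mass {m_tip:.1e} kg -> df/dk = {sensitivity(k_s, m_tip):.3e} Hz/(N/m)")
```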

Keywords: atomic force microscope, AFM, vibration analysis, flexural vibration, cantilever

Procedia PDF Downloads 374
6218 The Effective Use of the Network in the Distributed Storage

Authors: Mamouni Mohammed Dhiya Eddine

Abstract:

This work aims at studying the exploitation of high-speed networks of clusters for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has been essentially developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we are addressing is: do high-speed networks of clusters fit the requirements of a transparent, efficient and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed networks of clusters were designed to optimize communications between different nodes of a parallel application. We study their utilization in a very different context, storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS or LUSTRE). Our experimental study based on the usage of the GM programming interface of MYRINET high-speed networks for distributed storage raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, client-server models that are used for distributed storage have specific requirements on message control and event processing, which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required. Data transfer issues need an adaptation of the operating system. We detail several propositions for network programming interfaces which make their utilization easier in the context of distributed storage. The integration of a flexible processing of data transfer in the new programming interface MYRINET/MX is finally presented. Performance evaluations show that its usage in the context of both storage and other types of applications is easy and efficient.

Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface

Procedia PDF Downloads 205
6217 2D Point Clouds Features from Radar for Helicopter Classification

Authors: Danilo Habermann, Aleksander Medella, Carla Cremon, Yusef Caceres

Abstract:

This paper aims to analyze the ability of 2D point cloud features to classify different models of helicopters using radar. This method does not need to estimate the blade length, the number of blades of the helicopter, or the period of their micro-Doppler signatures. It is also not necessary to generate spectrograms (or any other image based on the time and frequency domains). This work transforms a radar return signal into a 2D point cloud and extracts features from it. Three classifiers are used to distinguish 9 different helicopter models in order to analyze the performance of the features used in this work. The high accuracy obtained with each of the classifiers demonstrates that 2D point cloud features are very useful for classifying helicopters from radar signals.
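
As a generic illustration of the approach, the sketch below summarizes a 2D point cloud with a handful of spread and extent statistics and evaluates three off-the-shelf classifiers on synthetic clouds; the feature set, the classifiers and the toy data are assumptions, not the paper's choices.

```python
# Illustrative sketch: simple 2D point cloud shape features fed to supervised classifiers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def cloud_features(points):
    """points: (N, 2) array -> spread, anisotropy and extent features."""
    centred = points - points.mean(axis=0)
    cov = np.cov(centred.T)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    extent = points.max(axis=0) - points.min(axis=0)
    return [eigvals[0], eigvals[1], eigvals[1] / (eigvals[0] + 1e-9), extent[0], extent[1]]

def synthetic_cloud(label):
    """Toy stand-in for radar returns of two target classes."""
    n = rng.integers(80, 120)
    scale = (1.0, 0.3) if label == 0 else (0.6, 0.6)
    return rng.normal(0.0, scale, size=(n, 2))

X, y = [], []
for label in (0, 1):
    for _ in range(100):
        X.append(cloud_features(synthetic_cloud(label)))
        y.append(label)
X, y = np.array(X), np.array(y)

for name, clf in [("random forest", RandomForestClassifier(random_state=0)),
                  ("SVM", SVC()),
                  ("k-NN", KNeighborsClassifier())]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```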

Keywords: helicopter classification, point clouds features, radar, supervised classifiers

Procedia PDF Downloads 204
6216 High-Speed Particle Image Velocimetry of the Flow around a Moving Train Model with Boundary Layer Control Elements

Authors: Alexander Buhr, Klaus Ehrenfried

Abstract:

Trackside induced airflow velocities, also known as slipstream velocities, are an important criterion for the design of high-speed trains. The maximum permitted values are given by the Technical Specifications for Interoperability (TSI) and have to be checked in the approval process. For train manufacturers it is of great interest to know in advance how new train geometries would perform in TSI tests. The Reynolds number in moving model experiments is lower compared to full scale. In particular, the limited model length leads to a thinner boundary layer at the rear end. The hypothesis is that the boundary layer rolls up to characteristic flow structures in the train wake, in which the maximum flow velocities can be observed. The idea is to enlarge the boundary layer using roughness elements at the train model head so that the ratio between the boundary layer thickness and the car width at the rear end is comparable to that of a full-scale train. This may lead to similar flow structures in the wake and better prediction accuracy for TSI tests. In this case, the design of the roughness elements is limited by the moving model rig. Small rectangular roughness shapes are used to get a sufficient effect on the boundary layer, while the elements are robust enough to withstand the high accelerating and decelerating forces during the test runs. For this investigation, High-Speed Particle Image Velocimetry (HS-PIV) measurements on an ICE3 train model have been realized in the moving model rig of the DLR in Göttingen, the so-called tunnel simulation facility Göttingen (TSG). The flow velocities within the boundary layer are analysed in a plane parallel to the ground. The height of the plane corresponds to a test position in the EN standard (TSI). Three different shapes of roughness elements are tested. The boundary layer thickness and displacement thickness as well as the momentum thickness and the form factor are calculated along the train model. Conditional sampling is used to analyse the size and dynamics of the flow structures at the time of maximum velocity in the wake behind the train. As expected, larger roughness elements increase the boundary layer thickness and lead to larger flow velocities in the boundary layer and in the wake flow structures. The boundary layer thickness, displacement thickness and momentum thickness are increased by using larger roughness, especially when applied at a height close to the measuring plane. The roughness elements also cause high fluctuations in the form factors of the boundary layer. Behind the roughness elements, the form factors rapidly approach constant values. This indicates that the boundary layer, while growing slowly along the second half of the train model, has reached a state of equilibrium.
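
The integral quantities named above follow directly from the measured velocity profile; the sketch below evaluates displacement thickness, momentum thickness and form factor for a 1/7-power-law profile used as a stand-in for PIV data (the profile and numbers are illustrative assumptions, not DLR results).

```python
# Illustrative sketch: boundary layer integral quantities from a velocity profile u(y).
import numpy as np

U_inf = 1.0                      # free-stream velocity in the train-fixed frame
delta = 0.05                     # boundary layer thickness, m (assumed)
y = np.linspace(0.0, delta, 200)
u = U_inf * (y / delta) ** (1.0 / 7.0)           # stand-in for a measured PIV profile

ratio = u / U_inf
delta_star = np.trapz(1.0 - ratio, y)            # displacement thickness
theta = np.trapz(ratio * (1.0 - ratio), y)       # momentum thickness
H = delta_star / theta                           # form factor

print(f"delta* = {delta_star*1e3:.2f} mm, theta = {theta*1e3:.2f} mm, H = {H:.2f}")
```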

Keywords: boundary layer, high-speed PIV, ICE3, moving train model, roughness elements

Procedia PDF Downloads 293
6215 Assessing Pain Using Morbid Motion Monitor System in the Pain Management of Nurse Practitioner

Authors: Mohammad Reza Dawoudi

Abstract:

With the increasing number of patients suffering from chronic pain, several methods for evaluating chronic pain have been suggested. Morbid motion has been defined as the rate of pain, and it is linked with various co-morbid conditions. This study provides a summary of procedures useful for performing direct behavioral observation in hospital settings. We describe the need for and usefulness of comprehensive “morbid motion” observations; provide a primer on the identification, definition, and assessment of morbid behaviors; and outline and discuss specific statistical procedures, including formulating referral motions and describing and conducting the observation. We also provide practical devices for observing and for analyzing the obtained information into a report that guides clinical intervention.

Keywords: assessing pain, DNA modeling, image matching technique, pain scale

Procedia PDF Downloads 387
6214 TessPy – Spatial Tessellation Made Easy

Authors: Jonas Hamann, Siavash Saki, Tobias Hagen

Abstract:

Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps and gaps is called tessellation. It helps in understanding spatial structure and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, like square grids or hexagon grids, are suitable for addressing pure geometry problems, they cannot take the unique characteristics of different subareas into account. However, irregular tessellation methods allow the borders between subareas to be defined more realistically based on urban features like a road network or Points of Interest (POI). Even though Python is one of the most used programming languages when it comes to spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare different techniques. To close this gap, we propose TessPy, an open-source Python package, which combines all the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy represent the five different tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. By using regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. Irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source and provided by OpenStreetMap. These data can be easily extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, is discussed. All dependencies can be installed using conda or pip; however, the former is recommended.
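
A minimal usage sketch is shown below, assuming an interface along the lines of the published TessPy documentation; the class, method and parameter names are reproduced from memory rather than from the paper and should be checked against the current release.

```python
# Minimal usage sketch, assuming a TessPy interface along these lines.
from tesspy import Tessellation

city = Tessellation("Tallinn")                    # city polygon fetched from OpenStreetMap

squares = city.squares(resolution=14)             # regular grid; resolution sets tile size
hexagons = city.hexagons(resolution=8)            # H3 hexagon grid

# Irregular methods use OSM features (e.g., POIs) to shape the tiles
voronoi = city.voronoi(poi_categories=["amenity"], n_polygons=300)

print(len(squares), len(hexagons), len(voronoi))  # GeoDataFrames of tiles
```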

Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies

Procedia PDF Downloads 113
6213 Nonlinear Aerodynamic Parameter Estimation of a Supersonic Air to Air Missile by Using Artificial Neural Networks

Authors: Tugba Bayoglu

Abstract:

Aerodynamic parameter estimation is very crucial in the missile design phase, since an accurate, high-fidelity aerodynamic model is required for designing a high-performance and robust control system, developing high-fidelity flight simulations, and verifying computational and wind tunnel test results. However, in the literature, there are not enough missile aerodynamic parameter identification studies, for three main reasons: (1) most air-to-air missiles cannot fly at constant speed, (2) missile flight test numbers and flight durations are much smaller than those of fixed-wing aircraft, and (3) the variation of missile aerodynamic parameters with respect to Mach number is higher than that of fixed-wing aircraft. In addition to these challenges, identification of aerodynamic parameters at high wind angles by using classical estimation techniques brings another difficulty to the estimation process. The reason is that most estimation techniques require employing polynomials or splines to model the behavior of the aerodynamics. However, for missiles with a large variation of aerodynamic parameters with respect to flight variables, the order of the proposed model increases, which brings computational burden and complexity. Therefore, this study aims to solve the nonlinear aerodynamic parameter identification problem for a supersonic air-to-air missile by using artificial neural networks. The proposed method will be tested by using simulated data generated with a six-degree-of-freedom missile model involving a nonlinear aerodynamic database. The data will be corrupted by adding noise to the measurement model. Then, by using the flight variables and measurements, the parameters will be estimated. Finally, the prediction accuracy will be investigated.
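
The sketch below illustrates the idea on a toy problem: a small neural network is trained to recover a nonlinear aerodynamic coefficient from noisy simulated flight variables. The coefficient model, the noise level and the network size are assumptions for illustration, not the study's six-degree-of-freedom setup.

```python
# Illustrative toy sketch (not the study's 6-DoF setup): train a neural network to
# recover an assumed nonlinear aerodynamic coefficient from noisy simulated data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 5000
mach = rng.uniform(1.2, 4.0, n)               # Mach number
alpha = rng.uniform(-20.0, 20.0, n)           # angle of attack, deg

# Assumed nonlinear "truth" for a normal-force coefficient, plus measurement noise
cn_true = 0.05 * alpha + 2e-4 * alpha**3 / np.sqrt(mach**2 - 1.0)
cn_meas = cn_true + rng.normal(0.0, 0.05, n)

X = np.column_stack([mach, alpha])
X_tr, X_te, y_tr, y_te = train_test_split(X, cn_meas, test_size=0.2, random_state=0)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0))
net.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```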

Keywords: air to air missile, artificial neural networks, open loop simulation, parameter identification

Procedia PDF Downloads 258
6212 Mikrophonie I (1964) by Karlheinz Stockhausen - Between Idea and Auditory Image

Authors: Justyna Humięcka-Jakubowska

Abstract:

1. Background in music analysis: Traditionally, when we think about a composer's sketches, the chances are that we are thinking in terms of the working out of detail rather than the evolution of an overall concept. Since music is a "time art", it follows that questions of form cannot be entirely detached from considerations of time. One could say that composers tend to regard time either as a place gradually and partially intuitively filled, or they look for a specific strategy to occupy it. In my opinion, one thing that sheds light on Stockhausen's compositional thinking is his frequent use of 'form schemas', that is, often a single-page representation of the entire structure of a piece. 2. Background in music technology: Sonic Visualiser is a program used to study a musical recording. It is an open-source application for viewing, analysing, and annotating music audio files. It contains a number of visualisation tools, which are designed with useful default parameters for musical analysis. Additionally, the Vamp plugin format supported by SV provides analyses such as structural segmentation. 3. Aims: The aim of my paper is to show how SV may be used to obtain a better understanding of a specific musical work, and how the compositional strategy impacts musical structures and musical surfaces. I want to show that 'traditional' music-analytic methods do not allow one to indicate the interrelationships between the musical surface (which is perceived) and the underlying musical/acoustical structure. 4. Main contribution: Stockhausen dealt with the most diverse musical problems by the most varied methods. A characteristic that he never ceased to place at the center of his thought and works was the quest for a new balance founded upon an acute connection between speculation and intuition. In the case of Mikrophonie I (1964) for tam-tam and six players, Stockhausen makes a distinction between the "connection scheme", which indicates the ground rules underlying all versions, and the form scheme, which is associated with a particular version. The preface to the published score includes both the connection scheme and a single instance of a "form scheme", which is what one can hear on the CD recording. In the current study, the insight into the compositional strategy chosen by Stockhausen is compared with the auditory image, that is, with the perceived musical surface. Stockhausen's musical work is analyzed both in terms of melodic/voice and timbre evolution. 5. Implications: The current study shows how musical structures determine the musical surface. My general assumption is that while listening to music we can extract basic kinds of musical information from musical surfaces. It is shown that interactive strategies of musical structure analysis can offer a very fruitful way of looking directly into certain structural features of music.

Keywords: automated analysis, composer's strategy, mikrophonie I, musical surface, stockhausen

Procedia PDF Downloads 284
6211 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions

Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla

Abstract:

With ongoing urbanization, cities face increasing environmental challenges impacting human well-being. To tackle these issues, data-driven approaches in urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating machine learning techniques enables researchers to analyze and predict complex environmental phenomena, such as urban heat island (UHI) occurrences, in urban areas. This paper demonstrates the implementation of a data-driven approach and interpretable machine learning algorithms, together with interpretability techniques, to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies to mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with logistic regression emerging as the best-performing model based on evaluation metrics, and to derive a mathematical equation representing areas with and without UHI effects, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.
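
A compact illustration of turning a fitted logistic regression into such a formula is sketched below on synthetic building descriptors; the generative rule, feature ranges and coefficients are assumptions and do not reflect the Tallinn dataset or the paper's derived equation.

```python
# Illustrative sketch (synthetic labels, not the Tallinn data): fit a logistic
# regression on building descriptors and print the resulting linear log-odds equation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 2000
volume = rng.uniform(1e2, 1e5, n)       # m^3
height = rng.uniform(3, 60, n)          # m
area = rng.uniform(50, 2000, n)         # m^2
shape_len = rng.uniform(20, 400, n)     # shape length (perimeter-like), m

# Assumed generative rule for illustration: denser, taller fabric -> UHI more likely
logit = 3e-5 * volume + 0.04 * height + 8e-4 * area + 0.004 * shape_len - 6.0
uhi = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([volume, height, area, shape_len])
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, uhi)

coefs = model.named_steps["logisticregression"].coef_[0]
intercept = model.named_steps["logisticregression"].intercept_[0]
names = ["volume", "height", "area", "shape_length"]
terms = " + ".join(f"{c:.3f}*{feat}" for c, feat in zip(coefs, names))
print(f"log-odds(UHI) = {intercept:.3f} + {terms}   (standardized features)")
```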

Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect

Procedia PDF Downloads 15
6210 Strategic Entrepreneurship: Model Proposal for Post-Troika Sustainable Cultural Organizations

Authors: Maria Inês Pinho

Abstract:

Recent literature on issues of cultural management (also called strategic management for cultural organizations) systematically seeks models that allow such facilities to adapt to the constant change that occurs in contemporary societies. In the last decade the world, and in particular Europe, has experienced a serious financial problem that has triggered defensive mechanisms, both in the direction of promoting the balance of public accounts and in the sense of the anonymous loss of the democratic and cultural values of each nation. If in the first case the Troika emerged, leading to strong cuts in funding for culture and deeply affecting those organizations, in the second case the ordinary citizen is seen fighting for the non-closure of cultural facilities. Despite this, the cultural manager argues that there is no single formula capable of solving the need to adapt to change. Rather, it is up to this agent to know the existing scientific models and to adapt them in the best way to the reality of the institution he or she coordinates. These actions, as a rule, are concerned with the best performance vis-à-vis external audiences or with the financial sustainability of cultural organizations. They forget, therefore, that all this machinery cannot function without its internal public, without its human resources. The employees of the cultural organization must then have an entrepreneurial posture - they must be intrapreneurial. This paper intends to break with this form of action and lead the cultural manager to understand that his or her role should be to create value for society through good organizational performance. This is only possible with a posture of strategic entrepreneurship, in other words, with a link between cultural management, cultural entrepreneurship and cultural intrapreneurship. In order to prove this assumption, the case study methodology was applied to a symbol of the European Capital of Culture (Casa da Música), together with qualitative and quantitative techniques. The qualitative techniques included in-depth interviews with managers, founders and patrons, and focus groups with audiences with and without experience in managing cultural facilities. The quantitative techniques involved the application of a questionnaire to middle management and employees of Casa da Música. After triangulation of the data, it was shown that the contemporary management of cultural organizations must implement, among its practices, the concept of strategic entrepreneurship and its variables. The topics which characterize the notion of cultural intrapreneurship (job satisfaction, quality of organizational performance, leadership, and employee engagement and autonomy) also emerged. The findings show that, to be sustainable, a cultural organization should meet the concerns of both the external and the internal forum. In other words, it should have an attitude of citizenship towards the communities, visible in social responsibility and participatory management, which is only possible with the implementation of the concept of strategic entrepreneurship and its variable of cultural intrapreneurship.

Keywords: cultural entrepreneurship, cultural intrapreneurship, cultural organizations, strategic management

Procedia PDF Downloads 161
6209 A Neuroscience-Based Learning Technique: Framework and Application to STEM

Authors: Dante J. Dorantes-González, Aldrin Balsa-Yepes

Abstract:

Existing learning techniques such as problem-based learning, project-based learning, or case study learning focus mainly on technical details but give no specific guidelines on the learner's experience and emotional learning aspects such as arousal, salience and valence, emotional states being important factors affecting engagement and retention. Some approaches involving emotion in educational settings, such as social and emotional learning, lack neuroscientific rigor and the use of specific neurobiological mechanisms. On the other hand, neurobiological approaches lack educational applicability, and educational approaches mainly focus on cognitive aspects and disregard conditioning learning. First, the authors explain the reasons why it is hard to learn thoughtfully; then they use the method of neurobiological mapping to track the main limbic system functions, such as the reward circuit, and its relations with perception, memories, motivations, sympathetic and parasympathetic reactions, and sensations, as well as the brain cortex. The authors conclude by explaining the major finding: the mechanisms of nonconscious learning and the triggers that guarantee long-term memory potentiation. Afterward, the educational framework for practical application and the instructors' guidelines are established. An implementation example in engineering education is given, namely the study of tuned-mass dampers for the attenuation of earthquake oscillations in skyscrapers. This work represents an original learning technique based on nonconscious learning mechanisms to enhance long-term memories, complementing existing cognitive learning methods.

Keywords: emotion, emotion-enhanced memory, learning technique, STEM

Procedia PDF Downloads 78
6208 Deep Learning-Based Approach to Automatic Abstractive Summarization of Patent Documents

Authors: Sakshi V. Tantak, Vishap K. Malik, Neelanjney Pilarisetty

Abstract:

A patent is an exclusive right granted for an invention. It can be a product or a process that provides an innovative method of doing something or offers a new technical perspective or solution to a problem. A patent can be obtained by making the technical information and details about the invention publicly available. The patent owner has exclusive rights to prevent or stop anyone from using the patented invention for commercial purposes. Any commercial usage, distribution, import or export of a patented invention or product requires the patent owner's consent. It has been observed that the central and important parts of patents are written in idiosyncratic and complex linguistic structures that can be difficult to read, comprehend or interpret for the masses. The abstracts of these patents tend to obfuscate the precise nature of the patent instead of clarifying it via direct and simple linguistic constructs. This makes it necessary to have efficient access to this knowledge via concise and transparent summaries. However, as mentioned above, due to complex and repetitive linguistic constructs and extremely long sentences, common extraction-oriented automatic text summarization methods should not be expected to show remarkable performance when applied to patent documents. Other, more content-oriented or abstractive summarization techniques are able to perform much better and generate more concise summaries. This paper proposes an efficient summarization system for patents using artificial intelligence, natural language processing and deep learning techniques to condense the knowledge and essential information from a patent document into a single summary that is easier to understand, without redundant formatting and difficult jargon.
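
For a sense of what an off-the-shelf abstractive summarizer looks like in code, the sketch below runs a generic pretrained sequence-to-sequence model from the Hugging Face transformers library on a patent-style passage; the model and the passage are illustrative choices, not the system proposed in the paper.

```python
# Minimal sketch using a generic pretrained abstractive summarizer (example model,
# not the authors' system).
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

patent_text = (
    "A locking mechanism comprising a housing, a solenoid actuator disposed within "
    "the housing, and a controller configured to receive an authentication signal "
    "and, responsive to a positive authentication, energize the solenoid actuator "
    "to displace a bolt member from a locked position to an unlocked position..."
)

summary = summarizer(patent_text, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```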

Keywords: abstractive summarization, deep learning, natural language processing, patent document

Procedia PDF Downloads 110
6207 The Value of Store Choice Criteria on Perceived Patronage Intentions

Authors: Susana Marques

Abstract:

Research on how store environment cues influence consumers' store choice decision criteria, such as store operations, product quality, monetary price, store image and sales promotion, is sparse. Especially absent is research on the simultaneous impact of multiple store environment cues. The authors propose a comprehensive store choice model that includes three types of store environment cues as exogenous constructs, various store choice criteria as possible mediating constructs, and store patronage intentions as an endogenous construct. On the basis of testing with a sample of 561 hypermarket customers, the model is partially supported. This study used structural equation modelling to test the proposed model.
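
A path-model sketch of the cues-criteria-patronage structure is given below using the semopy package and synthetic composite scores; the variable names, the lavaan-style model description and the data are assumptions for illustration, and the syntax should be checked against the semopy documentation.

```python
# Illustrative sketch (not the authors' model or data): a simple mediation path
# model in which store environment cues drive choice criteria, which in turn
# drive patronage intentions.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(9)
n = 561
cues = rng.normal(0, 1, n)                         # composite cue score
criteria = 0.6 * cues + rng.normal(0, 0.8, n)      # composite criteria score
patronage = 0.5 * criteria + 0.2 * cues + rng.normal(0, 0.8, n)

data = pd.DataFrame({"cues": cues, "criteria": criteria, "patronage": patronage})

desc = """
criteria ~ cues
patronage ~ criteria + cues
"""
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())                             # path estimates and p-values
```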

Keywords: store choice, store patronage, structural equation modelling, retailing

Procedia PDF Downloads 259
6206 The Use of Themes and Variations in Early and Contemporary Juju Music

Authors: Olupemi E. Oludare

Abstract:

This paper discusses the thematic structure of the Yoruba popular music of Southwest Nigeria. It examines the use of themes and variations in early and contemporary Juju music. The work is the outcome of research developed by the author in his doctoral studies at the University of Lagos, Nigeria, with the aim of analyzing the thematic and motivic developments in Yoruba popular genres. Observations, interviews, live recordings and CDs were used as methods for eliciting information. Field recordings and CDs of selected musical samples were also transcribed and notated. The research established the prevalent use of strings of themes by Juju musicians as a compositional technique for moving from one musical section to another as they communicate the verbal messages in their songs. These themes consist of the popular 'call and response' form found in most African music, analogous to the Western 'subject and answer' style of the fugue or sonata form, although without the tonic-dominant relations. Due to the short and repetitive form of African melodies and rhythms, a theme is restated as a variation, where its rhythmic and melodic motifs are stylistically developed and repeated while still retaining its recognizable core musical structure. The findings of this study showed that Juju musicians often employ a thematic plan where new themes are used to arrange the songs into sections, and each theme is developed into variations in order to further expand the music, eliminate monotony, and create musical aesthetics, serving as a hallmark of its musical identity. The study established the musical and extra-musical attributes of the genre, while recommending further research towards analyzing the various compositional techniques employed in African popular genres.

Keywords: compositional techniques, popular music, theme and variation, thematic development

Procedia PDF Downloads 401