Search results for: motion data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26781

23061 Auditory Brainstem Response in Wave VI for the Detection of Learning Disabilities

Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba

Abstract:

The brain stem auditory evoked potential (BAEP) is a common means of studying auditory function and, by examining the behaviour of wave VI, of learning about the functionality of the brain's neuronal groups that intervene in the learning process. The latest advances in neuroscience have revealed distinct patterns of brain activity during learning that can be highlighted through innocuous, low-cost, easy-access techniques such as the BAEP, which can help detect possible neurodevelopmental difficulties early for subsequent assessment and treatment. To date and to the authors' best knowledge, only the latency data for waves I to V, and mainly in the left ear, were taken into account. This work shows that it is essential to consider both ears; with these additional data, some cases that had previously been diagnosed as 'normal', despite showing signs of alteration that motivated a new consultation with the specialist, could be diagnosed more precisely.

Keywords: ear, neurodevelopment, auditory evoked potentials, intervals of normality, learning disabilities

Procedia PDF Downloads 168
23060 Quantum Cryptography: Classical Cryptography Algorithms’ Vulnerability State as Quantum Computing Advances

Authors: Tydra Preyear, Victor Clincy

Abstract:

Quantum computing offers many computational advantages over classical methods due to its use of quantum mechanics. This capability threatens standard cryptographic systems such as RSA and AES, which were designed for classical computing environments. This paper discusses the impact of quantum computing on cryptography, focusing on the evolution from classical cryptographic concepts to quantum and post-quantum ones. Standard cryptography secures data through encryption and decryption, and these methods become vulnerable as quantum computing advances. To counter these vulnerabilities, two approaches are proposed: quantum cryptography and post-quantum cryptography. Quantum cryptography uses principles such as the uncertainty principle and photon polarization to provide secure data transmission, and quantum key distribution (QKD) is introduced to establish more secure communication channels by distributing cryptographic keys. Post-quantum cryptography, in turn, strengthens cryptographic algorithms so that they resist attacks by both classical and quantum computers. Throughout this exploration, the paper highlights the critical role of advancing cryptographic methods in keeping data integrity and privacy safe from quantum computing. Future research directions include developing more effective cryptographic methods as the technology advances.
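The key-distribution idea can be illustrated with a toy, purely classical simulation of the basis-sifting step of BB84 (the canonical QKD protocol, not named in the abstract itself). The quantum channel is faked with classical randomness, so this sketches only the bookkeeping, not the physics:

```python
import secrets

def bb84_sift(n_bits=32):
    """Toy BB84 basis-sifting sketch. Alice encodes random bits in random
    bases; Bob measures in random bases; both keep only positions where
    the bases match. Matching bases are assumed to yield Alice's bit,
    a classical stand-in for the quantum channel."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]
    # Bob reads Alice's bit when bases match, a random bit otherwise.
    bob_bits = [b if ab == bb else secrets.randbelow(2)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Public basis comparison: both sides discard mismatched positions.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift()
assert key_a == key_b  # without an eavesdropper the sifted keys agree
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the photons and shows up as errors during a subsequent key-comparison step; that detection step is what this classical sketch cannot capture.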

Keywords: quantum computing, quantum cryptography, cryptography, data integrity and privacy

Procedia PDF Downloads 29
23059 Intelligent Electric Vehicle Charging System (IEVCS)

Authors: Prateek Saxena, Sanjeev Singh, Julius Roy

Abstract:

The security of the power distribution grid remains paramount for utility professionals as they enhance it and make it more efficient. The most serious challenge is maintaining the transformers, as the load keeps increasing with the addition of elements like electric vehicles. In this paper, intelligent transformer monitoring and grid management are proposed. The system is engineered to use the evolving data from smart meters for grid analytics and diagnostics for preventive maintenance. A two-tier architecture for hardware and software integration is coupled to form a robust smart-grid system. The proposal also presents interoperable meter standards for easy integration. Distribution transformer analytics based on real-time data benefits utilities by preventing outages, protecting against revenue loss, improving return on assets, and reducing overall maintenance costs through predictive monitoring.
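The analytics idea can be sketched as a minimal rolling-average overload check on smart-meter readings aggregated at one transformer. The window size, the 90% loading threshold, and all readings below are illustrative assumptions, not values from the paper:

```python
def overload_alerts(loads_kva, rating_kva, window=4, threshold=0.9):
    """Flag intervals where the rolling mean load exceeds a fraction of the
    transformer's rating: a minimal stand-in for predictive monitoring.
    The window size and 90% threshold are illustrative assumptions."""
    alerts = []
    for i in range(window, len(loads_kva) + 1):
        avg = sum(loads_kva[i - window:i]) / window
        if avg > threshold * rating_kva:
            alerts.append((i - 1, avg))  # index of the reading that closes the window
    return alerts

# hypothetical smart-meter readings aggregated at one transformer (kVA per 15 min)
readings = [40, 42, 45, 44, 60, 72, 75, 78, 74, 50]
print(overload_alerts(readings, rating_kva=75))  # the EV-charging surge trips the threshold
```

A production system would add seasonality models and transformer thermal limits, but the shape is the same: aggregate meter data per asset, compare against the asset's rating, and alert before sustained overload degrades the transformer.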

Keywords: electric vehicle charging, transformer monitoring, data analytics, intelligent grid

Procedia PDF Downloads 793
23058 Self-Organizing Maps for Credit Card Fraud Detection

Authors: ChunYi Peng, Wei Hsuan Cheng, Shyh Kuang Ueng

Abstract:

This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, helping financial institution personnel quickly identify and respond to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers the financial industry a more effective method of fraud detection, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
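A minimal SOM of this kind can be sketched in NumPy: a grid of weight vectors is pulled toward the training data, and a transaction whose distance to its best matching unit (its quantization error) is unusually large is flagged as a potential anomaly. The grid size, learning schedule, and synthetic two-feature "transactions" are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0):
    """Minimal SOM: a grid of weight vectors pulled toward the data.
    Returns the trained weights, shape (rows, cols, n_features)."""
    rows, cols = grid
    w = rng.random((rows, cols, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr, sigma = lr0 * (1 - t), sigma0 * (1 - t) + 0.5
            # best matching unit: closest weight vector to the sample
            d = np.linalg.norm(w - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # neighbourhood-weighted update around the BMU
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
            w += lr * g[..., None] * (x - w)
            step += 1
    return w

def quantization_error(w, x):
    """Distance from x to its best matching unit; large values suggest
    the transaction does not fit any learned pattern."""
    return np.linalg.norm(w - x, axis=2).min()

# synthetic "normal" transactions (amount, hour-of-day), scaled to [0, 1]
normal = rng.normal([0.3, 0.5], 0.05, size=(200, 2))
som = train_som(normal)
ok_err = quantization_error(som, np.array([0.31, 0.52]))
odd_err = quantization_error(som, np.array([0.95, 0.05]))
assert odd_err > ok_err  # the outlier sits far from every learned unit
```

In practice the quantization error over the trained map would feed the kind of visualization and composite decision logic the abstract describes, with real engineered transaction features in place of the toy two-feature points.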

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 60
23057 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment

Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar

Abstract:

Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements to precisely and efficiently identify and categorize human activities. This paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation applies a denoising and windowing strategy to the unprocessed data. Next, feature extraction abstracts the main cues from the data, and the SelectKBest strategy selects the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
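The described pipeline (feature selection with SelectKBest followed by several classifiers) can be sketched with scikit-learn. The synthetic data stands in for windowed IMU features; the feature count, k value, and dataset are assumptions for illustration, not the paper's benchmark:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# stand-in for windowed IMU features: 300 windows x 20 features, 3 activities
X = rng.normal(size=(300, 20))
y = rng.integers(0, 3, size=300)
X[:, 0] += 2 * y  # make one feature informative so selection has something to find

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

models = {
    "logreg": LogisticRegression(max_iter=1000),
    "rf": RandomForestClassifier(random_state=0),
    "gb": GradientBoostingClassifier(random_state=0),
    "svm": SVC(),
}
for name, clf in models.items():
    # scale, keep the 8 best features by ANOVA F-score, then classify
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), clf)
    pipe.fit(X_tr, y_tr)
    print(f"{name}: {pipe.score(X_te, y_te):.2f}")
```

Running each classifier through the same selection pipeline is what makes the comparative analysis fair: every model sees the identical reduced feature set.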

Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors

Procedia PDF Downloads 14
23056 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score

Authors: Jianfeng Hu

Abstract:

Personal authentication based on electroencephalography (EEG) signals is an important field in biometric technology, and more and more researchers have used EEG signals as a data source for biometrics. However, EEG-based biometrics also have disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed as the feature set. In a silhouette calculation, the distance from each data point in a cluster to every other point within the same cluster, and to all data points in the closest cluster, is determined. Silhouettes thus provide a measure of how well a data point was classified when it was assigned to a cluster, and of the separation between clusters. This property renders silhouettes well suited for assessing cluster quality in personal authentication methods. In this study, silhouette scores were used to assess the cluster quality of the k-means clustering algorithm and to compare the performance of each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. Results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p<0.01); and (3) there is no significant difference in authentication performance among feature sets (except feature PE). Conclusion: The combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
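The cluster-quality step can be sketched with scikit-learn: k-means labels are scored with the silhouette coefficient. The synthetic four-dimensional "entropy features" for three subjects below are illustrative stand-ins for the study's SE/FE/AE/PE features:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# stand-in for per-epoch entropy features (SE, FE, AE, PE) of 3 subjects,
# 40 epochs each; subjects are separated along all four dimensions
subjects = [rng.normal(loc=c, scale=0.3, size=(40, 4)) for c in (0.0, 1.5, 3.0)]
X = np.vstack(subjects)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
score = silhouette_score(X, labels)  # in [-1, 1]; closer to 1 = well-separated subjects
print(f"silhouette: {score:.2f}")
```

In an authentication setting, a high silhouette score for a given electrode/feature combination indicates that epochs from the same subject cluster tightly and apart from other subjects, which is exactly the property the study uses to compare feature sets.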

Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes

Procedia PDF Downloads 287
23055 Developing an Active Leisure Wear Range: A Dilemma for Khanna Enterprises

Authors: Jagriti Mishra, Vasundhara Chaudhary

Abstract:

Introduction: The case highlights the issues and challenges faced by Khanna Enterprises while conceptualizing and executing the launch of an Active Leisure wear range in the domestic market, elaborating the steps involved in range planning and production. Although Khanna Enterprises was an established company dealing in the production of knitted and woven garments, it took the risk of launching a new concept: Active Leisure wear for millennials. Methodology: The case is based on primary and secondary research, with data collected through a survey, in-depth interviews, and various reports, forecasts, and journals. Findings: The primary and secondary data, and the execution of the Active Leisure wear launch, substantiated its acceptance not only by millennials but also by Generation X. There was demand for larger sizes as well as more muted colours. Conclusion: The sales data paved the way for future product development in tune with the strengths of Khanna Enterprises.

Keywords: millennials, range planning, production, active leisure wear

Procedia PDF Downloads 209
23054 Computed Tomography Myocardial Perfusion on a Patient with Hypertrophic Cardiomyopathy

Authors: Jitendra Pratap, Daphne Prybyszcuk, Luke Elliott, Arnold Ng

Abstract:

Introduction: Coronary CT angiography is a non-invasive imaging technique for the assessment of coronary artery disease with high sensitivity and negative predictive value. However, the correlation between the degree of CT coronary stenosis and the haemodynamic significance of an obstruction is poor. The assessment of myocardial perfusion has mostly been undertaken by nuclear medicine (SPECT), but it is now possible to perform stress myocardial CT perfusion (CTP) scans quickly and effectively using CT scanners with high temporal resolution. Myocardial CTP is in many ways similar to neuroperfusion imaging: radiopaque iodinated contrast is injected intravenously, transits the pulmonary and cardiac structures, and then perfuses through the coronary arteries into the myocardium. On the Siemens Force CT scanner, a myocardial perfusion scan is performed as a dynamic axial acquisition in which the scanner shuttles in and out every 1-3 seconds (heart-rate dependent) to cover the heart in the z plane, usually over 38 seconds. Report: A CT myocardial perfusion scan can complement the findings of a CT coronary angiogram; implementing it as part of a routine CT coronary angiogram procedure provides a 'one stop shop' for the diagnosis of coronary artery disease. This case study demonstrates that although the CT coronary angiogram was within normal limits, the perfusion scan provided additional, clinically significant information regarding the haemodynamics within the myocardium of a patient with hypertrophic obstructive cardiomyopathy (HOCM). This negated the need for further diagnostic studies such as cardiac echocardiography or nuclear medicine stress tests. Conclusion: CT coronary angiography with adenosine stress myocardial CTP was utilised in this case to exclude coronary artery disease while also assessing perfusion within the hypertrophic myocardium. Adenosine stress myocardial CTP demonstrated reduced myocardial blood flow within the hypertrophic myocardium, while the coronary arteries did not show any obstructive disease. A CT coronary angiogram protocol that incorporates myocardial perfusion can provide diagnostic information on the haemodynamic significance of any coronary artery stenosis and has the potential to be a 'one stop shop' for cardiac imaging.

Keywords: CT, cardiac, myocardium, perfusion

Procedia PDF Downloads 136
23053 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications in top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented with a search in the references of the identified articles and with searches for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments or review such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods and the conditions under which they are shown to positively affect visualization outcomes.
Findings: The keyword search yields 760 studies, of which 30 are included after title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently find positive effects on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in finding positive effects on the outcomes. Storytelling studies show some inconsistencies regarding the design's effect on user engagement, enjoyment, recall, and performance, which could indicate that specific conditions are required for this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: This review of best-practice visualization methods shows that each method is beneficial under specific conditions. By applying these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and, ultimately, with government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 109
23052 Reduced Power Consumption by Randomization for DSI3

Authors: David Levy

Abstract:

The newly released Distributed System Interface 3 (DSI3) Bus Standard specification defines 3 modulation levels from which 16 valid symbols are coded. This structure creates data-dependent power consumption variations of more than a factor of 2 between minimum and maximum, so the power generation unit has to be built for the worst-case maximum consumption at all times. This paper proposes a method to reduce both the average and the worst-case current consumption. The transmitter randomizes the data using several pseudo-random sequences, estimates the energy consumption of the resulting frames, and selects for transmission the one that consumes the least. The transmitter also prepends the index of the pseudo-random sequence, which is not randomized, to allow the receiver to recover the original data using the correct sequence. We show that when the frame occupies most of the DSI3 synchronization period, average power consumption is reduced by up to 13% and worst-case power consumption by 17.7%.
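The transmitter's selection scheme can be sketched in a few lines: scramble the frame with each candidate pseudo-random sequence, estimate each candidate's energy from a per-symbol cost table, transmit the cheapest, and prepend the unscrambled sequence index. The LFSR polynomial and the symbol-energy table below are illustrative assumptions, not values from the DSI3 specification:

```python
import secrets

# Hypothetical cost per 2-bit symbol; real DSI3 symbol coding differs.
SYMBOL_ENERGY = {0: 1.0, 1: 1.5, 2: 2.0, 3: 2.5}

def prbs(seed, n):
    """Simple 8-bit LFSR pseudo-random bit sequence (illustrative taps)."""
    state, out = seed, []
    for _ in range(n):
        bit = ((state >> 7) ^ (state >> 5) ^ (state >> 4) ^ (state >> 3)) & 1
        state = ((state << 1) | bit) & 0xFF
        out.append(bit)
    return out

def frame_energy(bits):
    # pair bits into 2-bit symbols and sum their energies
    symbols = [2 * a + b for a, b in zip(bits[::2], bits[1::2])]
    return sum(SYMBOL_ENERGY[s] for s in symbols)

def transmit(data_bits, seeds=(0x01, 0x2B, 0x5C, 0xE7)):
    """Scramble with each candidate sequence, pick the cheapest frame,
    and prepend the (unscrambled) sequence index for the receiver."""
    candidates = []
    for idx, seed in enumerate(seeds):
        seq = prbs(seed, len(data_bits))
        scrambled = [d ^ p for d, p in zip(data_bits, seq)]
        candidates.append((frame_energy(scrambled), idx, scrambled))
    _, idx, payload = min(candidates)
    return idx, payload

def receive(idx, payload, seeds=(0x01, 0x2B, 0x5C, 0xE7)):
    # descramble with the same sequence: XOR twice restores the data
    seq = prbs(seeds[idx], len(payload))
    return [p ^ s for p, s in zip(payload, seq)]

data = [secrets.randbelow(2) for _ in range(16)]
idx, payload = transmit(data)
assert receive(idx, payload) == data  # round trip recovers the original bits
```

Because the index is sent unscrambled, the receiver needs no side channel: it reads the index, regenerates the same PRBS, and XORs the payload back to the original data.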

Keywords: DSI3, energy, power consumption, randomization

Procedia PDF Downloads 539
23051 System Identification of Building Structures with Continuous Modeling

Authors: Ruichong Zhang, Fadi Sawaged, Lotfi Gargab

Abstract:

This paper introduces a wave-based approach for the system identification of high-rise building structures from a pair of seismic recordings, which can be used to evaluate structural integrity and detect damage in post-earthquake structural condition assessment. The approach is founded on the wave features of the generalized impulse and frequency response functions (GIRF and GFRF), i.e., the wave responses at one structural location to an impulsive motion at another reference location, in the time and frequency domains respectively. With a pair of seismic recordings at the two locations, the GFRF is obtainable as the Fourier spectral ratio of the two recordings, and the GIRF is then found by inverse Fourier transformation of the GFRF. With an appropriate continuous model for the structure, a closed-form solution for the GFRF, and subsequently the GIRF, can also be found in terms of wave transmission and reflection coefficients, which are related to the structural physical properties above the impulse location. Matching the two sets of GFRF and/or GIRF, from the recordings and from the model, helps identify structural parameters such as wave velocity or shear modulus. For illustration, this study examines the ten-story Millikan Library in Pasadena, California, with recordings of the Yorba Linda earthquake of September 3, 2002. The building is modelled as piecewise continuous layers, from which the GFRF is derived as a function of building parameters such as impedance, cross-sectional area, and damping. The GIRF can then be found in closed form for some special cases and numerically in general. Not only does this study reveal the influential building parameters in the wave features of the GIRF and GFRF, it also presents system-identification results that are consistent with other vibration- and wave-based results. Finally, this paper discusses the effectiveness of the proposed model in system identification.
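The recording-based half of the method can be sketched in NumPy: the GFRF as the (regularized) Fourier spectral ratio of the two recordings, and the GIRF as its inverse transform. The synthetic "base" and "roof" signals, with the roof a delayed, attenuated copy of the base, are illustrative stand-ins for real seismic recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 100.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)
base = rng.normal(size=t.size)   # stand-in reference (base) recording

# stand-in roof response: delayed, attenuated copy of the base motion,
# mimicking an upgoing wave with ~0.1 s travel time through the building
delay = int(0.1 * fs)
roof = 0.6 * np.roll(base, delay)

# GFRF: Fourier spectral ratio of roof over base, lightly regularized
eps = 1e-12                      # avoids division by near-zero spectral bins
B, R = np.fft.rfft(base), np.fft.rfft(roof)
gfrf = R * np.conj(B) / (np.abs(B) ** 2 + eps)

# GIRF: inverse transform of the GFRF; its peak marks the travel time
girf = np.fft.irfft(gfrf, n=t.size)
peak_lag = np.argmax(np.abs(girf)) / fs
print(f"estimated travel time: {peak_lag:.2f} s")  # ≈ 0.10 s
```

The peak lag of the GIRF gives the wave travel time between the two sensors; divided into the known inter-sensor height, it yields the wave velocity that the model-matching step then ties to shear modulus.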

Keywords: wave-based approach, seismic responses of buildings, wave propagation in structures, construction

Procedia PDF Downloads 235
23050 Ensemble-Based SVM Classification Approach for miRNA Prediction

Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam

Abstract:

In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed for miRNA prediction. It alleviates three problems commonly associated with previous approaches: assumptions imposed on the secondary structure of pre-miRNA, imbalance between the number of laboratory-verified miRNAs and the number of pseudo-hairpins, and training data sets that do not cover the variety of samples across species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo, and Mirident, weighted by their distinct features and without any structural assumptions. An additional SVM layer is used to aggregate the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers, achieving an f-score of 88.88%, accuracy of 92.73%, precision of 90.64%, specificity of 96.64%, sensitivity of 87.2%, and an area under the ROC curve of 0.91.
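The two-layer aggregation can be sketched with scikit-learn's StackingClassifier: three SVMs with different kernels stand in for Triplet-SVM, Virgo, and Mirident (whose actual feature sets are not reproduced here), and a further SVM aggregates their outputs. The balanced synthetic dataset is an illustrative assumption:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# stand-in for pre-miRNA feature vectors; balanced classes, as in the paper
X, y = make_classification(n_samples=400, n_features=30, n_informative=10,
                           weights=[0.5, 0.5], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# three base SVMs standing in for Triplet-SVM, Virgo, and Mirident,
# each viewing the data through a different kernel
base = [
    ("rbf", make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))),
    ("poly", make_pipeline(StandardScaler(), SVC(kernel="poly", probability=True))),
    ("linear", make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))),
]
# an additional SVM layer aggregates the base classifiers' predictions
ensemble = StackingClassifier(estimators=base, final_estimator=SVC(), cv=5)
ensemble.fit(X_tr, y_tr)
print(f"accuracy: {ensemble.score(X_te, y_te):.2f}")
```

The internal cross-validation (cv=5) ensures the meta-SVM is trained on out-of-fold base predictions, so the second layer learns how to weight the base classifiers rather than memorizing their training-set behaviour.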

Keywords: MiRNAs, SVM classification, ensemble algorithm, assumption problem, imbalanced data

Procedia PDF Downloads 351
23049 Quality of Life of Patients on Oral Antiplatelet Therapy in Outpatient Cardiac Department Dr. Hasan Sadikin Central General Hospital Bandung

Authors: Andhiani Sharfina Arnellya, Mochammad Indra Permana, Dika Pramita Destiani, Ellin Febrina

Abstract:

Health research data from the Ministry of Health of Indonesia in 2007 showed that coronary heart disease (CHD), or coronary artery disease (CAD), was the third leading cause of death in Indonesia after hypertension and stroke, with a 7.2% incidence rate. Antiplatelet therapy is an important part of the management of patients with CHD. Beyond the therapeutic effect, quality of life is another aspect by which the success of antiplatelet therapy can be assessed. The purpose of this study was to determine the quality of life of patients on oral antiplatelet therapy in the outpatient cardiac department of Dr. Hasan Sadikin central general hospital, Bandung, Indonesia. This cross-sectional study collected primary data prospectively through a quality-of-life questionnaire and secondary data from patients' medical records. The results showed that 54.3% of patients had a good quality of life, 45% a moderate quality of life, and 0.7% a poor quality of life. There were no significant differences in quality of life based on age, gender, diagnosis, or duration of drug use.

Keywords: antiplatelet, quality of life, coronary artery disease, coronary heart disease

Procedia PDF Downloads 326
23048 Commissioning of a Flattening Filter Free (FFF) Beam Using an Anisotropic Analytical Algorithm (AAA)

Authors: Safiqul Islam, Anamul Haque, Mohammad Amran Hossain

Abstract:

Aim: To compare the dosimetric parameters of the flattened and flattening filter free (FFF) beams and to validate the beam data using the anisotropic analytical algorithm (AAA). Materials and Methods: All the dosimetric data (i.e., depth dose profiles, profile curves, output factors, penumbra, etc.) required for the beam modeling of the AAA were acquired using the Blue Phantom RFA for 6 MV, 6 FFF, 10 MV, and 10 FFF. The Progressive Resolution Optimizer and Dose Volume Optimizer algorithms for VMAT and IMRT were also configured in the beam model. The AAA beam model was compared with the measured data sets. Results: Due to the larger low-energy component in the 6 FFF and 10 FFF beams, the surface doses are 10 to 15% higher than for the flattened 6 MV and 10 MV beams. An FFF beam has a lower mean energy than the corresponding flattened beam; the beam quality indices were 0.667 for 6 MV, 0.629 for 6 FFF, 0.740 for 10 MV, and 0.695 for 10 FFF. Gamma evaluation with 2% dose and 2 mm distance criteria for the open-beam, IMRT, and VMAT plans was also performed, and good agreement was found between the modeled and measured data. Conclusion: We have successfully modeled the AAA algorithm for the flattened and FFF beams and achieved good agreement between the calculated and measured values.
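The 2%/2 mm gamma evaluation mentioned in the Results can be sketched in one dimension: for each reference point, the gamma index is the minimum combined dose-difference/distance-to-agreement metric over the evaluated profile, and a point passes when gamma ≤ 1. The toy Gaussian profiles and the 0.3 mm shift are illustrative assumptions:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_crit=0.02, dist_crit=2.0):
    """1D global gamma index (2%/2 mm by default): for each reference point,
    the minimum combined dose-difference / distance-to-agreement metric over
    all evaluated points. A point passes when gamma <= 1."""
    d_max = ref_dose.max()
    gammas = []
    for x_r, d_r in zip(positions, ref_dose):
        dd = (eval_dose - d_r) / (dose_crit * d_max)  # dose term, in criterion units
        dx = (positions - x_r) / dist_crit            # distance term, in criterion units
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)

x = np.arange(0, 100, 1.0)             # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)    # toy "measured" profile
ev = np.exp(-((x - 50.3) / 20) ** 2)   # toy "modeled" profile, 0.3 mm shift
g = gamma_1d(ref, ev, x)
pass_rate = (g <= 1).mean() * 100
print(f"gamma pass rate: {pass_rate:.1f}%")  # the small shift stays within criteria
```

Clinical gamma tools add dose interpolation, low-dose thresholds, and 2D/3D search volumes, but the pass criterion is the same combined metric shown here.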

Keywords: commissioning, flattening filter free (FFF), anisotropic analytical algorithm (AAA), flattened beam, parameters

Procedia PDF Downloads 304
23047 Molecular Characterization of Polyploid Bamboo (Dendrocalamus hamiltonii) Using Microsatellite Markers

Authors: Rajendra K. Meena, Maneesh S. Bhandari, Santan Barthwal, Harish S. Ginwal

Abstract:

Microsatellite markers are among the most valuable tools for the characterization of plant genetic resources and population genetic analysis. Since they are codominant allelic markers, their use in polyploid species has remained doubtful; in such cases, microsatellite data are usually analyzed by treating the markers as dominant. The current study shows that, despite losing the advantage of codominance, microsatellite markers are still a powerful tool for genotyping polyploid species because of the large number of reproducible alleles per locus. This was studied by genotyping 19 subpopulations of Dendrocalamus hamiltonii (a hexaploid bamboo species) with 17 polymorphic simple sequence repeat (SSR) primer pairs. Among these, ten primers gave the typical banding pattern of a microsatellite marker as expected in diploid species, but the remaining seven gave an unusual pattern, i.e., more than two bands per locus per genotype. In such cases, genotyping data are generally treated as dominant. Here, the data were analyzed in both ways: first, all 17 primers were scored as non-allelic (dominant) data and analyzed; later, the ten primers giving standard banding patterns were analyzed as allelic data, and the results were compared. The UPGMA clustering and genetic structure showed that the results obtained with both data sets are very similar, with slight variation, and therefore SSR markers can be used to characterize polyploid species by treating them as dominant markers. The study widens the scope of SSR marker applications and is beneficial to researchers dealing with polyploid species.
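The dominant scoring used for the unusual multi-band loci can be sketched directly: every allele observed at a locus becomes its own presence/absence column, so a hexaploid genotype with more than two bands per locus is handled naturally. The genotype names and allele sizes below are hypothetical:

```python
# Dominant scoring of SSR bands: each observed allele (band size in bp) at
# each locus becomes one binary presence/absence column, so loci showing
# more than two bands per genotype (as in a hexaploid) pose no problem.
# Genotype names, locus names, and allele sizes are illustrative only.
genotypes = {
    "G1": {"SSR1": {160, 164, 172}, "SSR2": {210}},
    "G2": {"SSR1": {160, 168}, "SSR2": {210, 214}},
    "G3": {"SSR1": {164, 168, 172}, "SSR2": {214}},
}

# collect every (locus, allele) pair seen anywhere -> one binary column each
columns = sorted({(locus, a) for g in genotypes.values()
                  for locus, alleles in g.items() for a in alleles})

# score each genotype as a 0/1 row over those columns
matrix = {name: [int(a in g.get(locus, set())) for locus, a in columns]
          for name, g in genotypes.items()}

for name, row in matrix.items():
    print(name, row)
```

The resulting binary matrix is the standard input for distance-based analyses such as UPGMA clustering, which is how the study compares the dominant scoring against the ten loci that could also be scored as codominant.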

Keywords: microsatellite markers, Dendrocalamus hamiltonii, dominant and codominant, polyploids

Procedia PDF Downloads 146
23046 The Sapir-Whorf Hypothesis and Multicultural Effects on Translators: A Case Study from Chinese Ethnic Minority Literature

Authors: Yuqiao Zhou

Abstract:

The Sapir-Whorf hypothesis (SWH) emphasizes the effect produced by language on people's minds. According to linguistic relativity, language has evolved over the course of human life on earth, and, in turn, the acquisition of language shapes learners' thoughts. Despite the considerable attention drawn by the SWH, few scholars have attempted to analyse people's thoughts via their literary works, and yet the linguistic choices that create a narrative can enable us to examine its writer's thoughts. Still less work has been done on the impact of language on the minds of bilingual people. Internationalization has resulted in an increasing number of bilingual and multilingual individuals. In China, where more than one hundred languages are used for communication, most people are bilingual in Mandarin Chinese (the official language of China) and their own dialect. Taking as its corpus the ethnic minority myth of Ge Sa-er Wang by Alai and its English translation by Goldblatt and Lin, this paper aims to analyse the effects of culture on bilingual people's minds. It first analyses Alai's thoughts as expressed in the original version of Ge Sa-er Wang; next, it examines the thoughts of the two translators by looking at the translation choices made in the English version; finally, it compares the cultural influences evident in the thoughts of Alai, and of Goldblatt and Lin. Whereas Alai speaks two Sino-Tibetan languages, Mandarin Chinese and Tibetan, Goldblatt and Lin speak two languages from different families: Mandarin Chinese (a Sino-Tibetan language) and English (an Indo-European language). The results reveal two systems of thought existing in the translators' minds; Alai's text, on the other hand, does not reveal a significant influence from North China, where Mandarin Chinese originated. The findings reveal the inconsistency of a second language's influence on people's minds. Notably, they suggest that the more different the two languages are, the greater the influence produced by the second-language culture on people's thoughts. It is hoped that this research will expand the scope of the SWH as well as shed light on future translation studies of ethnic minority literature.

Keywords: Sapir-Whorf hypothesis, cultural translation, cultural-specific items, Ge Sa-er Wang, ethnic minority literature, Tibet

Procedia PDF Downloads 127
23045 Combined Tarsal Coalition Resection and Arthroereisis in Treatment of Symptomatic Rigid Flat Foot in Pediatric Population

Authors: Michael Zaidman, Naum Simanovsky

Abstract:

Introduction: Symptomatic tarsal coalition with rigid flat foot often demands an operative solution. An isolated coalition resection does not guarantee pain relief; correction of a co-existing foot deformity may be required. The objective of the study was to analyze the results of combining tarsal coalition resection with arthroereisis. Patients and methods: We retrospectively reviewed the medical records and radiographs of children operatively treated in our institution for symptomatic calcaneonavicular or talocalcaneal coalition between the years 2019 and 2022. Eight patients (twelve feet), 4 boys and 4 girls, with a mean age of 11.2 years, were included in the study. Six patients (10 feet) were diagnosed with calcaneonavicular coalition, and two patients (two feet) with talocalcaneal coalition. To quantify the degree of foot deformity, we used the calcaneal pitch angle, the lateral talar-first metatarsal (Meary's) angle, and the talonavicular coverage angle. The clinical results were assessed using the American Orthopaedic Foot and Ankle Society (AOFAS) Ankle Hindfoot Score. Results: The mean follow-up was 28 months. The mean talonavicular coverage angle improved from 17.75° preoperatively to 5.4° postoperatively. The calcaneal pitch angle improved from a mean of 6.8° to 16.4°. The mean Meary's angle improved from -11.3° preoperatively to 2.8°. The mean AOFAS score improved from 54.7 preoperatively to 93.1 points postoperatively. In nine of twelve feet, the overall clinical outcome judged by the AOFAS scale was excellent (90-100 points); in three feet, it was good (80-90 points). Six patients (ten feet) showed clear improvement in subtalar range of motion. Conclusion: For symptomatic stiff or rigid flat feet associated with tarsal coalition, the combination of coalition resection and arthroereisis leads to normalization of radiographic parameters and to clinical and functional improvement with good patient satisfaction, and is likely to be more effective than either procedure in isolation.

Keywords: rigid flat foot, tarsal coalition resection, arthroereisis, outcome

Procedia PDF Downloads 65
23044 Effectiveness of Exercise and TENS in the Treatment of Temporomandibular Joint Disorders

Authors: Arben Murtezani, Shefqet Mrasori, Vančo Spirov, Bukurije Rama, Oliver Dimitrovski, Visar Bunjaku

Abstract:

Overview: Temporomandibular disorders (TMDs) are chronic musculoskeletal pain conditions. Clinical indicators of discomfort include joint stiffness during the first motions after extended rest, and a restricted joint range of motion can cause substantial pain and disability. There is little evidence that physical therapy methods of management produce a long-lasting reduction in signs and symptoms. Exercise programs designed to improve physical fitness have beneficial effects on chronic pain and disability of the musculoskeletal system. Objective: The aim of this study was to assess the effectiveness of physical therapy interventions in the management of temporomandibular disorders. Materials and Methods: A prospective comparative study with a 2-month follow-up period was conducted between April 2016 and June 2016 at the Physical Medicine and Rehabilitation Clinic in Prishtina. Forty-six patients with TMDs (more than three months' duration of symptoms) were randomized into two groups: the TENS therapy group (n=24) and the combined active exercise and manual therapy group (n=22). The TENS therapy group patients were treated with twelve sessions of TENS. The treatment period for both groups was 3 weeks at an outpatient clinic. The following main outcome measures were evaluated: (1) pain at rest, (2) pain at stress, (3) impairment, and (4) mouth opening, at baseline, before and after treatment, and at 3-month follow-up. Results: A significant reduction in pain was observed in both treatment groups. In the TENS group, 73% (16/22) achieved at least 80% improvement from baseline in TMJ pain at 2 months, compared with 54% (13/24) in the exercise group (difference of 19%; 95% confidence interval 220 to 30%). Active and passive maximum mouth opening was greater in the TENS group (p < 0.05). Conclusion: Exercise therapy in combination with TENS seems to be useful in the treatment of temporomandibular disorders.

Keywords: temporomandibular joint disorders, TENS, manual therapy, exercise

Procedia PDF Downloads 235
23043 Big Data Analysis Approach for Comparing New York Taxi Drivers' Operation Patterns between Workdays and Weekends, Focusing on the Revenue Aspect

Authors: Yongqi Dong, Zuo Zhang, Rui Fu, Li Li

Abstract:

The records generated by taxicabs equipped with GPS devices are of vital importance for studying human mobility behavior; here, however, we focus on taxi drivers' operation strategies on workdays and weekends, both temporally and spatially. We identify a group of valuable characteristics from large-scale driver behavior in a complex metropolitan environment. Based on the daily operations of 31,000 taxi drivers in New York City, we classify drivers into top, ordinary, and low-income groups according to their monthly working load, daily income, daily ranking, and the variance of the daily rank. Then, we apply big data analysis and visualization methods to compare the different characteristics among top, ordinary, and low-income drivers in their selection of working time and working area, as well as their strategies on workdays versus weekends. The results verify that top drivers do have special operation tactics that help them serve more passengers and travel faster, and thus make more money per unit time. This research provides new possibilities for fully utilizing the information obtained from urban taxicab data to estimate human behavior, which is useful not only for individual taxicab drivers but also for policy-makers in city authorities.
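The driver-tiering step described above can be sketched in a few lines of pandas; the column names, sample figures, and the income-based three-way split are illustrative assumptions, not the authors' actual NYC dataset or grouping rule:

```python
import pandas as pd

# Hypothetical per-driver daily summaries (illustrative values only).
daily = pd.DataFrame({
    "driver_id": [1, 1, 2, 2, 3, 3],
    "income":    [420.0, 460.0, 250.0, 240.0, 120.0, 130.0],
    "trips":     [32, 35, 20, 19, 11, 12],
})

# Aggregate to a per-driver profile: mean daily income, workload, variability.
profile = daily.groupby("driver_id").agg(
    mean_income=("income", "mean"),
    mean_trips=("trips", "mean"),
    income_var=("income", "var"),
)

# Rank drivers by mean daily income and cut into three equal-sized tiers.
profile["group"] = pd.qcut(
    profile["mean_income"].rank(method="first"),
    q=3,
    labels=["low", "ordinary", "top"],
)
```

With real trip records, the same `groupby`/`qcut` pattern scales to tens of thousands of drivers, after which the workday and weekend subsets can be compared per tier.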

Keywords: big data, operation strategies, comparison, revenue, temporal, spatial

Procedia PDF Downloads 228
23042 A Comparative Analysis of Innovation Maturity Models: Towards the Development of a Technology Management Maturity Model

Authors: Nikolett Deutsch, Éva Pintér, Péter Bagó, Miklós Hetényi

Abstract:

Strategic technology management has emerged and evolved in parallel with strategic management paradigms. It focuses on the opportunity for organizations operating mainly in technology-intensive industries to explore and exploit technological capabilities upon which competitive advantage can be obtained. As strategic technology management involves multiple functions within an organization, requires broad and diversified knowledge, and must be developed and implemented in line with business objectives to enable a firm’s profitability and growth, excellence in strategic technology management provides unique opportunities for organizations in terms of building a successful future. Accordingly, a framework supporting the evaluation of the technological readiness level of management can significantly contribute to developing organizational competitiveness through a better understanding of strategic-level capabilities and deficiencies in operations. In the last decade, several innovation maturity assessment models have appeared and become designated management tools that can serve as references for future practical approaches expected to be used by corporate leaders, strategists, and technology managers to understand and manage technological capabilities and capacities. The aim of this paper is to provide a comprehensive review of state-of-the-art innovation maturity frameworks, to investigate the critical lessons learned from their application, to identify the similarities and differences among the models, and to distil the main aspects and elements valid for the field and critical functions of technology management. To this end, a systematic literature review was carried out covering the 27 most widely known innovation maturity models, based on the relevant papers and articles published in highly ranked international journals from four relevant digital sources.
Key findings suggest that despite the diversity of the given models, there is still room for improvement regarding the common understanding of innovation typologies, the full coverage of innovation capabilities, and the generalist approach to the validation and practical applicability of the structure and content of the models. Furthermore, the paper proposes an initial structure by considering the maturity assessment of the technological capacities and capabilities - i.e., technology identification, technology selection, technology acquisition, technology exploitation, and technology protection - covered by strategic technology management.

Keywords: innovation capabilities, innovation maturity models, technology audit, technology management, technology management maturity models

Procedia PDF Downloads 63
23041 Using a Morlet Wavelet Filter to Denoise a Geoelectric Map of Moroccan Phosphate Deposit ‘Disturbances’

Authors: Saad Bakkali

Abstract:

Morocco is a major producer of phosphate, with an annual output of 19 million tons and reserves in excess of 35 billion cubic meters. This represents more than 75% of world reserves. Resistivity surveys have been used successfully in the Oulad Abdoun phosphate basin. A Schlumberger resistivity survey over an area of 50 hectares was carried out. A new field procedure based on the analytic signal response of resistivity data was tested to deal with the presence of phosphate deposit disturbances. The resistivity map was expected to allow the electrical resistivity signal to be imaged in 2D. The 2D wavelet transform is a standard tool in the interpretation of geophysical potential field data and is particularly suitable for denoising, filtering, and analyzing singularities in geophysical data. Here, wavelet transform tools are applied to the analysis of a Moroccan phosphate deposit ‘disturbances’ map. The wavelet approach applied to modeling surface phosphate ‘disturbances’ was found to be consistently useful.
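The general wavelet-thresholding idea behind such map denoising can be sketched in a few lines. The Morlet wavelet is a continuous wavelet, so for a self-contained illustration the sketch below uses a single-level 2D Haar transform instead; the "resistivity" grid, noise level, and threshold are purely illustrative assumptions:

```python
import numpy as np

def haar2d(a):
    """Single-level 2D Haar transform of an even-sized array: LL, LH, HL, HH."""
    A, B = a[0::2, 0::2], a[0::2, 1::2]
    C, D = a[1::2, 0::2], a[1::2, 1::2]
    return ((A + B + C + D) / 2.0, (A - B + C - D) / 2.0,
            (A + B - C - D) / 2.0, (A - B - C + D) / 2.0)

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    out = np.empty((2 * LL.shape[0], 2 * LL.shape[1]))
    out[0::2, 0::2] = (LL + LH + HL + HH) / 2.0
    out[0::2, 1::2] = (LL - LH + HL - HH) / 2.0
    out[1::2, 0::2] = (LL + LH - HL - HH) / 2.0
    out[1::2, 1::2] = (LL - LH - HL + HH) / 2.0
    return out

def soft(x, t):
    """Soft-threshold detail coefficients: shrink small (noisy) ones to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise(grid, t):
    LL, LH, HL, HH = haar2d(grid)
    return ihaar2d(LL, soft(LH, t), soft(HL, t), soft(HH, t))

rng = np.random.default_rng(0)
clean = np.outer(np.linspace(20, 24, 16), np.ones(16))  # smooth toy "resistivity" ramp
noisy = clean + rng.normal(0, 1.0, clean.shape)
smoothed = denoise(noisy, t=2.0)
```

The detail sub-bands carry the high-frequency noise, so thresholding them while keeping the approximation sub-band suppresses noise without destroying the broad anomaly pattern.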

Keywords: resistivity, Schlumberger, phosphate, wavelet, Morocco

Procedia PDF Downloads 424
23040 Imputation of Incomplete Large-Scale Monitoring Count Data via Penalized Estimation

Authors: Mohamed Dakki, Genevieve Robin, Marie Suet, Abdeljebbar Qninba, Mohamed A. El Agbani, Asmâa Ouassou, Rhimou El Hamoumi, Hichem Azafzaf, Sami Rebah, Claudia Feltrup-Azafzaf, Nafouel Hamouda, Wed a.L. Ibrahim, Hosni H. Asran, Amr A. Elhady, Haitham Ibrahim, Khaled Etayeb, Essam Bouras, Almokhtar Saied, Ashrof Glidan, Bakar M. Habib, Mohamed S. Sayoud, Nadjiba Bendjedda, Laura Dami, Clemence Deschamps, Elie Gaget, Jean-Yves Mondain-Monval, Pierre Defos Du Rau

Abstract:

In biodiversity monitoring, large datasets are becoming more and more widely available and are increasingly used globally to estimate species trends and conservation status. These large-scale datasets challenge existing statistical analysis methods, many of which are not adapted to their size, incompleteness and heterogeneity. The development of scalable methods to impute missing data in incomplete large-scale monitoring datasets is crucial to balance sampling in time or space and thus better inform conservation policies. We developed a new method based on penalized Poisson models to impute and analyse incomplete monitoring data in a large-scale framework. The method allows parameterization of (a) space and time factors, (b) the main effects of predictor covariates, as well as (c) space–time interactions. It also benefits from robust statistical and computational capability in large-scale settings. The method was tested extensively on both simulated and real-life waterbird data, with the findings revealing that it outperforms six existing methods in terms of missing data imputation errors. Applying the method to 16 waterbird species, we estimated their long-term trends for the first time at the entire North African scale, a region where monitoring data suffer from many gaps in space and time series. This new approach opens promising perspectives to increase the accuracy of species-abundance trend estimations. We made it freely available in the R package ‘lori’ (https://CRAN.R-project.org/package=lori) and recommend its use for large-scale count data, particularly in citizen science monitoring programmes.
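The core idea — fitting a penalized Poisson model with space and time factors to observed counts and predicting the missing cells — can be sketched on a toy site-by-year matrix. This is a minimal stand-in for the approach, not the ‘lori’ algorithm itself; the counts, penalty, and learning rate are illustrative assumptions:

```python
import numpy as np

# Toy site x year count matrix with missing entries (np.nan).
Y = np.array([
    [12., 14., np.nan, 18.],
    [ 3., np.nan,  4.,  5.],
    [30., 33., 36., np.nan],
])
obs = ~np.isnan(Y)
n_site, n_year = Y.shape

# Penalized Poisson model: log E[Y_ij] = a_i + b_j, with a small ridge penalty.
a = np.zeros(n_site)
b = np.zeros(n_year)
lam, lr = 1e-3, 0.004
for _ in range(8000):
    mu = np.exp(a[:, None] + b[None, :])
    resid = np.where(obs, mu - np.nan_to_num(Y), 0.0)  # Poisson score, observed cells only
    a -= lr * (resid.sum(axis=1) + 2 * lam * a)        # gradient step on site effects
    b -= lr * (resid.sum(axis=0) + 2 * lam * b)        # gradient step on year effects

# Keep observed counts; fill the gaps with the fitted expectations.
imputed = np.where(obs, Y, np.exp(a[:, None] + b[None, :]))
```

The penalty keeps estimates stable when a site or year has few observations, which is exactly the regime of sparse monitoring programmes.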

Keywords: biodiversity monitoring, high-dimensional statistics, incomplete count data, missing data imputation, waterbird trends in North-Africa

Procedia PDF Downloads 160
23039 Statistical Investigation Projects: A Way for Pre-Service Mathematics Teachers to Actively Solve a Campus Problem

Authors: Muhammet Şahal, Oğuz Köklü

Abstract:

As statistical thinking and problem-solving processes have become increasingly important, teachers need to be more rigorously prepared with statistical knowledge to teach their students effectively. This study examined preservice mathematics teachers' development of statistical investigation projects using data and exploratory data analysis tools, following a design-based research perspective and statistical investigation cycle. A total of 26 pre-service senior mathematics teachers from a public university in Turkiye participated in the study. They formed groups of 3-4 members voluntarily and worked on their statistical investigation projects for six weeks. The data sources were audio recordings of pre-service teachers' group discussions while working on their projects in class, whole-class video recordings, and each group’s weekly and final reports. As part of the study, we reviewed weekly reports, provided timely feedback specific to each group, and revised the following week's class work based on the groups’ needs and development in their project. We used content analysis to analyze groups’ audio and classroom video recordings. The participants encountered several difficulties, which included formulating a meaningful statistical question in the early phase of the investigation, securing the most suitable data collection strategy, and deciding on the data analysis method appropriate for their statistical questions. The data collection and organization processes were challenging for some groups and revealed the importance of comprehensive planning. Overall, preservice senior mathematics teachers were able to work on a statistical project that contained the formulation of a statistical question, planning, data collection, analysis, and reaching a conclusion holistically, even though they faced challenges because of their lack of experience. 
The study suggests that preservice senior mathematics teachers have the potential to apply statistical knowledge and techniques in a real-world context, and they could proceed with the project with the support of the researchers. We provided implications for the statistical education of teachers and future research.

Keywords: design-based study, pre-service mathematics teachers, statistical investigation projects, statistical model

Procedia PDF Downloads 89
23038 Aero-Hydrodynamic Model for a Floating Offshore Wind Turbine

Authors: Beatrice Fenu, Francesco Niosi, Giovanni Bracco, Giuliana Mattiazzo

Abstract:

In recent years, Europe has seen a great development of renewable energy, with a view to reducing polluting emissions and transitioning to cleaner forms of energy, as established by the European Green New Deal. Wind energy has come to cover almost 15% of European electricity needs and is constantly growing. In particular, far-offshore wind turbines are attractive for exploiting high-speed winds and high wind availability. When offshore wind turbine siting combines resource analysis, bathymetry, environmental regulations, and maritime traffic, and when the influence of waves on the stability of the platform is considered, the hydrodynamic characteristics of the platform become fundamental for evaluating the performance of the turbine, especially for the pitch motion. Many platform geometries have been studied and used in recent years. Their concepts are based on different considerations, such as hydrostatic stability, material, cost, and mooring system. A new method to arrive at a high-performance substructure for different kinds of wind turbines is proposed. The system comprising substructure, mooring, and wind turbine is implemented in OrcaFlex, and the simulations are performed considering several sea states and wind speeds. An external dynamic library is implemented for the turbine control system. The study presents a comparison among different substructures and the new concepts developed. In order to validate the model, CFD simulations will be performed by means of STAR-CCM+, and a comparison between rigid and elastic bodies for the blades and tower will be carried out. A global model will be built to predict the productivity of the floating turbine according to siting, resources, substructure, and mooring. The Levelized Cost of Electricity (LCOE) of the system is estimated, giving a complete overview of the advantages of floating offshore wind turbine plants. Different case studies will be presented.
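The LCOE estimate mentioned above follows a standard discounted-cost-over-discounted-energy formula; a minimal sketch is given below, where all figures (capex, opex, yield, discount rate, lifetime) are illustrative assumptions, not results from the study:

```python
def lcoe(capex, opex_per_year, energy_mwh_per_year, rate, years):
    """Levelized Cost of Electricity in currency/MWh (simple annual discounting)."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + sum(opex_per_year * d for d in disc)    # discounted lifetime costs
    energy = sum(energy_mwh_per_year * d for d in disc)     # discounted lifetime yield
    return costs / energy

# Illustrative floating-turbine figures only.
value = lcoe(capex=12e6, opex_per_year=4e5,
             energy_mwh_per_year=35_000, rate=0.06, years=25)
```

Because siting, substructure, and mooring all feed into capex, opex, and yield, a global productivity model lets this single ratio compare candidate designs on a common basis.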

Keywords: aero-hydrodynamic model, computational fluid dynamics, floating offshore wind, siting, verification, and validation

Procedia PDF Downloads 217
23037 The Study on Tourism Routes to Create Interpretation for Promoting Cultural Tourism in Bangnoi Floating Market, Bangkontee District, Samut Songkhram Province, Thailand

Authors: Pornnapat Berndt

Abstract:

The purpose of this research is to study the tourism routes in Bangnoi Floating Market, Bangkhontee District, Samut Songkhram Province, Thailand, in order to create types and forms of interpretation to promote cultural tourism, based on the requirements of the local community and visitors. To accomplish the goals and objectives, qualitative research was applied. The research instruments used were observation, questionnaires, basic interviews, in-depth interviews, focus groups, and interviews of key local informants and site visitors. The study also uses both primary data and secondary data. The Statistical Package for the Social Sciences (SPSS) was used to analyze the data. Descriptive and inferential statistics such as tables, percentages, means, and standard deviations were used for data analysis and summary. The results reveal that the local community's requirements for the types of interpretation conform to those of visitors, who want guide posts, guide books, etc., with up-to-date, informally presented content about Bangnoi Floating Market; this option received the highest demand score (3.78), making it the most wanted form of interpretation.

Keywords: interpretation, cultural tourism, tourism route, local community, stakeholders participated

Procedia PDF Downloads 295
23036 Modular Data and Calculation Framework for a Technology-based Mapping of the Manufacturing Process According to the Value Stream Management Approach

Authors: Tim Wollert, Fabian Behrendt

Abstract:

Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas its design principles, e.g., pull, value-adding, and customer orientation, remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activity. The digitalization of processes in the context of Industry 4.0 opens new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework that utilizes the available business data provided by information and communication technologies to automate the value stream mapping process, with a focus on the manufacturing process.
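A core calculation that such a framework automates is deriving value-stream metrics (processing time, lead time, flow efficiency) from timestamped ERP confirmations instead of manual stopwatch studies. The sketch below is a minimal illustration; the record layout and process steps are assumed, not a specific ERP schema:

```python
from datetime import datetime

# Hypothetical ERP confirmation records for one production order:
# (step, start timestamp, end timestamp).
events = [
    ("cutting",  "2024-03-01 08:00", "2024-03-01 09:30"),
    ("welding",  "2024-03-01 13:00", "2024-03-01 15:00"),
    ("assembly", "2024-03-02 08:00", "2024-03-02 10:00"),
]

def hours(a, b):
    """Elapsed hours between two timestamp strings."""
    fmt = "%Y-%m-%d %H:%M"
    return (datetime.strptime(b, fmt) - datetime.strptime(a, fmt)).total_seconds() / 3600

# Value-adding (processing) time per step vs. total lead time of the order.
processing = {step: hours(start, end) for step, start, end in events}
lead_time = hours(events[0][1], events[-1][2])
flow_efficiency = sum(processing.values()) / lead_time  # classic VSM ratio
```

Computing these figures directly from business data is what makes the mapping repeatable on demand rather than a one-off workshop exercise.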

Keywords: lean management 4.0, value stream management (VSM) 4.0, dynamic value stream mapping, enterprise resource planning (ERP)

Procedia PDF Downloads 152
23035 Self-Organizing Maps for Credit Card Fraud Detection and Visualization

Authors: Peng Chun-Yi, Chen Wei-Hsuan, Ueng Shyh-Kuang

Abstract:

This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. The SOM, as an artificial neural network, is particularly suited for pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with a SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
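The basic SOM-based anomaly idea — train a small map on normal transactions, then flag inputs whose quantization error (distance to the nearest map node) is unusually large — can be sketched from scratch in NumPy. The two "transaction features", grid size, and schedules below are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "transactions": 2 scaled features; normal traffic clusters near the origin.
normal = rng.normal(0.0, 0.3, size=(300, 2))
fraud_like = np.array([3.0, 3.0])  # far from the normal cluster

def train_som(data, grid=(5, 5), iters=2000, lr0=0.5, sigma0=2.0):
    """Train a tiny rectangular SOM; returns the weight grid (gx, gy, dim)."""
    gx, gy = grid
    w = rng.normal(0, 0.1, size=(gx, gy, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(w - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
        frac = t / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma**2))
        w += lr * h[..., None] * (x - w)                # pull neighbourhood toward x
    return w

def quantization_error(w, x):
    """Distance to the nearest SOM node; large values flag unusual transactions."""
    return np.linalg.norm(w - x, axis=-1).min()

som = train_som(normal)
qe_normal = np.mean([quantization_error(som, x) for x in normal])
qe_fraud = quantization_error(som, fraud_like)
```

The same quantization-error map is also a natural input for the visualization tool: nodes with high average error mark the regions of feature space where suspicious transactions concentrate.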

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 62
23034 Arthroscopic Fixation of Posterior Cruciate Ligament Avulsion Fracture through Posterior Trans Septal Portal Using Button Fixation Device: Mini Tight Rope

Authors: Ratnakar Rao, Subair Khan, Hari Haran

Abstract:

Posterior cruciate ligament (PCL) avulsion fractures are a rare condition and are commonly mismanaged. Surgical reattachment has been shown to produce better results compared with conservative management. Only a few techniques for arthroscopic fixation of PCL avulsion fractures have been reported, and they are complex. We describe a new technique for fixation of PCL avulsion fractures through a posterior trans-septal portal using a button fixation device (Mini TightRope). Eighteen patients with an isolated posterior cruciate ligament avulsion fracture were operated on arthroscopically. Standard anteromedial and anterolateral portals were made, additional posteromedial and posterolateral portals were made, and a trans-septal portal was established. The avulsion fracture was identified, elevated, and prepared. Reduction was achieved using a PCL tibial guide (Arthrex), and fixation was achieved using the Mini TightRope, Arthrex (2 buttons with a suture). Reduction was confirmed using a probe and an image intensifier. Postoperative assessment was made clinically and radiologically. Fifteen patients had good to excellent results with no posterior sag or instability, and the range of motion was normal; no intraoperative complications were recorded. Two patients had comminution of the fragment while drilling; for one patient, this was managed by a suturing technique, and for the second patient, a PCL reconstruction was done. One patient had persistent instability with a poor outcome. Establishing the trans-septal portal allows better visualization of the posterior compartment of the knee, assessment of the bony fragment, and preparation of the bone bed, and it protects the posterior neurovascular structures from injury. Fixation using the button with a suture (Mini TightRope) is stable and easily reproducible for PCL avulsion fractures with a single large fragment.

Keywords: PCL avulsion, arthroscopy, trans-septal, Mini TightRope technique

Procedia PDF Downloads 258
23033 Exploring Social Impact of Emerging Technologies from Futuristic Data

Authors: Heeyeul Kwon, Yongtae Park

Abstract:

Despite their highly touted benefits, emerging technologies have unleashed pervasive concerns regarding unintended and unforeseen social impacts. Thus, those wishing to create safe and socially acceptable products need to identify such side effects and mitigate them prior to market proliferation. Various methodologies in the field of technology assessment (TA), namely Delphi, impact assessment, and scenario planning, have been widely employed in such circumstances. However, the literature faces a major limitation in its sole reliance on participatory workshop activities, overlooking the availability of a massive untapped source of futuristic information flooding through the Internet. This research thus seeks to gain insights into the utilization of futuristic data, i.e., future-oriented documents from the Internet, as a supplementary method to generate social impact scenarios whilst capturing the perspectives of experts from a wide variety of disciplines. To this end, network analysis is conducted based on the social keywords extracted from the futuristic documents by text mining, which is then used as a guide to produce a comprehensive set of detailed scenarios. Our proposed approach facilitates harmonized depictions of the possible hazardous consequences of emerging technologies and thereby makes decision-makers more aware of, and responsive to, broad qualitative uncertainties.
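The network-analysis step can be sketched as a keyword co-occurrence graph built from per-document keyword sets, with weighted degree as a rough centrality guide for scenario building. The documents and keywords below are invented placeholders, not the study's corpus:

```python
from itertools import combinations
from collections import Counter

# Each inner list stands for the social keywords text-mined from one
# future-oriented document (hypothetical examples).
docs = [
    ["privacy", "surveillance", "drones"],
    ["privacy", "job-loss", "automation"],
    ["automation", "job-loss", "inequality"],
    ["privacy", "drones", "regulation"],
]

# Co-occurrence network: nodes are keywords, edge weights count how often
# two keywords appear in the same document.
edges = Counter()
for kws in docs:
    for a, b in combinations(sorted(set(kws)), 2):
        edges[(a, b)] += 1

# Weighted degree: total incident edge weight per keyword.
centrality = Counter()
for (a, b), w in edges.items():
    centrality[a] += w
    centrality[b] += w
top_keyword = centrality.most_common(1)[0][0]
```

High-centrality keywords and heavy edges then point to the impact themes around which detailed scenarios are drafted.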

Keywords: emerging technologies, futuristic data, scenario, text mining

Procedia PDF Downloads 492
23032 A Survey on Lossless Compression of Bayer Color Filter Array Images

Authors: Alina Trifan, António J. R. Neves

Abstract:

Although most digital cameras acquire images in a raw format, based on a Color Filter Array (CFA) that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power-efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors that increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and we study the same compression algorithms applied to each one individually. This simple pre-processing stage yields an improvement of more than 15% in predictive-based methods.
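The channel-splitting pre-processing stage amounts to strided slicing of the mosaic. A minimal sketch, assuming an RGGB pattern (real raw files vary, e.g. BGGR or GRBG, so the offsets below are an assumption); the two green sampling sites are kept as separate planes here, which can be merged into a single green image as in the paper:

```python
import numpy as np

# Toy 4x4 RGGB Bayer mosaic (values are just pixel indices).
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)

# Split the mosaic into per-colour planes before compression.
red    = raw[0::2, 0::2]   # R at even rows, even cols
green1 = raw[0::2, 1::2]   # G on the red rows
green2 = raw[1::2, 0::2]   # G on the blue rows
blue   = raw[1::2, 1::2]   # B at odd rows, odd cols

planes = {"R": red, "G1": green1, "G2": green2, "B": blue}
```

Each plane is spatially coherent in a single colour, which removes the colour-transition entropy and is why predictive lossless coders gain from compressing the planes separately.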

Keywords: Bayer image, CFA, lossless compression, image coding standards

Procedia PDF Downloads 323