Search results for: image processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5842

1942 Neuroimaging Markers for Screening Former NFL Players at Risk for Developing Alzheimer's Disease / Dementia Later in Life

Authors: Vijaykumar M. Baragi, Ramtilak Gattu, Gabriela Trifan, John L. Woodard, K. Meyers, Tim S. Halstead, Eric Hipple, Ewart Mark Haacke, Randall R. Benson

Abstract:

NFL players, by virtue of their exposure to repetitive head injury, are at least twice as likely to develop Alzheimer's disease (AD) and dementia as the general population. Early recognition and intervention prior to onset of clinical symptoms could potentially avert/delay the long-term consequences of these diseases. Since AD is thought to have a long preclinical incubation period, the aim of the current research was to determine whether former NFL players, referred to a depression center, showed evidence of incipient dementia in their structural imaging prior to diagnosis of dementia. Thus, to identify neuroimaging markers of AD, against which former NFL players would be compared, we conducted a comprehensive volumetric analysis using a cohort of early stage AD patients (ADNI) to produce a set of brain regions demonstrating sensitivity to early AD pathology (i.e., the “AD fingerprint”). The brain MRIs of a cohort of 46 former NFL players were then interrogated using the AD fingerprint. Brain scans were done using a T1-weighted MPRAGE sequence. The FreeSurfer image analysis suite (version 6.0) was used to obtain the volumetric and cortical thickness data. A total of 55 brain regions demonstrated significant atrophy or ex vacuo dilatation bilaterally in AD patients vs. healthy controls. Of the 46 former NFL players, 19 (41%) demonstrated a greater than expected number of atrophied/dilated AD regions when compared with age-matched controls, presumably reflecting AD pathology.
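
As a rough illustration of how an "AD fingerprint" comparison against age-matched controls can be screened, the Python sketch below flags regions whose volume falls well below the control distribution. The z-score threshold, the atrophy-only direction, and the made-up volumes are simplifying assumptions, not the authors' exact statistical procedure.

```python
import numpy as np

def count_ad_fingerprint_hits(subject_vols, control_means, control_sds, z_thresh=-1.96):
    """Count how many 'AD fingerprint' regions show atrophy beyond the control
    range. Inputs are 1-D arrays aligned on the same region list; a negative
    z-score means the subject's volume is below the control mean. The threshold
    and the use of simple z-scores are illustrative assumptions."""
    z = (np.asarray(subject_vols, float) - np.asarray(control_means, float)) / np.asarray(control_sds, float)
    return int(np.sum(z < z_thresh)), z

# Example with made-up regional volumes (mm^3) for three regions
hits, z = count_ad_fingerprint_hits([3200, 4100, 2900], [3500, 4200, 3300], [150, 180, 160])
print(hits, z)
```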

Keywords: Alzheimer's disease, neuroimaging biomarkers, traumatic brain injury, FreeSurfer, ADNI

Procedia PDF Downloads 154
1941 Challenges of Management of Acute Pancreatitis in Low Resource Setting

Authors: Md. Shakhawat Hossain, Jimma Hossain, Md. Naushad Ali

Abstract:

Acute pancreatitis is a dangerous medical emergency in the practice of gastroenterology. Its management requires a multidisciplinary approach, with support starting from the emergency department through to the ICU, so there is a chance of mismanagement at every step, especially in low resource settings. Other factors such as the patient's financial condition, education, social customs, transport facilities, and the referral system from the periphery may also challenge the current management guidelines. The present study is intended to determine the clinico-pathological profile, severity assessment, and challenges of management of acute pancreatitis in a government tertiary care hospital, to capture the real scenario of management in a low resource setting. A total of 100 patients with acute pancreatitis were studied in this prospective study, conducted in the Department of Gastroenterology, Rangpur Medical College Hospital, Bangladesh, over one year from July 2017 to July 2018. Regarding severity, 85% of the patients had mild, 13% had moderately severe, and 2% had severe acute pancreatitis according to the revised Atlanta criteria. The most common etiologies of acute pancreatitis in our study were gallstones (15%) and biliary sludge (15%), whereas 54% of cases were idiopathic. The most common challenges we faced were delay in hospital admission (59%) and delay in diagnosis in the hospital (20%); others were non-adherence by the patient's attendants, lack of investigation facilities, and physicians' limited familiarity with current guidelines. We were able to give early aggressive fluid therapy to only 18% of patients as per the current guideline. Conclusion: Management of acute pancreatitis as per guideline is challenging when optimum facilities are lacking, so modified guidelines for the assessment and management of acute pancreatitis should be prepared for low resource settings.

Keywords: acute pancreatitis, challenges of management, severity, prognosis

Procedia PDF Downloads 131
1940 Revisiting the Link between Corporate Social Performance and Corporate Financial Performance Post 2008 Global Economic Crisis

Authors: Anand Choudhary

Abstract:

Following the global economic crisis in 2008, businesses, and especially the big multinational conglomerates, were increasingly viewed by people the world over as one of the major causes of the economic problems faced by millions globally, in terms of job loss and lifetime savings being wiped out as banks and pension funds went bankrupt and people stared at an insecure financial future. This caused a lot of public resentment against big businesses and fueled several protest movements, such as “Occupy Wall Street”, in different parts of the world. This forced the big businesses to respond to the challenge by adopting more people-centric policies and initiatives for local communities in the societies where they operate, as part of their corporate social responsibility (CSR), in order to regain social acceptance and earn their ‘social license to operate’. The current paper studies several such large MNCs across the United States of America, India and South Africa that changed the way they did business after the global economic crisis in 2008 by incorporating capacity building initiatives for local communities as part of their CSR strategy, and explores whether this has contributed to improving their financial performance. It is a conceptual research paper using secondary source data. The findings reveal that there is a positive correlation between the companies' corporate social performance and corporate financial performance. In addition, the findings also bring to light that the MNCs examined in the current paper have improved their image in the eyes of their stakeholders following the change in their CSR strategy and initiatives.

Keywords: corporate social responsibility (CSR), Corporate Social Performance (CSP), Corporate Financial Performance (CFP), local communities

Procedia PDF Downloads 335
1939 Characterization of Laminar Flow and Power Consumption in Agitated Vessel with Curved Blade Agitator

Authors: Amine Benmoussa, Mohamed Bouanini, Mebrouk Rebhi

Abstract:

Stirring is one of the unifying processes that form part of the mechanical unit operations in process technologies such as chemical, biotechnological, pharmaceutical, petrochemical, cosmetic, and food processing. Determining the level of mixing and the overall behavior and performance of mixing tanks is therefore crucial from the points of view of product quality and process economics. The most fundamental needs for the analysis of these processes, from both a theoretical and an industrial perspective, are knowledge of the hydrodynamic behavior and the flow structure in such tanks. Depending on the purpose of the operation carried out in the mixer, the best choice of tank geometry and agitator type can vary widely. Initially, a local and global study, namely of the velocity field and power number, on a typical agitation system stirred by a straight two-blade agitator (d/D = 0.5) allowed us to test the reliability of the CFD approach; the results were compared with experimental data from the literature, and very good agreement was observed. The stream function, velocity profile, velocity fields, and power number are analyzed. It is shown that the hydrodynamics are modified by the curvature of the blades, which plays a key role.
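
For reference, the power number analyzed in this study is the standard dimensionless group for agitated vessels, and the impeller Reynolds number identifies the laminar regime; the small sketch below uses the standard definitions with illustrative values that are not taken from the paper.

```python
def power_number(P, rho, N, d):
    """Dimensionless power number Np = P / (rho * N**3 * d**5),
    with P in W, rho in kg/m^3, N in rev/s, d (impeller diameter) in m."""
    return P / (rho * N**3 * d**5)

def reynolds_number(rho, N, d, mu):
    """Impeller Reynolds number Re = rho * N * d**2 / mu; laminar mixing
    corresponds to low Re (roughly of order 10 or less)."""
    return rho * N * d**2 / mu

# Illustrative values only (viscous fluid, small impeller)
print(power_number(P=2.5, rho=1400.0, N=1.0, d=0.15))        # ~23.5
print(reynolds_number(rho=1400.0, N=1.0, d=0.15, mu=5.0))     # ~6.3 -> laminar
```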

Keywords: agitated vessels, curved blade agitator, laminar flow, finite volume method

Procedia PDF Downloads 284
1938 System for Electromyography Signal Emulation Through the Use of Embedded Systems

Authors: Valentina Narvaez Gaitan, Laura Valentina Rodriguez Leguizamon, Ruben Dario Hernandez B.

Abstract:

This work describes a physiological signal emulation system that starts from electromyography (EMG) signals obtained from muscle sensors. Characteristics are extracted from these signals to model and emulate specific arm movements. The main objective of this effort is to develop a new biomedical software system capable of generating physiological signals through the use of embedded systems, by establishing the characteristics of the acquired signals. The acquisition system used was Biosignals, which contains two EMG electrodes used to acquire signals from the forearm, placed on the extensor and flexor muscles. Processing algorithms were implemented to classify the signals generated by the arm muscles when performing specific movements such as wrist flexion-extension, palmar grip, and wrist pronation-supination. MATLAB software was used to condition and preprocess the signals for subsequent classification. Subsequently, each signal is modeled mathematically so that it can be generated by the embedded system, and the accuracy of the obtained signal is validated using the percentage of cross-correlation, reaching a precision of 96%. The equations are then discretized for emulation on the embedded system, yielding a system capable of generating physiological signals with the characteristics required for medical analysis.
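
A minimal Python sketch of the kind of cross-correlation check used to validate an emulated signal against the acquired one is shown below; the toy waveform and noise level are assumptions for illustration, not the study's data.

```python
import numpy as np

def xcorr_percent(acquired, emulated):
    """Normalized zero-lag cross-correlation between an acquired EMG segment
    and its emulated counterpart, expressed as a percentage. A value near
    100% indicates the emulated waveform closely reproduces the original."""
    a = np.asarray(acquired, dtype=float) - np.mean(acquired)
    e = np.asarray(emulated, dtype=float) - np.mean(emulated)
    return 100.0 * np.dot(a, e) / (np.linalg.norm(a) * np.linalg.norm(e))

# Toy example: an emulated signal that is a slightly noisy copy of the original
t = np.linspace(0, 1, 1000)
acquired = np.sin(2 * np.pi * 50 * t) * np.exp(-t)
emulated = acquired + 0.05 * np.random.randn(t.size)
print(f"{xcorr_percent(acquired, emulated):.1f}%")
```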

Keywords: classification, electromyography, embedded system, emulation, physiological signals

Procedia PDF Downloads 111
1937 LCA and Multi-Criteria Analysis of Fly Ash Concrete Pavements

Authors: Marcela Ondova, Adriana Estokova

Abstract:

Rapid industrialization results in increased use of natural resources and brings along serious ecological and environmental imbalance due to the dumping of industrial wastes. Principles of sustainable construction have to be accepted with regard to the consumption of natural resources and the production of harmful emissions. Cement is a raw material of great importance in the building industry, and today large amounts of it are used in the construction of concrete pavements. Given raw material costs and the CO2 emissions of cement production, replacing cement in concrete mixtures with more sustainable materials is necessary, and people all over the world are looking for solutions to reduce this environmental impact. Over the last ten years, the image of fly ash has completely changed from a polluting waste to a resource material that can solve major problems of cement use. Fly ash concretes are proposed as a potential approach for achieving substantial reductions in cement consumption. Fly ash is known to improve the workability of concrete, extend the life cycle of concrete roads, and reduce energy use and greenhouse gas emissions, as well as the amount of coal combustion products that must be disposed of in landfills. Life cycle assessment also proved that a concrete pavement with fly ash cement replacement is considerably more environmentally friendly than standard concrete roads. In addition, fly ash is a cheap raw material, so cost savings are guaranteed. The strength properties and resistance to frost and de-icing salts, which are important characteristics in the construction of concrete pavements, reached the required standards as well. In terms of human health, it cannot be stated that a concrete cover with fly ash is more dangerous than a cover without fly ash. The final multi-criteria analysis also indicated that concrete with fly ash is clearly a proper solution.

Keywords: life cycle assessment, fly ash, waste, concrete pavements

Procedia PDF Downloads 406
1936 Recovery of Metals from Electronic Waste by Physical and Chemical Recycling Processes

Authors: Muammer Kaya

Abstract:

The main purpose of this article is to provide a comprehensive review of various physical and chemical processes for electronic waste (e-waste) recycling, their advantages and shortfalls towards achieving a cleaner process of waste utilization, with special attention to the extraction of metallic values. The current status and future perspectives of waste printed circuit board (PCB) recycling are described. E-waste characterization, dismantling/disassembly methods, liberation and classification processes, and composition determination techniques are covered. Manual selective dismantling and metal-nonmetal liberation at –150 µm with two-step crushing are found to be the best. After size reduction, physical separation/concentration processes employing gravity, electrostatic and magnetic separators, froth flotation, etc., which are commonly used in mineral processing, are critically reviewed here for the separation of metals and non-metals, along with useful utilizations of the non-metallic materials. The recovery of metals from e-waste material after physical separation through pyrometallurgical, hydrometallurgical or biohydrometallurgical routes is also discussed, along with purification and refining, and some suitable flowsheets are given. It seems that the hydrometallurgical route will be a key player in base and precious metal recovery from e-waste. E-waste recycling will be a very important sector in the near future from both economic and environmental perspectives.

Keywords: e-waste, WEEE, recycling, metal recovery, hydrometallurgy, pyrometallurgy, biometallurgy

Procedia PDF Downloads 356
1935 Linking Excellence in Biomedical Knowledge and Computational Intelligence Research for Personalized Management of Cardiovascular Diseases within Personal Health Care

Authors: T. Rocha, P. Carvalho, S. Paredes, J. Henriques, A. Bianchi, V. Traver, A. Martinez

Abstract:

The main goal of the LiNK project is to join competences in intelligent processing in order to create a research ecosystem addressing two central scientific and technical challenges for personal health care (PHC) deployment: i) how to merge clinical evidence and knowledge into computational decision support systems for PHC management, and ii) how to achieve personalized services, i.e., solutions adapted to the specific user's needs and characteristics. The final goal of one of the work packages (WP2), designated Sustainable Linking and Synergies for Excellence, is the definition, implementation and coordination of the necessary activities to create and strengthen durable links between the LiNK partners. This work focuses on the strategy that has been followed to define the Research Tracks (RTs), which will support a set of actions to be pursued throughout the LiNK project. These include common research activities, knowledge transfer among the researchers of the consortium, and PhD student and post-doc co-advisement. Moreover, the RTs will establish the basis for the definition of concepts and their evolution into project proposals.

Keywords: LiNK Twin European Project, personal health care, cardiovascular diseases, research tracks

Procedia PDF Downloads 216
1934 Laser Registration and Supervisory Control of neuroArm Robotic Surgical System

Authors: Hamidreza Hoshyarmanesh, Hosein Madieh, Sanju Lama, Yaser Maddahi, Garnette R. Sutherland, Kourosh Zareinia

Abstract:

This paper illustrates the concept of an algorithm to register specified markers on the neuroArm surgical manipulators, an image-guided MR-compatible tele-operated robot for microsurgery and stereotaxy. Two range-finding algorithms, namely time-of-flight and phase-shift, are evaluated for registration and supervisory control. The time-of-flight approach is implemented in a semi-field experiment to determine the precise position of a tiny retro-reflective moving object. The moving object simulates a surgical tool tip. The tool is a target that would be connected to the neuroArm end-effector during surgery inside the magnet bore of the MR imaging system. In the time-of-flight approach, a 905-nm pulsed laser diode and an avalanche photodiode are utilized as the transmitter and receiver, respectively. For the experiment, a high-frequency time-to-digital converter was designed using a field-programmable gate array. In the phase-shift approach, a continuous green laser beam with a wavelength of 530 nm was used as the transmitter. Results showed that a positioning error of 0.1 mm occurred when the scanner-target distance was set in the range of 2.5 to 3 meters. The effectiveness of this non-contact approach showed that the method could be employed as an alternative to a conventional mechanical registration arm. Furthermore, the approach is not limited by physical contact or the extension of joint angles.
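
For reference, the basic range equations behind the two evaluated approaches can be sketched as follows; the timing value and modulation frequency in the example are illustrative only, not the experiment's settings.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(delta_t_seconds):
    """Pulsed time-of-flight ranging: d = c * t / 2 for a round-trip delay t."""
    return C * delta_t_seconds / 2.0

def phase_shift_distance(delta_phi_rad, f_mod_hz):
    """Continuous-wave phase-shift ranging: d = c * delta_phi / (4 * pi * f_mod),
    valid within one unambiguous range interval."""
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

# A target at roughly 2.75 m returns a pulse after about 18.3 ns
print(tof_distance(18.3e-9))                      # ~2.74 m
print(phase_shift_distance(math.pi / 2, 10e6))    # ~3.75 m for a 10 MHz modulation
```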

Keywords: 3D laser scanner, intraoperative MR imaging, neuroArm, real time registration, robot-assisted surgery, supervisory control

Procedia PDF Downloads 286
1933 The Effect of Extrusion Processing on Solubility and Molecular Weight of Water-Soluble Arabinoxylan

Authors: Abdulmannan Fadel

Abstract:

Arabinoxylan is a non-starch polysaccharide (NSP) and one of the most important polysaccharides contained within cereal grains. Wheat endosperm pentosan and rice bran contain a significant amount of arabinoxylan (7% in rice bran and 10-12% in wheat endosperm pentosan). Several methods have been used for arabinoxylan extraction with varying degrees of success, e.g., enzymatic and alkaline treatment. Yet, the use of extrusion alone as a pre-treatment to increase the yield and reduce the molecular weight in wheat endosperm pentosan and rice bran has not been investigated. The samples (wheat endosperm pentosan and rice bran) were extruded using a twin-screw extruder at two screw speeds (80 and 160 rpm) and a range of barrel temperatures (80 to 140°C), with a throughput of 30 kg hr⁻¹ and a moisture content of 25%. Arabinoxylans were extracted with water, and the extraction yield and molecular weight were determined using a size-exclusion high-pressure liquid chromatography system. It was found that increasing the screw speed from 80 rpm to 160 rpm did not affect the extraction yield (p < 0.05) of arabinoxylan from either the wheat endosperm pentosan or the rice bran. However, the molecular weight of the arabinoxylans extracted from wheat endosperm pentosan was found to decrease with increasing screw speed. These low molecular weight arabinoxylans have been suggested as immunomodulators.

Keywords: arabinoxylans, extrusion, wheat endosperm pentosan, rice bran

Procedia PDF Downloads 416
1932 One-Shot Text Classification with Multilingual-BERT

Authors: Hsin-Yang Wang, K. M. A. Salam, Ying-Jia Lin, Daniel Tan, Tzu-Hsuan Chou, Hung-Yu Kao

Abstract:

Detecting user intent from natural language expressions has a wide variety of use cases in natural language processing applications. Recently, few-shot training has seen a spike in usage in commercial domains. Due to the lack of significant sample features, downstream task performance has been limited or unstable across different domains. As a state-of-the-art method, the pre-trained BERT model, which gathers sentence-level information from a large text corpus, shows improvement on several NLP benchmarks. In this research, we propose a method that converts multi-class classification tasks into binary classification tasks and then uses the confidence score to rank the results. As a language model, BERT performs well on sequence data. In our experiment, we change the objective from predicting labels to finding the relations between words in sequence data. Our proposed method achieved 71.0% accuracy on the internal intent detection dataset and 63.9% accuracy on the HuffPost dataset. Acknowledgment: This work was supported by NCKU-B109-K003, which is the collaboration between National Cheng Kung University, Taiwan, and SoftBank Corp., Tokyo.
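
A hedged sketch of the proposed reformulation is given below: each (utterance, candidate label) pair is scored independently as a binary decision and the labels are ranked by confidence. The toy keyword-overlap scorer merely stands in for a fine-tuned multilingual-BERT pair classifier and is not the authors' model.

```python
from typing import Callable, List, Tuple

def rank_intents(text: str,
                 candidate_labels: List[str],
                 relevance_score: Callable[[str, str], float]) -> List[Tuple[str, float]]:
    """Turn a multi-class intent problem into per-label binary decisions:
    score each (text, label) pair independently, then rank labels by the
    binary 'is this the intent?' confidence."""
    scored = [(label, relevance_score(text, label)) for label in candidate_labels]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

def toy_scorer(text: str, label: str) -> float:
    # Placeholder scorer (keyword overlap) standing in for a BERT pair classifier.
    return len(set(text.lower().split()) & set(label.lower().split())) / (len(label.split()) or 1)

print(rank_intents("book a flight to tokyo",
                   ["book flight", "cancel order", "play music"],
                   toy_scorer))
```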

Keywords: OSML, BERT, text classification, one shot

Procedia PDF Downloads 101
1931 Mind Care Assistant - Companion App

Authors: Roshani Gusain, Deep Sinha, Karan Nayal, Anmol Kumar Mishra, Manav Singh

Abstract:

In this research paper, we introduce the "Mind Care Assistant - Companion App", a Flutter and Firebase-based mental health monitor. The app aims to improve and monitor the mental health of its users, using non-invasive ways to check for changes in their emotional state. Based on users' responses to questions, the app provides individualized suggestions, such as tasks and mindfulness exercises, for users who are depressed or anxious. The app features a chatbot that incorporates cognitive behavioural therapy (CBT) principles and combines natural language processing with machine learning to develop personalised responses. The app is cross-platform, which allows users of both iOS and Android to experience almost identical interfaces. With Firebase integration, synchronized, real-time and secure data storage is readily available. The paper covers the architecture of the app, how it was developed, and some important features. The primary research result presents the promise of the "Mind Care Assistant" in mental health care using new health technology, proposing a full-stack application to manage depression, anxiety, and overall mental well-being effectively.

Keywords: mental health, mobile application, Flutter, Firebase, depression, anxiety

Procedia PDF Downloads 13
1930 House Facades and Emotions: Exploring the Psychological Impact of Architectural Features

Authors: Nour Tawil, Sandra Weber, Kirsten K. Roessler, Martin Mau, Simone Kuhn

Abstract:

The link between “quality” residential environments and human health and well-being has long been proposed. While the physical properties of a sound environment have been fairly defined, little focus has been given to the psychological impact of architectural elements. Recently, studies have investigated the response to architectural parameters, using measures of physiology, brain activity, and emotion. Results showed different aspects of interest: detailed and open versus blank and closed facades, patterns in perceiving different elements, and a visual bias for capturing faces in buildings. However, in the absence of a consensus on methodologies, the available studies remain unsystematic and face many limitations regarding the underpinning psychological mechanisms. To bridge some of these gaps, an online study was launched to investigate design features that influence the aesthetic judgement and emotional evaluation of house facades, using a well-controlled stimulus set of Canadian houses. A methodical modelling of design features will be performed to extract both high and low level image properties, in addition to segmentation of layout-related features. 300 participants from Canada, Denmark, and Germany will rate the images on twelve psychological dimensions representing appealing aspects of a house. Subjective ratings are expected to correlate with specific architectural elements while controlling for typicality and familiarity, and other individual differences. With the lack of relevant studies, this research aims to identify architectural elements of beneficial qualities that can inform design strategies for optimized residential spaces.

Keywords: architectural elements, emotions, psychological response, residential facades

Procedia PDF Downloads 231
1929 The Fusion of Blockchain and AI in Supply Chain Finance: Scalability in Distributed Systems

Authors: Wu You, Burra Venkata Durga Kumar

Abstract:

This study examines the promising potential of integrating Blockchain and Artificial Intelligence (AI) technologies to address scalability in distributed systems within the field of supply chain finance. The finance industry is continually confronted with scalability challenges in its distributed systems, particularly within the supply chain finance sector, impacting efficiency and security. Blockchain, with its secure distributed ledger system, coupled with AI's strengths in optimizing data processing and decision-making, holds the key to innovating the industry's approach to these issues. This study elucidates the synergistic interplay between Blockchain and AI, detailing how their fusion can drive a significant transformation in the supply chain finance sector's distributed systems. It offers specific use cases within this field to illustrate the practical implications and potential benefits of this technological convergence. The study also discusses future possibilities and current challenges in implementing this approach within the context of supply chain finance. It concludes that the intersection of Blockchain and AI could ignite a new epoch of enhanced efficiency, security, and transparency in the distributed systems of supply chain finance within the financial industry.

Keywords: blockchain, artificial intelligence (AI), scaled distributed systems, supply chain finance, efficiency and security

Procedia PDF Downloads 93
1928 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of onshore hydrocarbon deposits in the Kaliningrad region is leading to active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Due to the heterogeneity of reservoir rocks in the various discovered fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are carried out. The key element is the use of an effective technique of computer reserve modelling at the first stage of processing the acquired data. The following step uses this information for cluster analysis, which makes it possible to optimize field development approaches. The article analyzes the effectiveness of various methods for reserve calculation and computer modelling of offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for exploiting the deposits. A relationship is observed between the accuracy of the calculation of recoverable reserves and the need for modernization of the existing production infrastructure, as well as the optimization of the scheme for opening up and developing the oil deposits.

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 166
1927 A Comparative Evaluation of Cognitive Load Management: Case Study of Postgraduate Business Students

Authors: Kavita Goel, Donald Winchester

Abstract:

In a world of information overload and work complexity, academics often struggle to create an online instructional environment that enables efficient and effective student learning. Research has established that students' learning styles differ; some learn faster when taught using audio and visual methods, and attributes like prior knowledge and mental effort affect their learning. Cognitive load theory posits that learners have limited processing capacity. Cognitive load depends on the learner's prior knowledge, the complexity of content and tasks, and the instructional environment. Hence, the proper allocation of cognitive resources is critical for students' learning. Consequently, a lecturer needs to understand the limits and strengths of the human learning processes and the various learning styles of students, and accommodate these requirements while designing online assessments. As acknowledged in the cognitive load theory literature, visual and auditory explanations of worked examples potentially lead to a reduction of cognitive load (effort) and increased facilitation of learning when compared to conventional sequential text problem solving, as this helps learners utilize both subcomponents of their working memory. Instructional design changes were introduced at the case site for the delivery of the postgraduate business subjects. To make effective use of auditory and visual modalities, video-recorded lectures and key-concept webinars were delivered to students. Videos were prepared to free students' limited working memory from irrelevant mental effort, since all elements on a visual screen can be viewed simultaneously and processed quickly, which facilitates greater psychological processing efficiency. Most case study students in the postgraduate programs are adults, working full-time at higher management levels, and studying part-time; their learning styles and needs differ from those of other tertiary students. The purpose of the audio and visual interventions was to lower the students' cognitive load and provide an online environment supportive of their efficient learning. These changes were expected to have a favourable impact on the students' learning experience, academic performance, and retention. This paper posits that these changes to instructional design help students integrate new knowledge into their long-term memory. A mixed-methods case study methodology was used in this investigation. Primary data were collected from interviews and surveys of students and academics; secondary data were collected from the organisation's databases and reports. Some evidence was found that the academic performance of students improves when the new instructional design changes are introduced, although the effect was not statistically significant. However, the overall grade distribution of students' academic performance changed and skewed higher, which indicates a deeper understanding of the content. Feedback received from students identified that recorded webinars served as better learning aids than material with text alone, especially for more complex content. The recorded webinars on the subject content and assessments give students the flexibility to access this material at any time from repositories, as many times as needed, which suits their learning styles. Visual and audio information enters students' working memory more effectively. Also, as each assessment included the application of the concepts, conceptual knowledge interacted with the pre-existing schema in long-term memory and lowered students' cognitive load.

Keywords: cognitive load theory, learning style, instructional environment, working memory

Procedia PDF Downloads 145
1926 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings

Authors: Jude K. Safo

Abstract:

Knowledge Graphs (KGs) and their relation to Graph Embeddings (GEs) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we also observe in Large Language Models. Notable attempts such as TransE, TransR and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited in scope to next node/link predictions. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mapping between concepts in KG space and GE space that preserves cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures and demonstrate its performance on the WN18 benchmark. The model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
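
For context, the TransE baseline mentioned above scores a triple by the distance between the translated head embedding and the tail embedding; the following minimal sketch uses made-up vectors and is not the paper's proposed mapping.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score: smaller ||h + r - t|| means the triple
    (head, relation, tail) is judged more likely to hold."""
    return np.linalg.norm(np.asarray(h) + np.asarray(r) - np.asarray(t), ord=norm)

# Toy 4-dimensional embeddings (illustrative only)
h = np.array([0.1, 0.3, -0.2, 0.5])
r = np.array([0.2, -0.1, 0.4, 0.0])
t_good = h + r + 0.01                      # nearly consistent triple -> low score
t_bad = np.array([1.0, 1.0, 1.0, 1.0])     # unrelated tail -> high score
print(transe_score(h, r, t_good), transe_score(h, r, t_bad))
```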

Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics

Procedia PDF Downloads 68
1925 Human Computer Interaction Using Computer Vision and Speech Processing

Authors: Shreyansh Jain Jeetmal, Shobith P. Chadaga, Shreyas H. Srinivas

Abstract:

Internet of Things (IoT) is seen as the next major step in the ongoing revolution in the Information Age. It is predicted that in the near future billions of embedded devices will be communicating with each other to perform a plethora of tasks with or without human intervention. One of the major ongoing hotbeds of research activity in IoT is Human Computer Interaction (HCI). HCI is used to facilitate communication between an intelligent system and a user. An intelligent system typically consists of various sensors, actuators and embedded controllers which communicate with each other to monitor data collected from the environment. Communication by the user to the system is typically done using voice. One of the major ongoing applications of HCI is in home automation as a personal assistant. The prime objective of our project is to implement a use case of HCI for home automation. Our system is designed to detect and recognize the users and personalize the appliances in the house according to their individual preferences. Our HCI system is also capable of speaking with the user when certain commands are spoken, such as searching the web for information and controlling appliances. Our system can also monitor the environment in the house, such as air quality and gas leakages, for added safety.

Keywords: human computer interaction, internet of things, computer vision, sensor networks, speech to text, text to speech, android

Procedia PDF Downloads 362
1924 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
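
A compact sketch of the Diffusion Map step described above (Gaussian affinities, row-normalized diffusion operator, leading non-trivial eigenvectors) is given below; the kernel bandwidth, diffusion time, and random data are placeholders, not choices from the article.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    """Minimal diffusion-map embedding: Gaussian affinities, a row-stochastic
    diffusion operator, then the top non-trivial eigenvectors scaled by
    eigenvalues**t give the low-dimensional coordinates."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)   # pairwise squared distances
    K = np.exp(-d2 / epsilon)                                    # affinity matrix
    P = K / K.sum(axis=1, keepdims=True)                         # diffusion operator
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return vecs[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)

X = np.random.rand(50, 5)
print(diffusion_map(X).shape)   # (50, 2)
```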

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 108
1923 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems

Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin

Abstract:

Since large-scale and data-intensive applications have been widely deployed, there is a growing demand for high-performance storage systems to support them. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load of host machines and will have hybrid combinations of different storage devices. The advent of flash-memory-based solid state disks has become a critical factor in revolutionizing the storage world. However, instead of simply replacing the traditional magnetic hard disk with the solid state disk, it is believed that finding a complementary approach to incorporate both of them is more challenging and attractive. This paper explores the idea of active storage, an emerging new storage configuration, in terms of the architecture and design, the parallel processing capability, the cooperation with other machines in a cluster computing environment, and a disk configuration, namely the hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan.

Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability

Procedia PDF Downloads 448
1922 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and on the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were being treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes to both public sector and business. A social practice that is considerably influenced in the way of its day-to-day operations is Human Resource (HR) management, for which the importance of GDPR cannot be underestimated. That is because HR processes personal data coming in all shapes and sizes from many different systems and sources. The significance of the proper functioning of an HR department, specifically in human-centered, service-oriented environments such as the education field, is decisive due to the fact that HR operations in schools, conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that GDPR plays in HR departments that operate in schools and in order to practically evaluate the aftermath of the Regulation during the first months of its applicability; a comparative use cases analysis in five highly dynamic schools, across three EU Member States, was attempted.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 100
1921 Automatic Tagging and Accuracy in Assamese Text Data

Authors: Chayanika Hazarika Bordoloi

Abstract:

This paper is an attempt to work on a highly inflectional language called Assamese. It is one of the national languages of India, and very little has been achieved in terms of computational research. Building a language processing tool for a natural language is not very smooth, as the standard and language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. Conditional random fields (the CRF tool) were used to automatically tag and train the text data; accuracy improved after linguistic features were fed into the training data. Assamese is a highly inflectional language; hence, it is challenging to standardize its morphology. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list of suffixes was prepared, comprising all possible suffixes that the various categories can take. Assamese words can be classified into inflected classes (noun, pronoun, adjective and verb) and un-inflected classes (adverb and particle). The corpus used for this morphological analysis contains a large number of tokens; it is a mixed corpus, and it has given satisfactory accuracy. The accuracy rate of the tagger has gradually improved with the modified training data.
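
One possible shape for the suffix features fed to a CRF tagger is sketched below; the suffix lengths, feature names, and placeholder tokens are illustrative assumptions, not the paper's exact feature set.

```python
def token_features(tokens, i):
    """Suffix-oriented feature dict for token i, in the spirit of supplying
    inflectional suffixes as features to a CRF tagger."""
    w = tokens[i]
    return {
        "word": w,
        "suffix2": w[-2:],
        "suffix3": w[-3:],
        "suffix4": w[-4:],
        "prev_word": tokens[i - 1] if i > 0 else "<BOS>",
        "next_word": tokens[i + 1] if i < len(tokens) - 1 else "<EOS>",
    }

# Placeholder transliterated tokens standing in for Assamese word forms
sentence = ["moi", "gharaloi", "jaon"]
print([token_features(sentence, i) for i in range(len(sentence))])
```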

Keywords: CRF, morphology, tagging, tagset

Procedia PDF Downloads 194
1920 FPGA Implementation of a Marginalized Particle Filter for Delineation of P and T Waves of ECG Signal

Authors: Jugal Bhandari, K. Hari Priya

Abstract:

The ECG signal provides important clinical information which can be used to predict diseases related to the heart. Accordingly, delineation of the ECG signal is an important task, and delineation of the P and T waves in particular is a complex one. This paper deals with the study of the ECG signal and its analysis by means of a Verilog design of efficient filters, used together with the MATLAB tool. It includes generation and simulation of the ECG signal from real-time ECG data, and ECG signal filtering and processing through the analysis of different algorithms and techniques. In this paper, we design a basic particle filter which generates a dynamic model depending on the present and past input samples and then produces the desired output. Afterwards, the output is processed by MATLAB to get the actual shape and accurate values of the ranges of the P-wave and T-wave of the ECG signal. QuestaSim, a tool from Mentor Graphics, is used for simulation and functional verification. The same design is verified again using Xilinx ISE, which is also used for synthesis, mapping and bit-file generation. A Xilinx FPGA board is used for the implementation of the system. The final FPGA results are verified with ChipScope Pro, where the output data can be observed.
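
The basic (bootstrap) particle filter referred to above can be summarized as a predict-weight-resample loop; the sketch below uses a simplified 1-D random-walk state model and a toy T-wave-like signal, not the ECG dynamic model implemented in Verilog.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.05, obs_std=0.1, seed=0):
    """Generic bootstrap particle filter for a 1-D random-walk state model:
    propagate particles with process noise, weight them by the Gaussian
    likelihood of the current observation, then resample."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for y in observations:
        particles = particles + rng.normal(0.0, process_std, n_particles)  # predict
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)          # update
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))                      # MMSE estimate
        idx = rng.choice(n_particles, size=n_particles, p=weights)         # resample
        particles = particles[idx]
    return np.array(estimates)

t = np.linspace(0, 1, 200)
clean = 0.3 * np.exp(-((t - 0.6) ** 2) / 0.002)   # a crude T-wave-like bump
noisy = clean + 0.05 * np.random.randn(t.size)
print(bootstrap_particle_filter(noisy)[:5])
```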

Keywords: ECG, MATLAB, Bayesian filtering, particle filter, Verilog hardware descriptive language

Procedia PDF Downloads 367
1919 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language

Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot

Abstract:

The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, with arduous challenges still present in preparing such systems. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a Parts-of-Speech (POS) classification system in the Tigrinya language. The size of the initial Nagaoka dataset was increased, bringing the new tagged corpus to 118K tokens, which comprise the 12 basic POS annotations used previously. The additional content was also annotated manually in a stringent manner, followed rules similar to those of the former dataset, and was formatted in CONLL format. The system made use of a novel approach to NLP tasks, employing the monolingually pre-trained TiELECTRA, TiBERT and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpassed the previous systems by a significant margin. The system will prove useful in the progress of NLP-related tasks for Tigrinya and similarly related low-resource languages, with room for cross-referencing higher-resource languages.
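
The weighted F1-score quoted above averages per-tag F1 scores weighted by tag frequency; a minimal sketch of how it can be computed is shown below, using toy tag sequences rather than the Tigrinya data.

```python
from sklearn.metrics import f1_score

# Toy gold and predicted POS tags, flattened over all tokens
y_true = ["N", "V", "N", "ADJ", "N", "V", "PUNC"]
y_pred = ["N", "V", "N", "N",   "N", "V", "PUNC"]

# average="weighted" weights each tag's F1 by its support (frequency)
print(f1_score(y_true, y_pred, average="weighted"))
```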

Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields

Procedia PDF Downloads 103
1918 Design of Two-Channel Quadrature Mirror Filter Banks Using a Transformation Approach

Authors: Ju-Hong Lee, Yi-Lin Shieh

Abstract:

Two-dimensional (2-D) quadrature mirror filter (QMF) banks have been widely considered for high-quality coding of image and video data at low bit rates. Without implementing subband coding, a 2-D QMF bank is required to have an exactly linear-phase response without magnitude distortion, i.e., the perfect reconstruction (PR) characteristics. The design problem of 2-D QMF banks with the PR characteristics has been considered in the literature for many years. This paper presents a transformation approach for designing 2-D two-channel QMF banks. Under a suitable one-dimensional (1-D) to two-dimensional (2-D) transformation with a specified decimation/interpolation matrix, the analysis and synthesis filters of the QMF bank are composed of 1-D causal and stable digital allpass filters (DAFs) and possess the 2-D doubly complementary half-band (DC-HB) property. This facilitates the design problem of the two-channel QMF banks by finding the real coefficients of the 1-D recursive DAFs. The design problem is formulated based on the minimax phase approximation for the 1-D DAFs. A novel objective function is then derived to obtain an optimization for 1-D minimax phase approximation. As a result, the problem of minimizing the objective function can be simply solved by using the well-known weighted least-squares (WLS) algorithm in the minimax (L∞) optimal sense. The novelty of the proposed design method is that the design procedure is very simple and the designed 2-D QMF bank achieves perfect magnitude response and possesses satisfactory phase response. Simulation results show that the proposed design method provides much better design performance and much less design complexity as compared with the existing techniques.
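
The WLS step at the core of the design procedure amounts to repeatedly solving a weighted least-squares problem; a generic sketch of that solve is given below, under the assumption of a linear-in-parameters error model, and it is not the paper's exact objective for the allpass phase approximation.

```python
import numpy as np

def weighted_least_squares(A, b, w):
    """Solve min_x sum_i w_i * |A_i x - b_i|^2 via the normal equations
    x = (A^T W A)^{-1} A^T W b. In an iteratively reweighted scheme the
    weights are updated from the current error envelope to push the
    solution toward a minimax (L-infinity) fit."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# Toy over-determined system with heavier weight on the last few equations
A = np.vander(np.linspace(0, 1, 20), 4)
b = np.sin(np.linspace(0, 1, 20))
w = np.ones(20)
w[-5:] = 10.0
print(weighted_least_squares(A, b, w))
```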

Keywords: Quincunx QMF bank, doubly complementary filter, digital allpass filter, WLS algorithm

Procedia PDF Downloads 225
1917 Effect of Citric Acid and Clove on Cured Smoked Meat: A Traditional Meat Product

Authors: Esther Eduzor, Charles A. Negbenebor, Helen O. Agu

Abstract:

Smoking of meat enhances the taste and look of meat; it also increases its longevity and helps preserve the meat by slowing down the spoilage of fat and the growth of bacteria. Lean meat from the forequarter of a beef carcass was obtained from the Maiduguri abattoir. The meat was cut into four portions with weights ranging from 525-545 g, and then into bits measuring about 8 cm in length and 3.5 cm in thickness and weighing 64.5 g. Meat samples were washed, cured with various concentrations of sodium chloride, sodium nitrate, citric acid and clove for 30 min, drained, and smoked in a smoking kiln at a temperature range of 55-60°C for 8 hr a day for 3 days. The products were stored at ambient temperature and evaluated microbiologically and organoleptically. During processing and storage there were increases in pH and free fatty acid content, and a decrease in the water-holding capacity and microbial count of the cured smoked meat. The panelists rated the control samples significantly (p < 0.05) higher in terms of colour, texture, taste and overall acceptability. The following organisms were isolated and identified during storage: Bacillus species, Bacillus subtilis, Streptococcus, Pseudomonas, Aspergillus niger, Candida and Penicillium species. The study forms a basis for new product development for the meat industry.

Keywords: citric acid, cloves, smoked meat, bioengineering

Procedia PDF Downloads 445
1916 The Perspectives of Preparing Psychology Practitioners in Armenian Universities

Authors: L. Petrosyan

Abstract:

The problem of psychologist training remains a key priority in Armenia. During the Soviet period, the notion of a psychologist was obscure not only in Armenia but also in the other Soviet republics. The breakup of the Soviet Union triggered a gradual change in this area, activating cooperation with specialists from other countries. The need for recovery from the psychological trauma caused by the 1988 earthquake pushed forward the development of practical psychology in Armenia. This phenomenon led to positive changes in the perception of, and interest in, the psychologist profession. Armenian universities started designing special programs for the preparation of psychologists, and Armenian psychologists combined their efforts in the field of training relevant specialists. In recent years, the Bologna educational system was introduced in Armenia, which led to the implementation of education quality improvement programs. Nevertheless, even today the issue of psychologists' training is not yet settled in Armenian universities; so far, graduate psychologists have not had a clear idea of the personal and professional qualities of a psychologist. Recently, as a result of educational reforms, the psychology curricula underwent changes, but so far these have not led to the desired outcome. Almost all curricula in certain specialties are aimed at forming professional competencies and strengthening practical skills. A survey conducted in Armenia aimed to identify what ideas young psychology specialists have about the image of a psychologist. The survey respondents were 45 specialists holding a bachelor's degree as well as 30 master's degree graduates who have not yet begun working. The research reveals that we need to change the approach to preparing psychology practitioners in the universities of Armenia. Such an approach to psychologist training will make it possible to train qualified specialists for the enhancement of modern psychology theory and practice.

Keywords: practitioners, psychology degree, study, professional competencies

Procedia PDF Downloads 452
1915 The Yak of Thailand: Folk Icons Transcending Culture, Religion, and Media

Authors: David M. Lucas, Charles W. Jarrett

Abstract:

In the culture of Thailand, the Yak serve as a mediated icon representing strength, power, and mystical protection not only for the Buddha but also for a population of worshipers. Originating from the forests of China, the Yak continue to stand guard at the gates of Buddhist temples, and they represent Thai culture in the hearts of Thai people. This paper presents a qualitative study regarding the curious mix of media, culture, and religion that projects the Yak of Thailand as a larger-than-life message throughout the political, cultural, and religious spheres. Gate guardians, or gods as they are sometimes called, appear throughout the religious temples of Asian cultures; however, Asian cultures demonstrate differences in the artistic rendition (or presentation) of such sentinels. Thailand's gate guardians (the Yak) stand in front of many Buddhist temples, and these iconic figures display unique features with varied symbolic significance. The temple (or wat) plays a vital role in every community, and, for many people, Thailand's temples are the country's most endearing sights. The authors applied folk-nography as a methodology to illustrate the importance of the Thai Yak in serving as meaningful icons that transcend not only time but also culture, religion, and mass media. The Yak hold mythical, religious, artistic, cultural, and militaristic significance for the Thai people. Data collection included interviews, focus groups, and natural observations. This paper summarizes the perceptions of the Thai people concerning their gate sentries and the relationship, communication, connection, and enduring respect that Thai people hold for the guardians of their gates.

Keywords: communication, culture, folknography, icon, image, media, protection, religion, yak

Procedia PDF Downloads 399
1914 Statistical Tools for SFRA Diagnosis in Power Transformers

Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava

Abstract:

For the interpretation of sweep frequency response analysis (SFRA) signatures of transformers, different types of statistical techniques serve as effective tools for either phase-to-phase comparison or sister-unit comparison. In this paper, along with a discussion of SFRA, several statistical techniques such as the cross-correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference, mean square error (MSE) and min-max ratio (MM) are presented through several case studies. These methods require the sample data size and the spot frequencies of the SFRA signatures being compared. The techniques used are based on power signal processing tools that can simplify the results, and limits can be created for the severity of faults occurring in the transformer due to short-circuit forces or ageing. The advantages of using statistical techniques for analyzing SFRA results are indicated through several case studies, and the results obtained determine the state of the transformer.
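
A sketch of how the comparison statistics named above can be computed for two SFRA traces sampled at the same spot frequencies is shown below; the min-max ratio definition used here (sum of point-wise minima over sum of point-wise maxima) and the toy traces are assumptions for illustration, not the paper's data or thresholds.

```python
import numpy as np

def sfra_metrics(trace_a, trace_b):
    """Point-wise comparison statistics for two SFRA signatures sampled at the
    same spot frequencies: cross-correlation coefficient (CCF), mean square
    error (MSE), absolute difference (DABS) and min-max ratio (MM)."""
    a, b = np.asarray(trace_a, float), np.asarray(trace_b, float)
    return {
        "CCF": np.corrcoef(a, b)[0, 1],
        "MSE": np.mean((a - b) ** 2),
        "DABS": np.sum(np.abs(a - b)),
        "MM": np.sum(np.minimum(a, b)) / np.sum(np.maximum(a, b)),
    }

f = np.logspace(1, 6, 200)                      # 10 Hz to 1 MHz spot frequencies
ref = 20 * np.log10(1.0 / (1.0 + f / 1e4))      # reference magnitude trace (dB)
test = ref + np.random.normal(0, 0.2, f.size)   # slightly perturbed trace
print(sfra_metrics(ref, test))
```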

Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), standard deviation (CSD), sweep frequency response analysis (SFRA)

Procedia PDF Downloads 697
1913 Damage Identification Using Experimental Modal Analysis

Authors: Niladri Sekhar Barma, Satish Dhandole

Abstract:

Damage identification in the context of safety has nowadays become a fundamental research interest in the fields of mechanical, civil, and aerospace engineering structures. The following research aims to identify damage in a mechanical beam structure, quantify the severity or extent of the damage in terms of loss of stiffness, and obtain an updated analytical Finite Element (FE) model. An FE model is used for analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with the help of an accelerometer. A Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and subsequently, post-processing is done in MEscopeVES software. The two sets of data, from the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via the modal frequencies using a mixed numerical-experimental technique. Mode shape comparison is performed using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to some real-life structures such as plate and GARTEUR structures.
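
The Modal Assurance Criterion used for the mode-shape comparison is a normalized scalar product between an analytical and an experimental mode shape; a minimal sketch with made-up mode shapes (not the beam's actual data) is given below.

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion between an analytical mode shape phi_a and an
    experimental mode shape phi_e: values close to 1 indicate consistent shapes,
    values near 0 indicate the shapes are unrelated."""
    phi_a, phi_e = np.asarray(phi_a, float), np.asarray(phi_e, float)
    return (phi_a @ phi_e) ** 2 / ((phi_a @ phi_a) * (phi_e @ phi_e))

mode_fe = np.array([0.0, 0.31, 0.59, 0.81, 0.95, 1.0])          # FE-model shape
mode_test = mode_fe + np.random.normal(0, 0.02, mode_fe.size)    # measured shape
print(mac(mode_fe, mode_test))
```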

Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification

Procedia PDF Downloads 116