Search results for: digital learning objects
3652 Adoption of Proactive and Reactive Supply Chain Resilience Strategies: A Comparison between Apparel and Construction Industries in Sri Lanka
Authors: Anuradha Ranawakage, Chathurani Silva
Abstract:
With the growing expansion of global businesses, supply chains are increasingly exposed to numerous disruptions. Organizations adopt various strategies to mitigate the impact of these disruptions. Depending on variations in the conditions and characteristics of supply chains, the adoption of resilience strategies may vary across industries. However, these differences are largely unexplored in the existing literature. Hence, this study aims to evaluate the adoption of three proactive strategies (proactive collaboration, digital connectivity, and integrated SC risk management) and three reactive strategies (reactive collaboration, inventory and reserve capacity, and lifeline maintenance) in the apparel and construction industries in Sri Lanka. An online questionnaire was used to collect data on the implementation of resilience strategies from a sample of 162 apparel and 185 construction companies operating in Sri Lanka. This research makes a significant contribution to the field of supply chain management by assessing the extent to which different resilience strategies are implemented within the apparel and construction industries in Sri Lanka, particularly in an era after a global pandemic that significantly disrupted supply chains all around the world. Keywords: apparel, construction, proactive strategies, reactive strategies, supply chain resilience
Procedia PDF Downloads 56
3651 Role of QR Codes in Environmental Consciousness of Apparel Consumption
Authors: Eleanor L. Kutschera
Abstract:
This study explores the possible impact that QR codes play in helping individuals make more sustainable choices regarding apparel consumption. Data was collected via an online survey to ascertain individuals’ knowledge, attitudes, and behaviors with regard to QR codes and how this impacts their decisions to purchase apparel. Results from 250 participants provide both qualitative and quantitative data that provide valuable information regarding consumers’ use of QR codes and more sustainable purchases. Specifically, results indicate that QR codes are currently under-utilized in the apparel industry but have the potential to generate more environmentally conscious purchases. Also, results posit that while the cost of the item is the most influential factor in purchasing sustainable garments, other factors such as how, where, and what it is made of are in the middle, along with the company’s story/inspiration for creation have an impact. Moreover, participants posit the use of QR codes could make them more informed and empowered consumers, and they would be more likely to make purchases that are better for the environment. Participants’ qualitative responses provide useful incentives that could increase their future sustainable purchases. Finally, this study touches on the study’s limitations, implications, and future direction of research.Keywords: digital ID, QR codes, environmental consciousness, sustainability, fashion industry, apparel consumption
Procedia PDF Downloads 103
3650 The OLOS® Way to Cultural Heritage: User Interface with Anthropomorphic Characteristics
Authors: Daniele Baldacci, Remo Pareschi
Abstract:
Augmented Reality and Augmented Intelligence are radically changing information technology. The path that starts from the keyboard and then, passing through milestones such as Siri, Alexa and other vocal avatars, reaches a more fluid and natural communication with computers, thus converting the dichotomy between man and machine into a harmonious interaction, now heads unequivocally towards a new IT paradigm, where holographic computing will play a key role. The OLOS® platform contributes substantially to this trend in that it infuses computers with human features, by transferring the gestures and expressions of persons of flesh and bones to anthropomorphic holographic interfaces which in turn will use them to interact with real-life humans. In fact, we could say, boldly but with a solid technological background to back the statement, that OLOS® gives reality to an altogether new entity, placed at the exact boundary between nature and technology, namely the holographic human being. Holographic humans qualify as the perfect carriers for the virtual reincarnation of characters handed down from history and tradition. Thus, they provide for an innovative and highly immersive way of experiencing our cultural heritage as something alive and pulsating in the present.Keywords: digital cinematography, human-computer interfaces, holographic simulation, interactive museum exhibits
Procedia PDF Downloads 116
3649 Audio-Lingual Method and the English-Speaking Proficiency of Grade 11 Students
Authors: Marthadale Acibo Semacio
Abstract:
Speaking skill is a crucial part of English language teaching and learning. This actually shows the great importance of this skill in English language classes. Through speaking, ideas and thoughts are shared with other people, and a smooth interaction between people takes place. The study examined the levels of speaking proficiency of the control and experimental groups on pronunciation, grammatical accuracy, and fluency. As a quasi-experimental study, it also determined the presence or absence of significant changes in their speaking proficiency levels in terms of pronouncing the words correctly, the accuracy of grammar and fluency of a language given the two methods to the groups of students in the English language, using the traditional and audio-lingual methods. Descriptive and inferential statistics were employed according to the stated specific problems. The study employed a video presentation with prior information about it. In the video, the teacher acts as model one, giving instructions on what is going to be done, and then the students will perform the activity. The students were paired purposively based on their learning capabilities. Observing proper ethics, their performance was audio recorded to help the researcher assess the learner using the modified speaking rubric. The study revealed that those under the traditional method were more fluent than those in the audio-lingual method. With respect to the way in which each method deals with the feelings of the student, the audio-lingual one fails to provide a principle that would relate to this area and follows the assumption that the intrinsic motivation of the students to learn the target language will spring from their interest in the structure of the language. However, the speaking proficiency levels of the students were remarkably reinforced in reading different words through the aid of aural media with their teachers. The study concluded that using an audio-lingual method of teaching is not a stand-alone method but only an aid of the teacher in helping the students improve their speaking proficiency in the English Language. Hence, audio-lingual approach is encouraged to be used in teaching English language, on top of the chalk-talk or traditional method, to improve the speaking proficiency of students.Keywords: audio-lingual, speaking, grammar, pronunciation, accuracy, fluency, proficiency
Procedia PDF Downloads 70
3648 Identity and Access Management for Medical Cyber-Physical Systems: New Technology and Security Solutions
Authors: Abdulrahman Yarali, Machica McClain
Abstract:
In the context of the increasing use of Cyber-Physical Systems (CPS) across critical infrastructure sectors, this paper addresses a crucial and emerging topic: the integration of Identity and Access Management (IAM) with Internet of Things (IoT) devices in Medical Cyber-Physical Systems (MCPS). It underscores the significance of robust IAM solutions in the expanding interconnection of IoT devices in healthcare settings, leveraging AI, ML, DL, Zero Trust Architecture (ZTA), biometric authentication advancements, and blockchain technologies. The paper advocates for the potential benefits of transitioning from traditional, static IAM frameworks to dynamic, adaptive solutions that can effectively counter sophisticated cyber threats, ensure the integrity and reliability of CPS, and significantly bolster the overall security posture. The paper calls for strategic planning, collaboration, and continuous innovation to harness these benefits. By emphasizing the importance of securing CPS against evolving threats, this research contributes to the ongoing discourse on cybersecurity and advocates for a collaborative approach to foster innovation and enhance the resilience of critical infrastructure in the digital era.Keywords: CPS, IAM, IoT, AI, ML, authentication, models, policies, healthcare
Procedia PDF Downloads 22
3647 Quantitative Phase Imaging System Based on a Three-Lens Common-Path Interferometer
Authors: Alexander Machikhin, Olga Polschikova, Vitold Pozhar, Alina Ramazanova
Abstract:
White-light quantitative phase imaging is an effective technique for achieving sub-nanometer phase sensitivity. Highly stable interferometers based on common-path geometry have been developed in recent years to solve this task. Some of these methods also apply multispectral approach. The purpose of this research is to suggest a simple and effective interferometer for such systems. We developed a three-lens common-path interferometer, which can be used for quantitative phase imaging with or without multispectral modality. The lens system consists of two components, the first one of which is a compound lens, consisting of two lenses. A pinhole is placed between the components. The lens-in-lens approach enables effective light transmission and high stability of the interferometer. The multispectrality is easily implemented by placing a tunable filter in front of the interferometer. In our work, we used an acousto-optical tunable filter. Some design considerations are discussed and multispectral quantitative phase retrieval is demonstrated.Keywords: acousto-optical tunable filter, common-path interferometry, digital holography, multispectral quantitative phase imaging
Procedia PDF Downloads 311
3646 Authorship Profiling of Unidentified Corpora in Saudi Social Media
Authors: Abdulaziz Fageeh
Abstract:
In the bustling digital landscape of Saudi Arabia, a chilling wave of cybercrime has swept across the nation. Among the most nefarious acts are financial blackmail schemes, perpetrated by anonymous actors lurking within the shadows of social media platforms. This chilling reality necessitates the utilization of forensic linguistic techniques to unravel the identities of these virtual villains. This research delves into the complex world of authorship profiling, investigating the effectiveness of various linguistic features in identifying the perpetrators of malicious messages within the Saudi social media environment. By meticulously analyzing patterns of language, vocabulary choice, and stylistic nuances, the study endeavors to uncover the hidden characteristics of the individuals responsible for these heinous acts. Through this linguistic detective work, the research aims to provide valuable insights to investigators and policymakers in the ongoing battle against cybercrime and to shed light on the evolution of malicious online behavior within the Saudi context.Keywords: authorship profiling, arabic linguistics, saudi social media, cybercrime, financial blackmail, linguistic features, forensic linguistics, online threats
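Below is a minimal Python sketch of the kind of stylometric authorship profiling the abstract describes, classifying short messages by character n-gram style features. The library choices, feature settings, and toy English messages are illustrative assumptions; they are not the paper's Arabic corpus, feature set, or case data.

```python
# A minimal stylometric authorship-profiling sketch (assumption: the paper's actual
# features and corpus are not public; this only illustrates the general approach).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training messages (placeholder text, not real case data).
messages = [
    "send the money before midnight or the photos go public",
    "last warning, transfer the amount to the account i told you",
    "hello sir, kindly confirm the payment details we discussed",
    "dear customer, your parcel is waiting, confirm the fee",
]
authors = ["suspect_A", "suspect_A", "suspect_B", "suspect_B"]

# Character n-grams capture sub-word style (spelling habits, punctuation, affixes),
# which tends to be more robust than word features for short social-media texts.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4), min_df=1),
    LinearSVC(),
)
model.fit(messages, authors)

# Attribute an unseen message to the closest stylistic profile.
print(model.predict(["kindly confirm the transfer fee, dear sir"]))
```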
Procedia PDF Downloads 13
3645 An Evaluation of a First Year Introductory Statistics Course at a University in Jamaica
Authors: Ayesha M. Facey
Abstract:
The evaluation sought to determine the factors associated with the high failure rate among students taking a first-year introductory statistics course. By utilizing Tyler’s Objective Based Model, the main objectives were: to assess the effectiveness of the lecturer’s teaching strategies; to determine the proportion of students who attends lectures and tutorials frequently and to determine the impact of infrequent attendance on performance; to determine how the assigned activities assisted in students understanding of the course content; to ascertain the possible issues being faced by students in understanding the course material and obtain possible solutions to the challenges and to determine whether the learning outcomes have been achieved based on an assessment of the second in-course examination. A quantitative survey research strategy was employed and the study population was students enrolled in semester one of the academic year 2015/2016. A convenience sampling approach was employed resulting in a sample of 98 students. Primary data was collected using self-administered questionnaires over a one-week period. Secondary data was obtained from the results of the second in-course examination. Data were entered and analyzed in SPSS version 22 and both univariate and bivariate analyses were conducted on the information obtained from the questionnaires. Univariate analyses provided description of the sample through means, standard deviations and percentages while bivariate analyses were done using Spearman’s Rho correlation coefficient and Chi-square analyses. For secondary data, an item analysis was performed to obtain the reliability of the examination questions, difficulty index and discriminant index. The examination results also provided information on the weak areas of the students and highlighted the learning outcomes that were not achieved. Findings revealed that students were more likely to participate in lectures than tutorials and that attendance was high for both lectures and tutorials. There was a significant relationship between participation in lectures and performance on examination. However, a high proportion of students has been absent from three or more tutorials as well as lectures. A higher proportion of students indicated that they completed the assignments obtained from the lectures sometimes while they rarely completed tutorial worksheets. Students who were more likely to complete their assignments were significantly more likely to perform well on their examination. Additionally, students faced a number of challenges in understanding the course content and the topics of probability, binomial distribution and normal distribution were the most challenging. The item analysis also highlighted these topics as problem areas. Problems doing mathematics and application and analyses were their major challenges faced by students and most students indicated that some of the challenges could be alleviated if additional examples were worked in lectures and they were given more time to solve questions. Analysis of the examination results showed that a number of learning outcomes were not achieved for a number of topics. Based on the findings recommendations were made that suggested adjustments to grade allocations, delivery of lectures and methods of assessment.Keywords: evaluation, item analysis, Tyler’s objective based model, university statistics
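As a brief illustration of the item analysis mentioned above, the following Python sketch computes a difficulty index and an upper-lower-group discrimination index from a 0/1 item-score matrix. The score matrix and the 27% grouping rule are illustrative assumptions, not the study's examination data.

```python
import numpy as np

# Hypothetical 0/1 score matrix: rows = students, columns = exam items.
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

totals = scores.sum(axis=1)              # total score per student
difficulty = scores.mean(axis=0)         # difficulty index: proportion answering correctly

# Discrimination index: difference in item difficulty between the top and bottom 27% groups.
order = np.argsort(totals)
k = max(1, int(round(0.27 * len(totals))))
low, high = scores[order[:k]], scores[order[-k:]]
discrimination = high.mean(axis=0) - low.mean(axis=0)

for i, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"Item {i}: difficulty={p:.2f}, discrimination={d:.2f}")
```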
Procedia PDF Downloads 190
3644 Methodology for the Integration of Object Identification Processes in Handling and Logistic Systems
Authors: L. Kiefer, C. Richter, G. Reinhart
Abstract:
The rising complexity of production systems, driven by an increasing number of variants up to customer-innovated products, leads to requirements that hierarchical control systems are not able to fulfil. Therefore, factory planners can install autonomous manufacturing systems. The fundamental requirement for autonomous control is the identification of objects within production systems. This approach focuses on attribute-based identification to avoid dose-dependent identification costs. Instead of using an identification mark (ID) such as a radio frequency identification (RFID) tag, an object type is identified directly by its attributes. To facilitate this, it is recommended to include the identification and the corresponding sensors within handling processes, which connect all manufacturing processes and therefore ensure a high identification rate and reduce blind spots. The presented methodology reduces the individual effort required to integrate identification processes in handling systems. First, suitable object attributes and sensor systems for object identification in a production environment are defined. By categorising these sensor systems as well as the handling systems, it is possible to match them universally within a compatibility matrix. Based on that compatibility, further requirements such as identification time are analysed, which decide whether a combination of handling and sensor system is well suited for parallel handling and identification within an autonomous control. By analysing a list of more than a thousand possible attributes, first investigations have shown that five main characteristics (weight, form, colour, amount, and position of sub-attributes such as drillings) are sufficient for an integrable identification. This knowledge limits the variety of identification systems and leads to a manageable complexity within the selection process. Besides the procedure, several tools, for example a sensor pool, are presented. These tools include the generated specific expert knowledge and simplify the selection. The primary tool is a pool of preconfigured identification processes depending on the chosen combination of sensor and handling device. By following the defined procedure and using the created tools, even laypeople from other scientific fields can choose an appropriate combination of handling devices and sensors that enables parallel handling and identification. Keywords: agent systems, autonomous control, handling systems, identification
Procedia PDF Downloads 177
3643 Detection of Autistic Children's Voice Based on Artificial Neural Network
Authors: Royan Dawud Aldian, Endah Purwanti, Soegianto Soelistiono
Abstract:
In this research, we developed an automatic system to classify children's voices as normal or autistic using modern computational technology based on an artificial neural network. The advantage of this technology is its capability for processing and storing data. Digital voice features are obtained from linear-predictive coding coefficients computed with the auto-correlation method and transformed into the frequency domain using the fast Fourier transform; these features are used as input to an artificial neural network trained with the back-propagation method, which distinguishes between normal and autistic children automatically. The back-propagation results show a successful classification rate of 100% for the normal children's voice test data and 100% for the autistic children's voice test data. The success rate of the back-propagation classification system for the entire test data is 100%. Keywords: autism, artificial neural network, backpropagation, linear predictive coding, fast Fourier transform
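The following Python sketch illustrates the described pipeline: LPC coefficients via the autocorrelation (Levinson-Durbin) method, a frequency-domain transform by FFT, and a back-propagation-trained neural network classifier. The synthetic frames, labels, and model sizes are placeholders, not the study's data or exact configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def lpc_autocorr(frame, order=10):
    """LPC coefficients via the autocorrelation (Levinson-Durbin) method."""
    n = len(frame)
    r = np.correlate(frame, frame, mode="full")[n - 1:n + order]  # lags 0..order
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= (1.0 - k * k)
    return a[1:]

def features(frame, order=10, nfft=32):
    # Window the frame, compute LPC, then take the FFT magnitude as the feature vector.
    lpc = lpc_autocorr(frame * np.hamming(len(frame)), order)
    return np.abs(np.fft.rfft(lpc, n=nfft))

# Placeholder data: random frames standing in for recorded child-voice segments.
rng = np.random.default_rng(0)
X = np.array([features(rng.standard_normal(400)) for _ in range(40)])
y = np.array([0] * 20 + [1] * 20)   # 0 = typical voice, 1 = autistic voice (toy labels)

# A small multilayer perceptron trained with back-propagation (gradient descent).
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```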
Procedia PDF Downloads 461
3642 Investigation of Different Conditions to Detect Cycles in Linearly Implicit Quantized State Systems
Authors: Elmongi Elbellili, Ben Lauwens, Daan Huybrechs
Abstract:
The increasing complexity of modern engineering systems presents a challenge to the digital simulation of these systems, which can usually be represented by differential equations. The Linearly Implicit Quantized State System (LIQSS) offers an alternative approach to traditional numerical integration techniques for solving Ordinary Differential Equations (ODEs). This method has proved effective for handling discontinuous and large stiff systems. However, the inherently discrete nature of LIQSS may introduce oscillations that result in unnecessary computational steps. The current oscillation detection mechanism relies on a condition that checks the significance of the derivatives, but it could be further improved. This paper describes a different cycle detection mechanism and presents the outcomes of using LIQSS of order one in simulating the advection-diffusion problem. The efficiency of this new cycle detection mechanism is verified by comparing the performance of the current solver against the new version as well as against a reference solution using a Runge-Kutta method of order 14. Keywords: numerical integration, quantized state systems, ordinary differential equations, stiffness, cycle detection, simulation
Procedia PDF Downloads 60
3641 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning
Authors: T. Bryan , V. Kepuska, I. Kostnaic
Abstract:
A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” that are learned from the audio data itself. The basis vectors are shown to provide higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using “envelope samples” of windowed segments of the audio data. The envelope samples are extracted by identifying, via matching pursuit, segments of the audio data that are locally coherent with the Gabor or gammatone seed atoms. The envelope samples are then formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms. SNR data compression curves are generated for speech signals as well as early American music recordings. The basis vectors are shown to have higher denoising capability for data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors. This display method is used to compare the output of the SAE with the envelope samples that produced them. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the basis vectors with the highest denoising capability. Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit
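As an illustration of the atomic decomposition step, the following Python sketch runs greedy matching pursuit over a small Gabor dictionary. The dictionary parameters and toy signal are assumptions for demonstration and do not reproduce the paper's envelope-sampling or autoencoder stages.

```python
import numpy as np

def gabor_atom(n, freq, width, center):
    """A unit-norm Gabor atom: Gaussian envelope times a cosine carrier."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

def matching_pursuit(signal, dictionary, n_iter=20):
    """Greedy atomic decomposition: repeatedly pick the atom most correlated
    with the residual and subtract its contribution."""
    residual = signal.copy()
    decomposition = []
    for _ in range(n_iter):
        correlations = dictionary @ residual
        best = int(np.argmax(np.abs(correlations)))
        coeff = correlations[best]
        residual -= coeff * dictionary[best]
        decomposition.append((best, coeff))
    return decomposition, residual

# Toy audio segment and a small Gabor dictionary (parameters are illustrative only).
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.05 * t) * np.hanning(n) \
         + 0.05 * np.random.default_rng(1).standard_normal(n)
dictionary = np.array([gabor_atom(n, f, w, c)
                       for f in (0.02, 0.05, 0.1)
                       for w in (16, 32, 64)
                       for c in (64, 128, 192)])

atoms, residual = matching_pursuit(signal, dictionary, n_iter=10)
snr = 10 * np.log10(np.sum(signal**2) / np.sum(residual**2))
print(f"kept {len(atoms)} atoms, residual SNR ~ {snr:.1f} dB")
```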
Procedia PDF Downloads 253
3640 Decades of Educational Excellence: Case Studies of Successful Family-Owned Higher Educational Institutions
Authors: Maria Luz Macasinag
Abstract:
This study aims to determine and to examine critically successful family-owned higher educational institutions towards identifying the attributes and practices that may likely have led to their success. This research is confined to private, non-sectarian, family-owned higher institutions of learning that have been operating for more than fifty years, had only one founder and had at least two transitions in terms of generation. The criteria for selecting family-owned universities to be part of the cases under investigation include institutions (1) with increasing enrollment over the past five years, with level III accreditation status, (3) with good performance in the Board examinations in most of its programs and (4) with high employability of graduates. The study uses the multiple case study method. A model based on the cross-case analysis of the attributes and practices of all the case studies of successful family- owned higher institutions of learning is the output. The paper provides insights to current and future school owners and administrators in the management of their institutions for competitiveness, sustainability and advancement. This research encourages the evaluation of how the ideas that may lead to the success of schools owned by families in developing a sense of community, a reciprocal relationship among colleagues, the students and other stakeholders will result to the attainment of the vision and mission of the school. The study is beneficial to entrepreneurs and to business students whose know-how may provide insights that would be helpful in guiding prospective school owners. The commission on higher education and the Department of Education stand to benefit from this academic paper for the guidance that they provide to family-owned educational institutions. Banks and other financial institutions may find valuable ideas from this academic paper for the purpose of providing financial assistance to colleges and universities that are family-owned. Researchers in the field of educational management and administration may be able to extract from this study related topics for future research.Keywords: administration practices, attributes, family-owned schools, success factors
Procedia PDF Downloads 274
3639 Pavement Management for a Metropolitan Area: A Case Study of Montreal
Authors: Luis Amador Jimenez, Md. Shohel Amin
Abstract:
Pavement performance models are based on projections of observed traffic loads, which makes it uncertain to study funding strategies in the long run if history does not repeat itself. Neural networks can be used to estimate deterioration rates, but the learning rate and momentum have not been properly investigated; in addition, economic development could change traffic flows. This study addresses both issues through a case study for the roads of Montreal that simulates traffic for a period of 50 years and deals with the measurement error of the pavement deterioration model. Travel demand models are applied to simulate annual average daily traffic (AADT) every 5 years. Accumulated equivalent single axle loads (ESALs) are calculated from the predicted AADT and locally observed truck distributions combined with truck factors. A back-propagation Neural Network (BPN) method with a Generalized Delta Rule (GDR) learning algorithm is applied to estimate pavement deterioration models capable of overcoming measurement errors. Linear programming for lifecycle optimization is applied to identify M&R strategies that ensure good pavement condition while minimizing the budget. It was found that CAD 150 million is the minimum annual budget needed to keep arterial and local roads in Montreal in good condition. Montreal drivers prefer the use of public transportation for work and education purposes. Vehicle traffic is expected to double within 50 years, and ESALs are expected to double every 15 years. Roads on the island of Montreal need to undergo a stabilization period of about 25 years, after which a steady state seems to be reached. Keywords: pavement management system, traffic simulation, backpropagation neural network, performance modeling, measurement errors, linear programming, lifecycle optimization
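A minimal Python sketch of a back-propagation network trained with the generalized delta rule (gradient step plus momentum) is shown below. The synthetic deterioration-style data, network size, learning rate, and momentum values are illustrative assumptions rather than the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for (accumulated ESALs, pavement age) -> condition index.
X = rng.uniform(0, 1, size=(200, 2))
y = (1.0 - 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * rng.standard_normal(200)).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; weights updated with the generalized delta rule:
# delta_w(t) = -eta * grad + alpha * delta_w(t-1)   (eta: learning rate, alpha: momentum)
W1, b1 = rng.standard_normal((2, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.standard_normal((8, 1)) * 0.5, np.zeros(1)
eta, alpha = 0.05, 0.9
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)

for epoch in range(2000):
    h = sigmoid(X @ W1 + b1)          # forward pass
    out = h @ W2 + b2                 # linear output for regression
    err = out - y
    # back-propagate the error
    gW2 = h.T @ err / len(X);  gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)
    gW1 = X.T @ dh / len(X);   gb1 = dh.mean(axis=0)
    # generalized delta rule updates (gradient term plus momentum term)
    vW2 = -eta * gW2 + alpha * vW2;  W2 += vW2
    vb2 = -eta * gb2 + alpha * vb2;  b2 += vb2
    vW1 = -eta * gW1 + alpha * vW1;  W1 += vW1
    vb1 = -eta * gb1 + alpha * vb1;  b1 += vb1

pred = sigmoid(X @ W1 + b1) @ W2 + b2
print("final MSE:", float(np.mean((pred - y) ** 2)))
```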
Procedia PDF Downloads 460
3638 Regional Treatment Trends in Canada Derived from Pharmacy Records
Abstract:
Cardiometabolic conditions (hypertension, diabetes, and hyperlipidemia) are major public health concerns. Analysis of all prescription records from about 10 million patients at the largest network of pharmacies in Canada reveals small year-over-year increases in the treatment prevalence of cardiometabolic diseases prior to the COVID-19 pandemic. Cardiometabolic treatment rates increase with age and are higher in males than females. Hypertension treatment rates were 24% in males and 19% in females in 2021. Diabetes treatment rates were 10% in males and 7% in females in 2021. Geospatial analysis using patient addresses reveals interesting differences among provinces and neighborhoods in Canada. Using digital surveys distributed among 8,504 Canadian adults, an increase in hypertension awareness with age and female gender was observed. However, 7% of seniors and 6% of middle-aged Canadians reported uncontrolled blood pressure (>140/90 mmHg). In addition, elevated blood pressure (130-139/80-89 mmHg) was reported by 20% of seniors and 14% of middle-aged Canadians.Keywords: cardiometabolic conditions, diabetes, hypertension, precision public health
Procedia PDF Downloads 116
3637 Tip60 Histone Acetyltransferase Activators as Neuroepigenetic Therapeutic Modulators for Alzheimer’s Disease
Authors: Akanksha Bhatnagar, Sandhya Kortegare, Felice Elefant
Abstract:
Context: Alzheimer's disease (AD) is a neurodegenerative disorder that is characterized by progressive cognitive decline and memory loss. The cause of AD is not fully understood, but it is thought to be caused by a combination of genetic, environmental, and lifestyle factors. One of the hallmarks of AD is the loss of neurons in the hippocampus, a brain region that is important for memory and learning. This loss of neurons is thought to be caused by a decrease in histone acetylation, which is a process that regulates gene expression. Research Aim: The research aim of the study was to develop small molecule compounds that can enhance the activity of Tip60, a histone acetyltransferase that is important for memory and learning. Methodology/Analysis: The researchers used in silico structural modeling and a pharmacophore-based virtual screening approach to design and synthesize small molecule compounds strongly predicted to target and enhance Tip60’s HAT activity. The compounds were then tested in vitro and in vivo to assess their ability to enhance Tip60 activity and rescue cognitive deficits in AD models. Findings: The researchers found that several of the compounds were able to enhance Tip60 activity and rescue cognitive deficits in AD models. The compounds were also developed to cross the blood-brain barrier, which is an important factor for the development of potential AD therapeutics. Theoretical Importance: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than other HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Data Collection: The study collected data from a variety of sources, including in vitro assays and animal models. The in vitro assays assessed the ability of compounds to enhance Tip60 activity using histone acetyltransferase (HAT) enzyme assays and chromatin immunoprecipitation assays. Animal models were used to assess the ability of the compounds to rescue cognitive deficits in AD models using a variety of behavioral tests, including locomotor ability, sensory learning, and recognition tasks. Human clinical trials will be used to assess the safety and efficacy of the compounds in humans. Questions: The question addressed by this study was whether Tip60 HAT activators could be developed as therapeutic agents for AD. Conclusions: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than other HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Further research is needed to confirm the safety and efficacy of these compounds in humans. Keywords: Alzheimer's disease, cognition, neuroepigenetics, drug discovery
Procedia PDF Downloads 75
3636 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025
Authors: Shruti Chopra, Vikas Rao Vadi
Abstract:
In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, machines to machines (M2M) and machines to humans provides it a potential to transform the society and establish an ecosystem to serve new dimensions to the economy of the country. Thereby, this study has attempted to identify the challenges that seem prevalent in IoT adoption in India through the literature survey. Further, the data has been collected by taking the opinions of experts to conduct the foresight analysis and it has been analyzed with the help of scenario planning process – Micmac, Mactor, Multipol, and Smic-Prob. As a methodology, the study has identified the relationship between variables through variable analysis using Micmac and actor analysis using Mactor, this paper has attempted to generate the entire field of possibilities in terms of hypotheses and construct various scenarios through Multipol. And lastly, the findings of the study include final scenarios that are selected using Smic-Prob by assigning the probability to all the scenarios (including the conditional probability). This study may help the practitioners and policymakers to remove the obstacles to successfully implement the IoT in India.Keywords: Internet of Thing (IoT), foresight analysis, scenario planning, challenges, policymaking
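The following Python sketch illustrates a MICMAC-style structural analysis of the kind referenced above: influence and dependence scores are obtained from a direct cross-impact matrix and its powers, which propagate indirect effects through chains of variables. The variables and matrix values are purely illustrative and are not the experts' actual ratings from the study.

```python
import numpy as np

# Hypothetical direct-influence matrix for a few IoT-adoption variables
# (0 = no influence, 3 = strong influence); values are illustrative only.
variables = ["connectivity", "data privacy", "skills", "policy", "cost"]
D = np.array([
    [0, 2, 1, 1, 3],
    [1, 0, 0, 2, 1],
    [2, 1, 0, 1, 2],
    [3, 3, 2, 0, 2],
    [2, 1, 1, 1, 0],
])

# MICMAC-style indirect analysis: powers of the matrix accumulate influence
# that propagates through intermediate variables.
M = D.astype(float)
indirect = sum(np.linalg.matrix_power(M, k) for k in range(1, 5))

influence = indirect.sum(axis=1)    # row sums: how strongly a variable drives the system
dependence = indirect.sum(axis=0)   # column sums: how strongly it is driven by the system

for name, infl, dep in zip(variables, influence, dependence):
    print(f"{name:13s} influence={infl:9.0f} dependence={dep:9.0f}")
```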
Procedia PDF Downloads 147
3635 Sound Quality Analysis of Sloshing Noise from a Rectangular Tank
Authors: Siva Teja Golla, B. Venkatesham
Abstract:
The recent technologies in hybrid and high-end cars have subsided the noise from major sources like engines and transmission systems. This resulted in the unmasking of the previously subdued noises. These noises are becoming noticeable to the passengers, causing annoyance to them and affecting the perceived quality of the vehicle. Sloshing in the fuel tank is one such source of noise. Sloshing occurs due to the excitations undergone by the fuel tank due to the vehicle's movement. Sloshing noise occurs due to the interaction of the fluid with the surrounding tank walls or with the fluid itself. The noise resulting from the interaction of the fluid with the structure is ‘Hit noise’, and the noise due to fluid-fluid interaction is ‘Splash noise’. The type of interactions the fluid undergoes inside the tank, and the type of noise generated depends on a variety of factors like the fill level of the tank, type of fluid, presence of objects like baffles inside the tank, type and strength of the excitation, etc. There have been studies done to understand the effect of each of these parameters on the generation of different types of sloshing noises. But little work is done in the psychoacoustic aspect of these sounds. The psychoacoustic study of the sloshing noises gives an understanding of the level of annoyance it can cause to the passengers and helps in taking necessary measures to address it. In view of this, the current paper focuses on the calculation of the psychoacoustic parameters like loudness, sharpness, roughness and fluctuation strength for the sloshing noise. As the noise generation mechanisms for the hit and splash noises are different, these parameters are calculated separately for them. For this, the fluid flow regimes that predominantly cause the hit-and-splash noises are to be separately emulated inside the tank. This is done through a reciprocating test rig, which imposes reciprocating excitation to a rectangular tank filled with the fluid. By varying the frequency of excitation, the fluid flow regimes with the predominant generation of hit-and-splash noises can be separately created inside the tank. These tests are done in a quiet room and the noise generated is captured using microphones and is used for the calculation of psychoacoustic parameters of the sloshing noise. This study also includes the effect of fill level and the presence of baffles inside the tank on these parameters.Keywords: sloshing, hit noise, splash noise, sound quality
Procedia PDF Downloads 30
3634 An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection
Authors: K. R. Roopesh Bharatwaj, Avinash Maharana, Favour Tobi Aborisade, Roger Young
Abstract:
Presently, there are few cases of complete automation of drones and its allied intelligence capabilities. In essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving, and obstacle avoidance. It does this through advanced Reinforcement Learning Techniques and performs object detection using latest advanced algorithms, which are capable of processing light weight models with fast training in real time instances. For the scope of this paper, after researching on the various algorithms and comparing them, we finally implemented the Deep-Q-Networks (DQN) algorithm in the AirSim Simulator. In future works, we plan to implement further advanced self-driving and object detection algorithms, we also plan to implement voice-based speech recognition for the entire drone operation which would provide an option of speech communication between users (People) and the drone in the time of unavoidable circumstances. Thus, making drones an interactive intelligent Robotic Voice Enabled Service Assistant. This proposed drone has a wide scope of usability and is applicable in scenarios such as Disaster management, Air Transport of essentials, Agriculture, Manufacturing, Monitoring people movements in public area, and Defense. Also discussed, is the entire drone communication based on the satellite broadband Internet technology for faster computation and seamless communication service for uninterrupted network during disasters and remote location operations. This paper will explain the feasible algorithms required to go about achieving this goal and is more of a reference paper for future researchers going down this path.Keywords: convolution neural network, natural language processing, obstacle avoidance, satellite broadband technology, self-driving
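Below is a minimal Python (PyTorch) sketch of the DQN learning step referenced above, written against a generic observation/action interface rather than the actual AirSim environment; the network size, hyperparameters, and toy transition are assumptions, not the authors' configuration.

```python
import random
from collections import deque
import torch
import torch.nn as nn

# Minimal DQN components (a sketch of the learning step only, not the AirSim setup).
class QNet(nn.Module):
    def __init__(self, obs_dim=4, n_actions=3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, x):
        return self.net(x)

obs_dim, n_actions, gamma = 4, 3, 0.99
policy, target = QNet(obs_dim, n_actions), QNet(obs_dim, n_actions)
target.load_state_dict(policy.state_dict())
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)

def act(obs, epsilon=0.1):
    """Epsilon-greedy action selection."""
    if random.random() < epsilon:
        return random.randrange(n_actions)
    with torch.no_grad():
        return int(policy(torch.as_tensor(obs, dtype=torch.float32)).argmax())

def train_step(batch_size=32):
    """One Bellman-target update over a sampled minibatch."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    obs, act_, rew, nxt, done = map(torch.as_tensor, zip(*batch))
    obs, nxt = obs.float(), nxt.float()
    q = policy(obs).gather(1, act_.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        q_next = target(nxt).max(dim=1).values
        y = rew.float() + gamma * q_next * (1 - done.float())
    loss = nn.functional.mse_loss(q, y)
    optimizer.zero_grad(); loss.backward(); optimizer.step()

# Usage with a toy transition (a real loop would step the drone simulator instead).
replay.append(([0.0] * obs_dim, act([0.0] * obs_dim), 1.0, [0.1] * obs_dim, 0.0))
train_step(batch_size=1)
```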
Procedia PDF Downloads 251
3633 A Longitudinal Case Study of Greek as a Second Language
Authors: M. Vassou, A. Karasimos
Abstract:
A primary concern in the field of Second Language Acquisition (SLA) research is to determine the innate mechanisms of second language learning and acquisition through the systematic study of a learner's interlanguage. Errors emerge while a learner attempts to communicate using the target-language and can be seen either as the observable linguistic product of the latent cognitive and language process of mental representations or as an indispensable learning mechanism. Therefore, the study of the learner’s erroneous forms may depict the various strategies and mechanisms that take place during the language acquisition process resulting in deviations from the target-language norms and difficulties in communication. Mapping the erroneous utterances of a late adult learner in the process of acquiring Greek as a second language constitutes one of the main aims of this study. For our research purposes, we created an error-tagged learner corpus composed of the participant’s written texts produced throughout a period of a 4- year instructed language acquisition. Error analysis and interlanguage theory constitute the methodological and theoretical framework, respectively. The research questions pertain to the learner's most frequent errors per linguistic category and per year as well as his choices concerning the Greek Article System. According to the quantitative analysis of the data, the most frequent errors are observed in the categories of the stress system and syntax, whereas a significant fluctuation and/or gradual reduction throughout the 4 years of instructed acquisition indicate the emergence of developmental stages. The findings with regard to the article usage bespeak fossilization of erroneous structures in certain contexts. In general, our results point towards the existence and further development of an established learner’s (inter-) language system governed not only by mother- tongue and target-language influences but also by the learner’s assumptions and set of rules as the result of a complex cognitive process. It is expected that this study will contribute not only to the knowledge in the field of Greek as a second language and SLA generally, but it will also provide an insight into the cognitive mechanisms and strategies developed by multilingual learners of late adulthood.Keywords: Greek as a second language, error analysis, interlanguage, late adult learner
Procedia PDF Downloads 128
3632 The Moderating Role of Perceived University Environment in the Formation of Entrepreneurial Intention among Creative Industries Students
Authors: Patrick Ebong Ebewo
Abstract:
The trend of high unemployment levels globally is a growing concern, which suggests that university students, especially those studying the creative industries, are most likely to face unemployment upon completion of their studies. Therefore, the efforts of universities in fostering entrepreneurial knowledge are equally important to the development of students' soft skills. The purpose of this paper is to assess the significance of the perceived university environment and perceived educational support in influencing university students' intentions to start their own businesses in the future, thus attempting to answer the question 'How does the perceived university environment affect students' attitude towards entrepreneurship as a career option, perceived entrepreneurial abilities, subjective norm and entrepreneurial intentions?' The study is based on the Theory of Planned Behaviour model adapted from previous studies and empirically tested on graduates of the Tshwane University of Technology. A sample of 150 Arts and Design graduates took part in the study, and the data collected were analysed using structural equation modelling (SEM). Our findings seem to suggest an indirect impact of the perceived university environment on entrepreneurial intention through perceived environment support and perceived entrepreneurial abilities. Thus, any improvement in the perceived university environment might influence students to become entrepreneurs. Based on these results, it is recommended that: (a) Tshwane University of Technology and other universities of technology should establish an 'Entrepreneurship Internship Programme' as a tool for stimulated work-integrated learning; post-graduation intervention could be implemented through the development of a 'Graduate Entrepreneurship Program' embedded in the Bachelor of Technology (B-Tech, now Advanced Diploma) and postgraduate courses; and (b) policymakers should consider the development of a coherent national policy framework that addresses entrepreneurship for the arts/creative industries sector. This would create an enabling environment for the evolution of Higher Education Institutions from merely teaching, learning and research to becoming drivers of creative entrepreneurship. Keywords: business venture, entrepreneurship education, entrepreneurial intent, university environment
Procedia PDF Downloads 336
3631 The Impacts of an Adapted Literature Circle Model on Reading Comprehension, Engagement, and Cooperation in an EFL Reading Course
Authors: Tiantian Feng
Abstract:
There is a dearth of research on the literary circle as a teaching strategy in English as a Foreign Language (EFL) classes in Chinese colleges and universities and even fewer empirical studies on its impacts. In this one-quarter, design-based project, the researcher aims to increase students’ engagement, cooperation, and, on top of that, reading comprehension performance by utilizing a researcher-developed, adapted reading circle model in an EFL reading course at a Chinese college. The model also integrated team-based learning and portfolio assessment, with an emphasis on the specialization of individual responsibilities, contributions, and outcomes in reading projects, with the goal of addressing current issues in EFL classes at Chinese colleges, such as passive learning, test orientation, ineffective and uncooperative teamwork, and lack of dynamics. In this quasi-experimental research, two groups of students enrolled in the course were invited to participate in four in-class team projects, with the intervention class following the adapted literature circle model and team members rotating as Leader, Coordinator, Brain trust, and Reporter. The researcher/instructor used a sequential explanatory mixed-methods approach to quantitatively analyze the final grades for the pre-and post-tests, as well as individual scores for team projects and will code students' artifacts in the next step, with the results to be reported in a subsequent paper(s). Initial analysis showed that both groups saw an increase in final grades, but the intervention group enjoyed a more significant boost, suggesting that the adapted reading circle model is effective in improving students’ reading comprehension performance. This research not only closes the empirical research gap of literature circles in college EFL classes in China but also adds to the pool of effective ways to optimize reading comprehension performance and class performance in college EFL classes.Keywords: literature circle, EFL teaching, college english reading, reading comprehension
Procedia PDF Downloads 100
3630 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison
Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo
Abstract:
One research line of computer science involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent by storing and subsequently analyzing the electrical signals of the brain in order to use them to control electronic devices. Affective computing research, on the other hand, applies human emotions to the HCI process, helping to reduce user frustration. This paper presents the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of electrical brain activity patterns. The hardware comprises the sensing stage and analog-to-digital conversion. The interface software comprises algorithms for pre-processing the signal with time- and frequency-domain analysis and for classifying patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately using a publicly accessible database, together with a comparison among classifiers to determine the best-performing one. Keywords: affective computing, interface, brain, intelligent interaction
Procedia PDF Downloads 389
3629 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory
Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock
Abstract:
Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in the text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks like sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will boost research in the above-mentioned areas significantly. It can also come in handy for platforms like Wikipedia, where the use of neutral language is of importance. The goal of this work is to identify the subjectively biased language in text on a sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporated with attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), which was compiled from Wikipedia edits that removed various biased instances from sentences as a benchmark dataset, with which we also compare our model to existing approaches. Experimental analysis indicates an improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.Keywords: subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing
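A compact Python (PyTorch + Hugging Face Transformers) sketch of the described architecture is given below: BERT token embeddings feed a BiLSTM, an attention layer pools the sequence, and a linear head produces the binary subjectivity prediction. The model name, dimensions, and example sentences are assumptions, and the WNC data and training loop are not reproduced here.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class BertBiLSTMAttention(nn.Module):
    """BERT token embeddings -> BiLSTM -> additive attention pooling -> binary classifier."""
    def __init__(self, bert_name="bert-base-uncased", hidden=128):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, 2)   # 2 classes: neutral vs. subjectively biased

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)                                    # (B, T, 2*hidden)
        scores = self.attn(h).squeeze(-1)                        # (B, T)
        scores = scores.masked_fill(attention_mask == 0, -1e9)   # ignore padding tokens
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * h).sum(dim=1)                       # attention-pooled sentence vector
        return self.out(context)

# Usage sketch (sentences are placeholders, not the WNC corpus).
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertBiLSTMAttention()
batch = tok(["the senator bravely defended the bill", "the senator defended the bill"],
            padding=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.softmax(dim=-1))
```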
Procedia PDF Downloads 130
3628 An Approach on Intelligent Tolerancing of Car Body Parts Based on Historical Measurement Data
Authors: Kai Warsoenke, Maik Mackiewicz
Abstract:
To achieve a high quality of assembled car body structures, tolerancing is used to ensure a geometric accuracy of the single car body parts. There are two main techniques to determine the required tolerances. The first is tolerance analysis which describes the influence of individually tolerated input values on a required target value. Second is tolerance synthesis to determine the location of individual tolerances to achieve a target value. Both techniques are based on classical statistical methods, which assume certain probability distributions. To ensure competitiveness in both saturated and dynamic markets, production processes in vehicle manufacturing must be flexible and efficient. The dimensional specifications selected for the individual body components and the resulting assemblies have a major influence of the quality of the process. For example, in the manufacturing of forming tools as operating equipment or in the higher level of car body assembly. As part of the metrological process monitoring, manufactured individual parts and assemblies are recorded and the measurement results are stored in databases. They serve as information for the temporary adjustment of the production processes and are interpreted by experts in order to derive suitable adjustments measures. In the production of forming tools, this means that time-consuming and costly changes of the tool surface have to be made, while in the body shop, uncertainties that are difficult to control result in cost-intensive rework. The stored measurement results are not used to intelligently design tolerances in future processes or to support temporary decisions based on real-world geometric data. They offer potential to extend the tolerancing methods through data analysis and machine learning models. The purpose of this paper is to examine real-world measurement data from individual car body components, as well as assemblies, in order to develop an approach for using the data in short-term actions and future projects. For this reason, the measurement data will be analyzed descriptively in the first step in order to characterize their behavior and to determine possible correlations. In the following, a database is created that is suitable for developing machine learning models. The objective is to create an intelligent way to determine the position and number of measurement points as well as the local tolerance range. For this a number of different model types are compared and evaluated. The models with the best result are used to optimize equally distributed measuring points on unknown car body part geometries and to assign tolerance ranges to them. The current results of this investigation are still in progress. However, there are areas of the car body parts which behave more sensitively compared to the overall part and indicate that intelligent tolerancing is useful here in order to design and control preceding and succeeding processes more efficiently.Keywords: automotive production, machine learning, process optimization, smart tolerancing
Procedia PDF Downloads 117
3627 The Applications of Wire Print in Composite Material Research and Fabrication Process
Authors: Hsu Yi-Chia, Hoy June-Hao
Abstract:
FDM (Fused Deposition Modeling) is a rapid prototyping method that requires no mold; however, high material and time costs have always been a major disadvantage. Wire-printing is a next-generation technology that is more flexible and easier to apply on 3D printers and robotic-arm printing, and it can create its own construction methods. The research is divided into three parts. The first concerns the method of parameterizing the generated paths and converting them into g-code for wire-printing. The second concerns material experiments and the application of effects. The third concerns improving the operation of the mechanical equipment and the design of the robotic tool-head. The purpose of this study is to develop a new wire-print method that can efficiently generate line segments and paths in three-dimensional space. Parametric modeling software transforms the digital model into g-code for a 3D printer or robotic arm; this article uses thermoplastic, clay, and composite materials for testing. The combination of materials and the wire-print process gives architects and designers the ability to research and develop works and construction methods in the future. Keywords: parametric software, wire print, robotic arms fabrication, composite filament additive manufacturing
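As an illustration of the path-to-g-code conversion described above, the following Python sketch turns an ordered list of 3D path points into simple G1 moves. The feed rate, extrusion factor, and helical example path are illustrative assumptions, not the study's actual toolpath parameters.

```python
import math

def path_to_gcode(points, feed_rate=600, extrusion_per_mm=0.05):
    """Convert an ordered list of (x, y, z) points into simple G-code moves.
    Feed rate and extrusion factor are illustrative placeholders."""
    lines = ["G21 ; millimetre units", "G90 ; absolute positioning", "G92 E0 ; reset extruder"]
    e = 0.0
    x0, y0, z0 = points[0]
    lines.append(f"G0 X{x0:.3f} Y{y0:.3f} Z{z0:.3f} F{feed_rate}")
    for x, y, z in points[1:]:
        dist = math.dist((x0, y0, z0), (x, y, z))
        e += dist * extrusion_per_mm          # extrude in proportion to travel length
        lines.append(f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} E{e:.4f} F{feed_rate}")
        x0, y0, z0 = x, y, z
    return "\n".join(lines)

# Example: a short helical wire segment generated parametrically.
helix = [(10 * math.cos(t), 10 * math.sin(t), 2 * t)
         for t in [i * math.pi / 8 for i in range(9)]]
print(path_to_gcode(helix))
```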
Procedia PDF Downloads 131
3626 Dynamic Mechanical Analysis of Supercooled Water in Nanoporous Confinement and Biological Systems
Authors: Viktor Soprunyuk, Wilfried Schranz, Patrick Huber
Abstract:
In the present work, we show that Dynamic Mechanical Analysis (DMA) with a measurement frequency range f = 0.2–100 Hz is a rather powerful technique for the study of phase transitions (freezing and melting) and glass transitions of water in geometrical confinement. Inserting water into nanoporous host matrices, e.g. Gelsil (pore sizes 2.6 nm and 5 nm) or Vycor (pore size 10 nm), allows one to study size effects occurring at the nanoscale conveniently in macroscopic bulk samples. One obtains valuable insight concerning confinement-induced changes of the dynamics by measuring the temperature and frequency dependencies of the complex Young's modulus Y* for various pore sizes. Solid-liquid transitions or glass-liquid transitions show up as a softening of the real part Y' of the complex Young's modulus, yet with completely different frequency dependencies. Analysing the frequency-dependent imaginary part of the Young's modulus in the glass transition regions for different pore sizes, we find a clear-cut 1/d-dependence of the calculated glass transition temperatures, which extrapolates to Tg(1/d=0) = 136 K, in agreement with the traditional value for water. The results indicate that the main role of the pore diameter is to set the relative amount of water molecules that are near an interface within a length scale of the order of the dynamic correlation length ξ. Thus we argue that the observed strong pore size dependence of Tg is an interfacial effect rather than a finite size effect. We obtained similar signatures of Y* near glass transitions in different biological objects (fruits, vegetables, and bread). The values of the activation energies for these biological materials in the glass transition region are quite similar to those of supercooled water in nanoporous confinement in this region. The present work was supported by the Austrian Science Fund (FWF, project Nr. P 28672 – N36). Keywords: biological systems, liquids, glasses, amorphous systems, nanoporous materials, phase transition
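The 1/d extrapolation mentioned above can be illustrated with a short Python sketch that fits Tg against inverse pore diameter and reads off the bulk limit. The Tg values used here are placeholders chosen to be consistent with the quoted Tg(1/d=0) = 136 K, not the measured data.

```python
import numpy as np

# Illustrative (not measured) Tg values for pore diameters d, assuming the linear
# Tg = Tg_bulk + slope * (1/d) relation described in the abstract.
d_nm = np.array([2.6, 5.0, 10.0])          # Gelsil 2.6 nm, Gelsil 5 nm, Vycor 10 nm
tg_K = np.array([168.0, 153.0, 144.0])     # placeholder glass transition temperatures

slope, intercept = np.polyfit(1.0 / d_nm, tg_K, deg=1)
print(f"Tg(1/d -> 0) = {intercept:.1f} K   (bulk-water extrapolation)")
print(f"slope = {slope:.1f} K*nm")
```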
Procedia PDF Downloads 238
3625 Application of MALDI-MS to Differentiate SARS-CoV-2 and Non-SARS-CoV-2 Symptomatic Infections in the Early and Late Phases of the Pandemic
Authors: Dmitriy Babenko, Sergey Yegorov, Ilya Korshukov, Aidana Sultanbekova, Valentina Barkhanskaya, Tatiana Bashirova, Yerzhan Zhunusov, Yevgeniya Li, Viktoriya Parakhina, Svetlana Kolesnichenko, Yeldar Baiken, Aruzhan Pralieva, Zhibek Zhumadilova, Matthew S. Miller, Gonzalo H. Hortelano, Anar Turmuhambetova, Antonella E. Chesca, Irina Kadyrova
Abstract:
Introduction: The rapidly evolving COVID-19 pandemic, along with the re-emergence of pathogens causing acute respiratory infections (ARI), has necessitated the development of novel diagnostic tools to differentiate various causes of ARI. MALDI-MS, due to its wide usage and affordability, has been proposed as a potential instrument for diagnosing SARS-CoV-2 versus non-SARS-CoV-2 ARI. The aim of this study was to investigate the potential of MALDI-MS in conjunction with a machine learning model to accurately distinguish between symptomatic infections caused by SARS-CoV-2 and non-SARS-CoV-2 during both the early and later phases of the pandemic. Furthermore, this study aimed to analyze mass spectrometry (MS) data obtained from nasal swabs of healthy individuals. Methods: We gathered mass spectra from 252 samples, comprising 108 SARS-CoV-2-positive samples obtained in 2020 (Covid 2020), 7 SARS-CoV- 2-positive samples obtained in 2023 (Covid 2023), 71 samples from symptomatic individuals without SARS-CoV-2 (Control non-Covid ARVI), and 66 samples from healthy individuals (Control healthy). All the samples were subjected to RT-PCR testing. For data analysis, we employed the caret R package to train and test seven machine-learning algorithms: C5.0, KNN, NB, RF, SVM-L, SVM-R, and XGBoost. We conducted a training process using a five-fold (outer) nested repeated (five times) ten-fold (inner) cross-validation with a randomized stratified splitting approach. Results: In this study, we utilized the Covid 2020 dataset as a case group and the non-Covid ARVI dataset as a control group to train and test various machine learning (ML) models. Among these models, XGBoost and SVM-R demonstrated the highest performance, with accuracy values of 0.97 [0.93, 0.97] and 0.95 [0.95; 0.97], specificity values of 0.86 [0.71; 0.93] and 0.86 [0.79; 0.87], and sensitivity values of 0.984 [0.984; 1.000] and 1.000 [0.968; 1.000], respectively. When examining the Covid 2023 dataset, the Naive Bayes model achieved the highest classification accuracy of 43%, while XGBoost and SVM-R achieved accuracies of 14%. For the healthy control dataset, the accuracy of the models ranged from 0.27 [0.24; 0.32] for k-nearest neighbors to 0.44 [0.41; 0.45] for the Support Vector Machine with a radial basis function kernel. Conclusion: Therefore, ML models trained on MALDI MS of nasopharyngeal swabs obtained from patients with Covid during the initial phase of the pandemic, as well as symptomatic non-Covid individuals, showed excellent classification performance, which aligns with the results of previous studies. However, when applied to swabs from healthy individuals and a limited sample of patients with Covid in the late phase of the pandemic, ML models exhibited lower classification accuracy.Keywords: SARS-CoV-2, MALDI-TOF MS, ML models, nasopharyngeal swabs, classification
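The following Python sketch illustrates the nested, stratified cross-validation scheme described above using scikit-learn, with an RBF-kernel SVM standing in for one of the seven candidate models. The original work used R's caret package and real MALDI-MS spectra, so the random placeholder data, fold counts, and parameter grid here are assumptions.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder "spectra": random feature vectors standing in for binned MALDI-MS peaks.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 50))
y = np.array([1] * 60 + [0] * 60)   # 1 = SARS-CoV-2 positive, 0 = non-COVID ARI (toy labels)

# Inner loop: hyperparameter search; outer loop: unbiased performance estimate.
inner = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
search = GridSearchCV(svm, {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]},
                      cv=inner, scoring="accuracy")

nested_scores = cross_val_score(search, X, y, cv=outer, scoring="accuracy")
print("nested CV accuracy: %.2f +/- %.2f" % (nested_scores.mean(), nested_scores.std()))
```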
Procedia PDF Downloads 108
3624 Value in Exchange: The Importance of Users Interaction as the Center of User Experiences
Authors: Ramlan Jantan, Norfadilah Kamaruddin, Shahriman Zainal Abidin
Abstract:
In this era of technology, the co-creation method has become a new development trend. In this light, most design businesses have transformed their development strategy from being goods-dominant to service-dominant, where more attention is given to the end-users and their roles in the development process. As a result, the conventional development process has been replaced with a more cooperative one. Consequently, numerous studies have been conducted to explore the extension of the co-creation method in the design development process, and most studies have focused on issues found during the production process. In contrast, this study aims to investigate potential values established during the pre-production process, which is also known as 'circumstances value creation'. User involvement is questioned and critically debated at the entry level of the pre-production process, within the jointly created value-in-exchange sphere where user experiences take place. Thus, this paper proposes a potential framework of the co-creation method for Malaysian interactive product development. The framework is formulated from both parties involved: the users and the designers. The framework clearly explains the value of the co-creation method, and it could assist relevant design industries and companies in developing a blueprint for the design process. This paper further contributes to the literature on the co-creation of value and digital ecosystems. Keywords: co-creation method, co-creation framework, co-creation, co-production
Procedia PDF Downloads 178
3623 A Simple and Efficient Method for Accurate Measurement and Control of Power Frequency Deviation
Authors: S. J. Arif
Abstract:
In the presented technique, a simple method is given for accurate measurement and control of power frequency deviation. The sinusoidal signal for which the frequency deviation measurement is required is transformed to a low voltage level and passed through a zero crossing detector to convert it into a pulse train. Another stable square wave signal of 10 KHz is obtained using a crystal oscillator and decade dividing assemblies (DDA). These signals are combined digitally and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded to make them equally suitable for both control applications and display units. The developed circuit using discrete components has a resolution of 0.5 Hz and completes measurement within 20 ms. The realized circuit is simulated and synthesized using Verilog HDL and subsequently implemented on FPGA. The results of measurement on FPGA are observed on a very high resolution logic analyzer. These results accurately match the simulation results as well as the results of same circuit implemented with discrete components. The proposed system is suitable for accurate measurement and control of power frequency deviation.Keywords: digital encoder for frequency measurement, frequency deviation measurement, measurement and control systems, power systems
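A short Python sketch of the measurement principle described above is shown below: positive-going zero crossings of the power signal define the gating periods, and pulses of the stable 10 kHz reference are counted over each period to recover the frequency. The simulation sample rate and the injected 0.5 Hz deviation are illustrative assumptions; the actual system is implemented in hardware and FPGA logic.

```python
import numpy as np

fs = 1_000_000                     # simulation sample rate (illustrative)
f_ref = 10_000                     # stable 10 kHz reference from crystal oscillator + DDA
f_power = 49.5                     # power signal with a -0.5 Hz deviation to be measured
nominal = 50.0

t = np.arange(int(0.2 * fs)) / fs
signal = np.sin(2 * np.pi * f_power * t)

# Zero-crossing detector: indices where the signal crosses zero going positive.
crossings = np.flatnonzero((signal[:-1] < 0) & (signal[1:] >= 0))

# Count 10 kHz reference pulses between consecutive positive-going crossings,
# i.e. gate the reference counter with the unknown signal's period.
periods_s = np.diff(crossings) / fs
ref_counts = periods_s * f_ref                 # reference pulses per signal period
f_measured = f_ref / ref_counts.mean()
print(f"measured frequency: {f_measured:.2f} Hz, deviation: {f_measured - nominal:+.2f} Hz")
```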
Procedia PDF Downloads 377