Search results for: data to action
25851 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. Then we establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
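As an illustration of the central technique, the sketch below fits quantile regressors to synthetic load telemetry; predicting an upper quantile protects the SLA while freeing the head-room that flat over-provisioning wastes. The data, features, and quantile level are our assumptions, not the authors' setup.

```python
# Illustrative sketch (not the authors' code): quantile regression for
# CPU-demand forecasting. Allocating to the 90th percentile over-estimates
# demand just enough to protect the SLA while avoiding flat over-provisioning.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
hours = np.arange(2000)
# Synthetic diurnal CPU load with noise, standing in for real telemetry.
load = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)
X = np.column_stack([hours % 24, hours % 168])      # hour-of-day, hour-of-week
y = load

q90 = GradientBoostingRegressor(loss="quantile", alpha=0.90).fit(X[:-24], y[:-24])
median = GradientBoostingRegressor(loss="quantile", alpha=0.50).fit(X[:-24], y[:-24])

next_day = X[-24:]
provision = q90.predict(next_day)    # allocate to the 90th percentile of demand
expected = median.predict(next_day)  # typical demand, for energy accounting
print("head-room kept for SLA:", np.round(provision - expected, 1))
```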
Procedia PDF Downloads 108
25850 Embodied Empowerment: A Design Framework for Augmenting Human Agency in Assistive Technologies
Authors: Melina Kopke, Jelle Van Dijk
Abstract:
Persons with cognitive disabilities, such as Autism Spectrum Disorder (ASD), are often dependent on some form of professional support. Recent transformations in Dutch healthcare have spurred institutions to apply new, empowering methods and tools to enable their clients to cope (more) independently in daily life. Assistive Technologies (ATs) seem promising as empowering tools. While ATs can, functionally speaking, help people to perform certain activities without human assistance, we hold that, from a design-theoretical perspective, such technologies often fail to empower in a deeper sense. Most technologies serve either to prescribe or to monitor users’ actions, which in some sense objectifies them rather than strengthening their agency. This paper proposes that theories of embodied interaction could help formulate a design vision in which interactive assistive devices augment, rather than replace, human agency and thereby add to a person’s empowerment in daily life settings. It aims to close the gap between empowerment theory and the opportunities provided by assistive technologies by showing how embodiment and empowerment theory can be applied in practice in the design of new, interactive assistive devices. Taking a Research-through-Design approach, we conducted a case study of designing to support independently living people with ASD in structuring daily activities. In three iterations we interlaced design action, active involvement and prototype evaluations with future end-users and healthcare professionals, and theoretical reflection. Our co-design sessions revealed that the issue of handling daily activities is multidimensional. Not having the ability to self-manage one’s daily life has immense consequences for one’s self-image, and also has major effects on the relationship with professional caregivers. Over the course of the project, relevant theoretical principles of both embodiment and empowerment theory, together with user insights, informed our design decisions. This resulted in a system of wireless light units that users can program as a reminder for tasks, but also to record and reflect on their actions. The iterative process helped to gradually refine and reframe our growing understanding of what it concretely means for a technology to empower a person in daily life. Drawing on the case study insights, we propose a set of concrete design principles that together form what we call the embodied empowerment design framework. The framework includes four main principles: enabling ‘reflection-in-action’; making information ‘publicly available’ in order to enable co-reflection and social coupling; enabling the implementation of shared reflections into an ‘endurable-external feedback loop’ embedded in the person’s familiar ‘lifeworld’; and nudging situated actions with self-created action-affordances. In essence, the framework aims for the self-development of a suitable routine, or ‘situated practice’, by building on a growing shared insight into what works for the person. The framework, we propose, may serve as a starting point for AT designers to create truly empowering interactive products. In a set of follow-up projects involving the participation of persons with ASD, Intellectual Disabilities, Dementia and Acquired Brain Injury, the framework will be applied, evaluated and further refined.
Keywords: assistive technology, design, embodiment, empowerment
Procedia PDF Downloads 278
25849 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed so widely that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text that is impressed onto paper, which provides evidence of its authenticity. Digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing a watermark in images. The main consideration for any watermarking scheme is its robustness to various attacks.
Keywords: watermarking, digital, DCT-DWT, security
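A minimal sketch of a DCT-DWT embedding in the spirit described (the abstract does not give the exact scheme): take a one-level DWT, apply a DCT to the low-frequency subband, and additively embed the watermark there. The strength factor `alpha` is a hypothetical tuning parameter.

```python
# Hybrid DWT-DCT watermark embedding sketch; host image and watermark are
# synthetic stand-ins, and the additive rule is one common choice, not
# necessarily the authors'.
import numpy as np
import pywt
from scipy.fft import dctn, idctn

image = np.random.rand(256, 256)            # stand-in for a host image
watermark = np.random.randint(0, 2, (128, 128)).astype(float)

LL, (LH, HL, HH) = pywt.dwt2(image, "haar") # 1-level DWT -> 128x128 subbands
coeffs = dctn(LL, norm="ortho")             # DCT of the LL subband
alpha = 0.05                                # strength: robustness vs. visibility
coeffs_marked = coeffs + alpha * watermark  # additive embedding

LL_marked = idctn(coeffs_marked, norm="ortho")
watermarked = pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")
print("max pixel distortion:", np.abs(watermarked - image).max())
```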
Procedia PDF Downloads 422
25848 Effects of the COVID-19 Pandemic in Japan on Japanese People’s and Expatriates’ Lifestyles
Authors: Noriyuki Suyama
Abstract:
This paper looked into consumer behavioral changes by analyzing data collected by ASMARKS Co., a research company in Japan. The purpose of the paper is to understand two sets of differences: before vs. after the COVID-19 pandemic, and Japanese people vs. expatriates living in Japan. Examining the analysis results helped obtain useful insights into new business models for business parties in Japan as a micro-level perspective. The paper also tried to explore future conditions of globalization by taking into consideration nations’ political and economic changes as a macro-level perspective. COVID-19 has been continuing its spread across the world, with more than 60 million confirmed cases in 190 countries. This pandemic, with its mandates restricting the scope of behavior, has disrupted consumers’ lifestyle habits. Consumers tend to learn new ways when they have trouble taking routine action. For example, when the government forces people to refrain from going out, they try to telecommute at home. Even if the situation comes back to normal, people may still change their lifestyles to fit in the best. Some of the data show typical effects of COVID-19 in the before-vs.-after comparison: forceful exposure to digitalized work-life styles; more flexible time at home; the importance of trustful and useful information gathering to distinguish what is good from what is bad; etc. In addition, the Japanese have changed their lifestyles less than expatriates living in Japan. For example, while 94% of the expatriates have cut back on going out because of self-quarantine, only 55% of the Japanese have done so. There are more differences in both comparisons in the analysis results. The economic downtrend resulting from COVID-19 is supposed to be at least as devastating, if not more so, than that of the financial crisis. With unemployment levels in the US taking two weeks to reach what took 6 months in the 2008 crisis, there is no doubt of a global recession some predict could reach 10% or above of GDP. As a result, globalization in the global supply chain of goods and services will suffer a negative impact. Many governmental financial and economic policies are expected to focus on their own countries’ profits and interests, excluding other countries’ interests, as was the case with the Recovery Act just after the global financial crisis of 2007 to 2008. Both the micro- and macro-level analyses reveal important connotations and managerial implications for business in Japan aimed at Japanese consumers, as well as for post-COVID-19 global business.
Keywords: COVID-19, lifestyle in Japan, expatriates, consumer behavior
Procedia PDF Downloads 140
25847 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture created using AWS services that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using terraform, cdk, cloudformation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
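To make the stage decomposition concrete, here is a schematic sketch (not the paper's AWS implementation) in which each stage is an isolated, replaceable step: scientists own only `train`, while data engineers own sourcing, vending, and orchestration. All names and the toy model are hypothetical.

```python
# Batch-pipeline decomposition sketch: sourcing -> features -> training -> vending.
def source_data():
    # stand-in for a data-lake / warehouse read
    return [(float(i), 2.0 * i) for i in range(100)]

def build_features(rows):
    X = [x for x, _ in rows]
    y = [t for _, t in rows]
    return X, y

def train(X, y):
    # the scientists' sole responsibility: model building and refinement
    slope = sum(x * t for x, t in zip(X, y)) / sum(x * x for x in X)
    return lambda x: slope * x

def vend(model, store):
    # write model output to a downstream data store
    store["predictions"] = [model(x) for x in range(5)]

store = {}
X, y = build_features(source_data())
vend(train(X, y), store)
print(store)
```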
Procedia PDF Downloads 64
25846 MIMO PID Controller of a Power Plant Boiler–Turbine Unit
Authors: N. Ben-Mahmoud, M. Elfandi, A. Shallof
Abstract:
This paper presents a methodology to design multivariable PID controllers for multi-input, multi-output (MIMO) systems. The proposed control strategy, which is centralized, combines PID controllers. The proportional gains in the P controllers act as SISO tuning parameters that modify the behavior of the loops almost independently. The design procedure consists of three steps: first, an ideal decoupler including integral action is determined. Second, the decoupler is approximated with PID controllers. Third, the proportional gains are tuned to achieve the specified performance. The proposed method is applied to representative processes.
Keywords: boiler turbine, MIMO, PID controller, control by decoupling, anti-windup techniques
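A minimal steady-state illustration of the decoupling idea (the paper's ideal decoupler also includes integral action): choose D so that G(0)·D is diagonal, letting each loop be tuned almost independently. The 2x2 gain matrix below is a hypothetical boiler-turbine example, not taken from the paper.

```python
import numpy as np

G0 = np.array([[2.0, 0.8],    # steady-state gains of the TITO process
               [0.5, 1.5]])

# Ideal static decoupler: D = G0^{-1} @ diag(G0), so that G0 @ D = diag(G0).
D = np.linalg.inv(G0) @ np.diag(np.diag(G0))
apparent = G0 @ D
print(np.round(apparent, 3))  # off-diagonal terms vanish -> two independent SISO loops
```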
Procedia PDF Downloads 328
25845 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively these data representations reduce the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
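An illustrative sketch of the experiment's core: the same SAD matching cost evaluated on two image-data representations (grayscale vs. RGB) to see which better pins down the correct disparity. The images here are synthetic stand-ins for a rectified stereo pair, and SAD is one common local cost, not necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
true_disp = 7
right = rng.random((64, 96, 3))
left = np.roll(right, true_disp, axis=1)      # left view = right shifted by true_disp

def sad_cost(l, r, row, col, d, win=3):
    # sum of absolute differences over a (2*win+1)^2 window at disparity d
    pl = l[row - win:row + win + 1, col - win:col + win + 1]
    pr = r[row - win:row + win + 1, col - d - win:col - d + win + 1]
    return np.abs(pl - pr).sum()

row, col = 32, 60
for name, (L_img, R_img) in {
    "rgb": (left, right),
    "gray": (left.mean(axis=2, keepdims=True), right.mean(axis=2, keepdims=True)),
}.items():
    costs = [sad_cost(L_img, R_img, row, col, d) for d in range(16)]
    print(name, "best disparity:", int(np.argmin(costs)))  # both recover 7 here
```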
Procedia PDF Downloads 370
25844 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within the contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of 3 Python scripts that could each be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body locations and neuronal contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion to allow computer vision to better distinguish between cells and non-cells. Its results were also comparable to manually analyzed results but with significantly reduced result acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline’s cell body and contour detection to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings from neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
Keywords: calcium imaging, computer vision, neural activity, neural networks
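A condensed sketch of the pipeline's core steps, as the Methods describe them (not the authors' full code): binary thresholding, contour detection with OpenCV, and mean-fluorescence extraction inside each contour. The frame below is synthetic.

```python
import cv2
import numpy as np

frame = np.zeros((128, 128), dtype=np.uint8)          # stand-in calcium image
cv2.circle(frame, (40, 40), 10, 180, -1)              # two bright "cell bodies"
cv2.circle(frame, (90, 80), 12, 220, -1)

# Step (1): binary thresholding, then contour detection.
_, binary = cv2.threshold(frame, 100, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for i, cnt in enumerate(contours):
    # Step (2): mean fluorescence within each detected contour.
    mask = np.zeros_like(frame)
    cv2.drawContours(mask, [cnt], -1, 255, thickness=cv2.FILLED)
    mean_fluorescence = cv2.mean(frame, mask=mask)[0]
    print(f"cell {i}: area={cv2.contourArea(cnt):.0f}, mean F={mean_fluorescence:.1f}")
```

Repeating step (2) across frames yields the fluorescence time series in which step (3) detects activity transients.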
Procedia PDF Downloads 82
25843 Formulation, Preparation, and Evaluation of Coated Desloratadine Oral Disintegrating Tablets
Authors: Mohamed A. Etman, Mona G. Abd-Elnasser, Mohamed A. Shams-Eldin, Aly H. Nada
Abstract:
Orally disintegrating tablets (ODTs) are gaining importance as new drug delivery systems and have emerged as one of the most popular and widely accepted dosage forms, especially for pediatric and geriatric patients. Their advantages, such as administration without water, anywhere and anytime, make them suitable for geriatric and pediatric patients. They are also suitable for the mentally ill, the bed-ridden and patients who do not have easy access to water. The benefits, in terms of patient compliance, rapid onset of action, increased bioavailability, and good stability, make these tablets popular as a dosage form of choice in the current market. These dosage forms dissolve or disintegrate in the oral cavity within a matter of seconds without the need for water or chewing. Desloratadine is a tricyclic antihistamine with a selective, peripheral H1-antagonist action. It is an antagonist at histamine H1 receptors and at all subtypes of the muscarinic acetylcholine receptor. Desloratadine is the major metabolite of loratadine. Twelve different placebo ODTs were prepared (F1-F12) using different functional excipients. They were evaluated for their compressibility, hardness and disintegration time. All formulations were non-sticky except four, namely F8, F9, F10 and F11. All formulations were compressible with the exception of F2. Variable disintegration times were found, ranging between 20 and 120 seconds. F12 showed the shortest disintegration time (20 seconds) without any sticking, which could be due to the use of a high percentage of superdisintegrants. Desloratadine showed a bitter taste when formulated as an ODT without any treatment. Therefore, different techniques were tried in order to mask its bitter taste. Using Eudragit EPO resulted in complete masking of the bitter taste of the drug and increased acceptability to volunteers. The compressible, non-sticky formulations (F1, F3, F4, F5, F6, F7 and F12) were subjected to further evaluation tests after the addition of coated desloratadine, including weight uniformity, wetting time, and friability testing. Fairly good weight uniformity values were observed in all the tested formulations, with F12 exhibiting the shortest wetting time (14.7 seconds) and consequently the shortest disintegration time (20 seconds). The dissolution profile showed that 100% desloratadine release was attained after only 2.5 minutes from the prepared ODT (F12), with a dissolution efficiency of 95%.
Keywords: desloratadine, orally disintegrating tablets (ODTs), formulations, taste masking
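For reference, the reported dissolution efficiency is conventionally computed as the area under the dissolution curve expressed as a percentage of the 100%-release rectangle; the form below is one common convention, since the abstract does not state the exact definition used.

```latex
% Dissolution efficiency (DE) up to time t, where y(\tau) is the percent drug
% released at time \tau and y_{100} denotes 100% release (a common convention;
% the study's exact definition is not given in the abstract).
\[
\mathrm{DE}(t) \;=\; \frac{\int_{0}^{t} y(\tau)\,\mathrm{d}\tau}{y_{100}\; t} \times 100\%
\]
```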
Procedia PDF Downloads 454
25842 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. The increasingly decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 432
25841 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore’s law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing/noise data. When we apply this data-mining tool in real applications, the running speed is important. The software employs table look-up techniques in the programming to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in one industry chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
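An illustrative sketch of the extraction idea (report formats vary by EDA tool, so the line layout below is hypothetical): pull violating timing paths out of a report with dictionary-based records and render them as an HTML table for fast browsing across design iterations.

```python
# Hypothetical timing-report lines; a real tool's format will differ.
report_lines = [
    "path clk->regA slack -0.12 ns",
    "path regA->regB slack 0.35 ns",
    "path regB->out slack -0.02 ns",
]

rows = []
for line in report_lines:
    tok = line.split()
    record = {"path": tok[1], "slack_ns": float(tok[3])}
    if record["slack_ns"] < 0:          # keep only violating (critical) paths
        rows.append(record)

# Emit an HTML table, worst slack first, for quick review in a browser.
html = ["<table>", "<tr><th>Path</th><th>Slack (ns)</th></tr>"]
for r in sorted(rows, key=lambda r: r["slack_ns"]):
    html.append(f"<tr><td>{r['path']}</td><td>{r['slack_ns']:.2f}</td></tr>")
html.append("</table>")
print("\n".join(html))
```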
Procedia PDF Downloads 254
25840 Integrated Imaging Management System: An Approach in the Collaborative Coastal Resource Management of Bagac, Bataan
Authors: Aljon Pangan
Abstract:
The Philippines, being an archipelagic country, is surrounded by coastlines (36,289 km), coastal waters (226,000 km²), oceanic waters (1.93 million km²) and territorial waters (2.2 million km²). Studies show that Philippine coastal ecosystems are the most productive and biologically diverse in the world; however, they are plagued by degradation problems due to over-exploitation and illegal activities. The existence of coastal degradation issues in the country led to the emergence of Coastal Resource Management (CRM) as an approach for both national and local government in providing solutions for sustainable coastal resource utilization. CRM applies the ideas of planning, implementing and monitoring through the lens of collaborative governance. It utilizes collective action and decision-making to achieve sustainable use of coastal resources. The Municipality of Bagac in Bataan is one of the coastal municipalities in the country that crafts its own CRM Program as a solution to coastal resource degradation and problems. Information and Communications Technology (ICT), particularly an Integrated Imaging Management System (IIMS), is one approach that can be applied in the formula of collaborative governance, which entails the Government, the Private Sector, and Civil Society. IIMS can help policymakers, managers, and citizens manage coastal resources through analyzed spatial data describing the physical, biological, and socioeconomic characteristics of coastal areas. Moreover, this study will apply a qualitative approach in deciphering the possible impacts of the application of IIMS in the Coastal Resource Management policymaking and implementation of the Municipality of Bagac.
Keywords: coastal resource management, collaborative governance, integrated imaging management system, information and communication technology
Procedia PDF Downloads 397
25839 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as 'fitness for use', and it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL) by introducing an e-health solution. The NHSL is the premier trauma care centre in Sri Lanka. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved. The most significant improvements were noticed in the accuracy and timeliness dimensions.
Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 275
25838 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing event data on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for processing these data. We analyze the involvement and relationship of the different variables for each parameter of the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
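A small pandas sketch of the per-maneuver feature extraction described (the study itself used R). Column names follow the HighD track files, e.g. laneId/precedingId/dhw/thw/xVelocity; treat them as assumptions if your export differs, and the vehicle id is hypothetical.

```python
import pandas as pd

tracks = pd.read_csv("01_tracks.csv")           # one HighD recording (assumed path)

ego = tracks[tracks["id"] == 42].sort_values("frame")
# A lane change shows up as a change of laneId between consecutive frames.
lane_changes = ego[ego["laneId"].diff().fillna(0) != 0]

for _, row in lane_changes.iterrows():
    lead = tracks[(tracks["id"] == row["precedingId"]) &
                  (tracks["frame"] == row["frame"])]
    if not lead.empty:
        print("frame", int(row["frame"]),
              "distance dhw =", row["dhw"],     # distance headway to the leader
              "time gap thw =", row["thw"],     # time gap to the leader
              "speed diff =", row["xVelocity"] - lead["xVelocity"].iloc[0])
```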
Procedia PDF Downloads 261
25837 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and dose distribution. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with the measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian’s algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semi-flex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate the accuracy of the golden beam data for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most 2%. In PDDs, the deviation increases more at deeper depths than at shallower depths. Similarly, profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
Keywords: percent depth dose, flatness, symmetry, golden beam data
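A minimal sketch of the comparison metric used: point-by-point percentage deviation between measured and golden-beam PDD curves on a common depth grid, checked against the 2% tolerance the study reports. The arrays below are illustrative placeholders, not the study's data.

```python
import numpy as np

depth = np.linspace(0, 30, 61)                    # depth in cm
golden = 100 * np.exp(-0.045 * depth)             # stand-in golden-beam PDD
measured = golden * (1 + 0.004 * depth / 30 * 5)  # small depth-dependent drift

deviation = 100 * (measured - golden) / golden    # percent deviation per point
print("max |deviation| = %.2f %%" % np.abs(deviation).max())
assert np.abs(deviation).max() <= 2.0, "outside the 2% commissioning tolerance"
```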
Procedia PDF Downloads 489
25836 Variable-Fidelity Surrogate Modelling with Kriging
Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans
Abstract:
Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, which are enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy.
Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients
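For orientation, the recursive CoKriging coupling referred to is usually written in the autoregressive form below (a standard formulation; the paper's exact notation may differ).

```latex
% Autoregressive multi-fidelity structure: the high-fidelity model is a scaled
% low-fidelity model plus an independent Gaussian-process discrepancy term.
\[
Z_{\mathrm{hi}}(x) \;=\; \rho\, Z_{\mathrm{lo}}(x) \;+\; Z_{d}(x)
\]
% Z_lo: (gradient-enhanced) Kriging model of the cheap, low-fidelity data;
% Z_d:  independent discrepancy process; rho: regression scaling factor
% estimated from the data.
```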
Procedia PDF Downloads 558
25835 Robust Barcode Detection with Synthetic-to-Real Data Augmentation
Authors: Xiaoyan Dai, Hsieh Yisan
Abstract:
Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications.
Keywords: barcode detection, data augmentation, deep learning, image-based processing
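A sketch of synthetic-to-real augmentation in the spirit described (the paper's exact transform set is not given): perspective distortion, blur, and sensor noise applied to a clean synthetic barcode image with OpenCV.

```python
import cv2
import numpy as np

rng = np.random.default_rng(2)
barcode = np.full((120, 240), 255, np.uint8)
barcode[:, ::7] = 0                                   # crude stand-in barcode stripes

# Perspective jitter approximates oblique shooting angles.
h, w = barcode.shape
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
dst = src + rng.uniform(-12, 12, src.shape).astype(np.float32)
M = cv2.getPerspectiveTransform(src, dst)
aug = cv2.warpPerspective(barcode, M, (w, h), borderValue=255)

# Blur and additive noise approximate defocus and sensor conditions.
aug = cv2.GaussianBlur(aug, (5, 5), sigmaX=1.5)
noise = rng.normal(0, 8, aug.shape)
aug = np.clip(aug.astype(float) + noise, 0, 255).astype(np.uint8)
cv2.imwrite("augmented_barcode.png", aug)
```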
Procedia PDF Downloads 169
25834 Public Perception of Energy Security in Lithuania: Between Material Interest and Energy Independence
Authors: Dainius Genys, Vylius Leonavicius, Ricardas Krikstolaitis
Abstract:
Energy security problems in Lithuania are analyzed on a regular basis; however, there is no comprehensive research on the concept of public energy security itself. There is a lack of attention not only to the social determinants of the perception of energy security but also to a deeper analysis of public opinion. This article aims to research the Lithuanian public's perception of energy security. Complex tasks were set during the sociological study. The survey questionnaire consisted of different sets of questions: views of energy security (risk perception, political orientation, and energy security; comprehensiveness and energy security); views of energy risks and threats (perception of energy safety factors; individual dependence and burden; disobedience and risk); views of the activity of responsible institutions (energy policy assessment; confidence in institutions and energy security); and demographic issues. In this article, we will focus on two aspects: a) we will analyze public opinion on the most important aspects of energy security and the social factors influencing them, under the hypothesis that public perception of energy security is related to value orientations; and b) we will analyze how public opinion on energy policy executed by the government and confidence in the government are intertwined with the concept of energy security. Data from the survey, conducted on May 10-19 and June 7-17, 2013, when the Seimas and the government consisted of the coalition dominated by the Social Democrats together with the Labor, Order and Justice Parties and the Electoral Action of Poles, were used in this article. It is important to note that the survey was conducted prior to Russia’s occupation of Crimea.
Keywords: energy security, public opinion, risk, energy threat, energy security policy
Procedia PDF Downloads 510
25833 New Knowledge Co-Creation in Mobile Learning: A Classroom Action Research with Multiple Case Studies Using Mobile Instant Messaging
Authors: Genevieve Lim, Arthur Shelley, Dongcheol Heo
Abstract:
Mobile technologies can enhance the learning process as they enable social engagement around concepts beyond the classroom and the curriculum. Early results in this ongoing research are showing that when learning interventions are designed specifically to generate new insights, mobile devices support regulated learning and encourage learners to collaborate, socialize and co-create new knowledge. As students navigate across space and time boundaries, the fundamentally social nature of learning transforms into mobile computer-supported collaborative learning (mCSCL). The metacognitive interaction in mCSCL via mobile applications reflects the regulation of learning among students. These metacognitive experiences, whether self-, co- or shared-regulated, are significant to the learning outcomes. Despite some insightful empirical studies, there has not yet been significant research that investigates the actual practice and processes of new knowledge co-creation. This leads to the question of whether mobile learning provides a new channel to leverage learning. Alternatively, does mobile interaction create new types of learning experiences, and how do these experiences co-create new knowledge? The purpose of this research is to explore these questions and seek evidence to support one or the other. This paper addresses these questions from the students’ perspective to understand how students interact when constructing knowledge in mCSCL and how students’ self-regulated learning (SRL) strategies support the co-creation of new knowledge in mCSCL. A pilot study has been conducted among international undergraduates to understand students’ perspectives on mobile learning and concurrently develop a definition in an appropriate context. Using classroom action research (CAR) with multiple case studies, this study is being carried out in a private university in Thailand to narrow the research gaps in mCSCL and SRL. The findings will allow teachers to see the importance of social interaction for meaningful student engagement, to envisage learning outcomes from a knowledge management perspective, and to see what role mobile devices can play in these. The findings will provide important indicators for academics to rethink what is to be learned and how it should be learned. Ultimately, the study will bring new light to the co-creation of new knowledge in a social, interactive learning environment and challenge teachers to embrace the 21st century of learning with mobile technologies to deepen and extend learning opportunities.
Keywords: mobile computer supported collaborative learning, mobile instant messaging, mobile learning, new knowledge co-creation, self-regulated learning
Procedia PDF Downloads 232
25832 Analysis of Delivery of Quad Play Services
Authors: Rahul Malhotra, Anurag Sharma
Abstract:
Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper is targeted to show the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as we increase the data rate, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: FTTH, quad play, play service, access networks, data rate
Procedia PDF Downloads 416
25831 Feedback in the Language Class: An Action Research Process
Authors: Arash Golzari Koloor
Abstract:
Feedback seems to be an inseparable part of teaching a second/foreign language. One type of feedback is corrective feedback, which is one form of error treatment in second language classrooms. This study is a report on the types of corrective feedback employed in an IELTS preparation course. The types of feedback, their frequencies, and their effectiveness are enlisted, enumerated, and interpreted. The results showed that explicit correction and recast were the most frequent types of feedback, while repetition and elicitation were the least frequent. The results also revealed that metalinguistic feedback, elicitation, and explicit correction were the most effective types of feedback and greatly affected learners’ performance.
Keywords: classroom interaction, corrective feedback, error treatment, oral performance
Procedia PDF Downloads 334
25830 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analyzing the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come at a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at the various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification for taking manufacturing digital data from various sources and determining the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision-making on which datasets should be processed at the ‘edge’ and which should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
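An illustrative fast-and-frugal decision sketch for edge-vs-cloud placement (the paper's actual cues and thresholds are not published here; the attributes and cut-offs below are hypothetical): cues are checked one at a time, and the first one that fires decides, which is what makes the heuristic fast and frugal.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    latency_critical: bool      # needed for real-time machine control?
    size_mb: float              # proxy for transmission cost
    retention_required: bool    # must be archived for audit / maintenance history?

def place(d: Dataset) -> str:
    if d.latency_critical:      # cue 1: deadlines beat everything
        return "edge"
    if d.size_mb > 500:         # cue 2: too costly to ship raw
        return "edge (pre-process, send summary)"
    if d.retention_required:    # cue 3: long-term storage lives in the cloud
        return "cloud"
    return "cloud"              # default: centralized analytics

print(place(Dataset(True, 10, False)))    # -> edge
print(place(Dataset(False, 900, True)))   # -> edge (pre-process, send summary)
print(place(Dataset(False, 5, True)))     # -> cloud
```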
Procedia PDF Downloads 182
25829 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
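A compact sketch of the two denoising stages described: stacking/averaging of repeated transients to raise SNR, then soft-threshold wavelet shrinkage to suppress what remains. The wavelet family and threshold rule below are our assumptions, not the study's stated choices.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.linspace(0.01, 1.0, 512)
clean = np.exp(-5 * t)                                 # idealized TEM decay
stack = clean + rng.normal(0, 0.05, (64, t.size))      # 64 noisy repetitions

averaged = stack.mean(axis=0)                          # stage 1: signal averaging

coeffs = pywt.wavedec(averaged, "db4", level=4)        # stage 2: wavelet shrinkage
sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # MAD noise estimate
thr = sigma * np.sqrt(2 * np.log(t.size))              # universal threshold
denoised = pywt.waverec(
    [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]], "db4")

print("rms error raw: %.4f  denoised: %.4f"
      % (np.sqrt(np.mean((stack[0] - clean) ** 2)),
         np.sqrt(np.mean((denoised[:t.size] - clean) ** 2))))
```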
Procedia PDF Downloads 85
25828 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization
Authors: Hironori Karachi, Haruka Yamashita
Abstract:
Recently, quick response (QR) code payment systems have been getting popular. Many companies introduce new QR code payment services, and the services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in demographic information, usage information, and user value between services. In this study, we conduct an analysis of real-world data provided by Nomura Research Institute, including the demographic data of users and usage information for two services: LINE Pay and PayPay. For analyzing such data and interpreting their features, Non-negative Matrix Factorization (NMF) is widely used; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and understand the features of the given data presented in matrix form. Moreover, for comparing the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the difference in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data. As the interpretation, we show the differences in user features between LINE Pay and PayPay.
Keywords: data science, non-negative matrix factorization, missing data, quality of services
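A bare-bones sketch of NMF with missing entries in the EM spirit the abstract describes: multiplicative updates weighted by an observation mask, so unknown cells do not drive the factorization, and the reconstruction fills the gaps. This is a generic weighted-NMF illustration, not the authors' EMNMF/DNMF code.

```python
import numpy as np

rng = np.random.default_rng(4)
V = rng.random((30, 8))                       # users x attributes (demographics, usage)
M = (rng.random(V.shape) > 0.2).astype(float) # 1 = observed, 0 = missing

k = 3
W, H = rng.random((30, k)), rng.random((k, 8))
eps = 1e-9
for _ in range(300):
    WH = W @ H
    W *= ((M * V) @ H.T) / ((M * WH) @ H.T + eps)   # fit only the observed cells
    WH = W @ H
    H *= (W.T @ (M * V)) / (W.T @ (M * WH) + eps)

completed = np.where(M == 1, V, W @ H)        # impute missing entries from the model
print("fit on observed cells: %.4f"
      % np.sqrt((M * (V - W @ H) ** 2).sum() / M.sum()))
```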
Procedia PDF Downloads 131
25827 Using an Empathy Intervention Model to Enhance Empathy and Socially Shared Regulation in Youth with Autism Spectrum Disorder
Authors: Yu-Chi Chou
Abstract:
The purpose of this study was to establish a logical path for an instructional model of empathy and social regulation, providing feasibility evidence on the model's implementation with students with autism spectrum disorder (ASD). The newly developed Emotional Bug-Out Bag (BoB) curriculum was designed to enhance the empathy and socially shared regulation of students with ASD. The BoB model encompasses three instructional phases: basic theory lessons (BTL), action plan practices (APP), and final theory practices (FTP). In addition, a learning flow (teacher-directed instruction, student self-directed problem-solving, group-based task completion, group-based reflection) was infused into the progression of the instructional phases to deliberately promote the social regulatory process in group-working activities. A total of 23 junior high school students with ASD received the BoB curriculum. To examine the logical path of the model's implementation, data were collected from the participating students’ self-reported scores on the learning nodes and understanding questions. Path analysis using structural equation modeling (SEM) was utilized to analyze the scores on 10 learning nodes and 41 understanding questions across the three phases of the BoB model. Results showed that (a) all participants progressed throughout the implementation of the BoB model, and (b) the models of learning nodes and phases were positive and significant as expected, confirming the hypothesized logical path of the curriculum.
Keywords: autism spectrum disorder, empathy, regulation, socially shared regulation
Procedia PDF Downloads 66
25826 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies
Authors: Margaret S. Wright
Abstract:
Background/Significance: During many recent public health emergencies/disasters, public health nursing data have been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, guided by the concepts outlined in 'Disaster Standards of Care' models, leads to the development of recommendations for a model of best practices in data management and use in public health disasters/emergencies by public health nurses. As the 'patient' in public health disasters/emergencies is the community (local, regional or national), guidelines for patient documentation are incorporated in the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.
Keywords: data management, decision making, disaster planning documentation, public health nursing
Procedia PDF Downloads 222
25825 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning
Authors: Ali Kazemi
Abstract:
The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated vulnerabilities to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data we used in this study included: the monetary value of each transaction, a crucial feature, as fraudulent transactions may have different amount distributions than legitimate ones; timestamps indicating when transactions occurred, since analyzing transactions' temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant where the transaction occurred, such as retail, groceries, online services, etc., since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), since different payment methods have varying levels of fraud risk. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies. The autoencoder model leverages its reconstruction-error mechanism to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies based on their susceptibility to isolation from the dataset's majority. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reduce the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis
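A distilled sketch of the unsupervised detector and the evaluation metrics named in the abstract, using an isolation forest on synthetic transactions (amount and hour-of-day); the autoencoder branch would replace `scores` with reconstruction errors. The data and decision threshold are illustrative, not the study's.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.metrics import f1_score, precision_score, recall_score, roc_auc_score

rng = np.random.default_rng(5)
normal = np.column_stack([rng.lognormal(3, 0.5, 5000),    # typical amounts
                          rng.normal(14, 3, 5000) % 24])  # daytime activity
fraud = np.column_stack([rng.lognormal(6, 0.8, 50),       # large transfers
                         rng.normal(3, 1, 50) % 24])      # middle of the night
X = np.vstack([normal, fraud])
y = np.r_[np.zeros(5000), np.ones(50)]

iso = IsolationForest(contamination=0.01, random_state=0).fit(X)
scores = -iso.score_samples(X)                 # higher = more anomalous
pred = (scores > np.quantile(scores, 0.99)).astype(int)

print("precision %.2f recall %.2f F1 %.2f ROC AUC %.2f" % (
    precision_score(y, pred), recall_score(y, pred),
    f1_score(y, pred), roc_auc_score(y, scores)))
```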
Procedia PDF Downloads 57
25824 Experimental Study of Iron Metal Powder Compacting by Controlled Impact
Authors: Todor N. Penchev, Dimitar N. Karastoianov, Stanislav D. Gyoshev
Abstract:
Hydraulic presses and high-velocity hammers are used for compacting iron powder. This paper presents initial research on the application of an innovative powder compacting method that uses a hammer working with controlled impact. The results show that this method reduces rebounds and improves impact efficiency compared with high-speed compacting. Depending on the power of the engine (an industrial rocket engine), this effect may be amplified to such an extent as to obtain an impact without rebound (a sticking impact) and a long-duration action of the impact force.
Keywords: powder metallurgy, impact, iron powder compacting, rocket engine
Procedia PDF Downloads 521
25823 An Embarrassingly Simple Semi-supervised Approach to Increase Recall in Online Shopping Domain to Match Structured Data with Unstructured Data
Authors: Sachin Nagargoje
Abstract:
Complete labeled data are often difficult to obtain in a practical scenario. Even if one manages to obtain the data, their quality is always in question. In the shopping vertical, offers are the input data, provided by advertisers with or without good-quality information. In this paper, the author investigated the possibility of using a very simple semi-supervised learning approach to increase the recall of unhealthy offers (those with badly written offer titles or partial product details) in the shopping vertical domain. The author found that the semi-supervised learning method improved recall in the Smart Phone category by 30% in A/B testing on 10% of traffic and increased the year-over-year (YoY) number of impressions per month by 33% in production. This also produced a significant increase in revenue, which cannot be publicly disclosed.
Keywords: semi-supervised learning, clustering, recall, coverage
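One concrete reading of an "embarrassingly simple" semi-supervised recipe (the paper does not publish its exact method): self-training, where a base classifier's confident predictions on unlabeled offers become pseudo-labels, typically raising recall on the scarce "unhealthy offer" class. Features and labels below are synthetic stand-ins.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 10))               # stand-in offer features (text stats etc.)
y_true = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)   # 1 = unhealthy offer

y = y_true.copy()
y[200:] = -1                                  # only 200 labeled; -1 marks unlabeled

base = LogisticRegression(max_iter=1000)
model = SelfTrainingClassifier(base, threshold=0.9).fit(X, y)

pred = model.predict(X[200:])
recall = (pred[y_true[200:] == 1] == 1).mean()
print("recall on unlabeled pool: %.2f" % recall)
```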
Procedia PDF Downloads 122
25822 Idiocentrism to Innovative Action: Multi-Level Perspective on Moderating Effects of Emotional Self-Regulation, Trust and CSR
Authors: Shuhong Wang, Xiang Yi
Abstract:
Through a survey of approximately 340 employees from four Chinese companies and employing cross-level analysis, this study reveals that certain cultural syndromes may exert both direct and indirect influences on innovative behaviors. Notably, individuals with a strong individualistic self-concept are more inclined towards innovative actions than their less individualistic counterparts. This research also identifies several moderating factors. For instance, trust amplifies the positive relationship between individualism and innovative actions, particularly at higher trust levels. The paper concludes by highlighting its theoretical contributions and practical implications and suggesting directions for future research.
Keywords: innovation, self-determination theory, trust, team dynamics, allocentrism
Procedia PDF Downloads 58