Search results for: continuous data
25144 Applying Semi-Automatic Digital Aerial Survey Technology and Canopy Characters Classification for Surface Vegetation Interpretation of Archaeological Sites
Authors: Yung-Chung Chuang
Abstract:
The cultural layers of archaeological sites are mainly affected by surface land use, land cover, and the root systems of surface vegetation. For this reason, continuous monitoring of land use and land cover change is important for the protection and management of archaeological sites. However, in actual operation, on-site investigation and orthogonal photograph interpretation require a lot of time and manpower. It is therefore necessary to provide a good alternative for surface vegetation surveys in an automated or semi-automated manner. In this study, we applied semi-automatic digital aerial survey technology and canopy character classification with very high-resolution aerial photographs for surface vegetation interpretation of archaeological sites. The main idea is that different landscape or forest types can easily be distinguished by canopy characters (e.g., specific texture distributions, shadow effects, and gap characters) extracted by semi-automatic image classification. A novel methodology to classify the shape of canopy characters using landscape indices and multivariate statistics is also proposed. Non-hierarchical cluster analysis was used to assess the optimal number of canopy character clusters, and canonical discriminant analysis was used to generate the discriminant functions for canopy character classification (seven categories). Therefore, one can easily predict the forest type and vegetation land cover from the corresponding canopy character category. The results showed that the semi-automatic classification could effectively extract the canopy characters of forest and vegetation land cover. As for forest type and vegetation type prediction, the average prediction accuracy reached 80.3%~91.7% with different sizes of test frame.
These results indicate that the technology is useful for archaeological site surveys and can improve classification efficiency and the data update rate.
Keywords: digital aerial survey, canopy characters classification, archaeological sites, multivariate statistics
Procedia PDF Downloads 142
25143 Family Management, Relations Risk and Protective Factors for Adolescent Substance Abuse in South Africa
Authors: Beatrice Wamuyu Muchiri, Monika M. L. Dos Santos
Abstract:
An increasingly recognised prevention approach for substance use entails the reduction of risk factors and the enhancement of promotive or protective factors in individuals and the environment surrounding them during their growth and development. However, in order to enhance the effectiveness of this approach, continuous study of risk aspects targeting different cultures, social groups, and segments of society has been recommended. This study evaluated the impact of potential risk and protective factors associated with family management and relations on adolescent substance abuse in South Africa. Exploratory analysis and cumulative odds ordinal logistic regression modelling of adolescent substance use were performed on the data while controlling for demographic and socio-economic characteristics. The most intensely used substances were tobacco, cannabis, cocaine, heroin, and alcohol, in decreasing order of use intensity. The specific protective or risk impact of family management or relations factors varied from substance to substance. Risk factors associated with demographic and socio-economic factors included being male, younger age, being in lower education grades, coloured ethnicity, having divorced parents, and having unemployed or fully employed mothers. Significant family relations risk and protective factors against substance use were classified as either family functioning and conflict or family bonding and support. Several family management factors, categorised as parental monitoring, discipline, behavioural control, and rewards, demonstrated either a risk or protective effect on adolescent substance use. Some factors had an interactive risk or protective impact on substance use, or lost significance when analysed jointly with other factors such as the controlled variables. Interactions amongst risk or protective factors, as well as the type of substance, should be considered when designing interventions based on these factors.
Studies in other geographical regions and institutions, and with better gender balance, are recommended to improve the representativeness of the results. Several other considerations to be made when formulating interventions, the shortcomings of this study, possible improvements, and future studies are also suggested.
Keywords: risk factors, protective factors, substance use, adolescents
Procedia PDF Downloads 204
25142 Spatially Random Sampling for Retail Food Risk Factors Study
Authors: Guilan Huang
Abstract:
In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants for tracking changes in the occurrence of foodborne illness risk factors. This paper discussed how we customized the spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location. Location information of restaurants provides an opportunity for quantitatively determining random sampling within non-government units (e.g., 240 kilometers around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytic and processing platform helped us handle the spatial random sampling challenges. Our method supports FDA's ability to pinpoint features of foodservice establishments, and reduced both the time and expense of data collection.
Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling
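The core of the scheme described above, random selection of establishments within a fixed radius of each data collector, can be sketched in pure Python. The record fields, the great-circle distance, and the fixed-seed draw below are illustrative assumptions, not the FDA's actual procedure:

```python
import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sample_within_radius(restaurants, collector, radius_km, k, seed=0):
    """Randomly sample k restaurants within radius_km of a data collector."""
    eligible = [rest for rest in restaurants
                if haversine_km(rest["lat"], rest["lon"],
                                collector["lat"], collector["lon"]) <= radius_km]
    rng = random.Random(seed)  # fixed seed makes the draw reproducible/auditable
    return rng.sample(eligible, min(k, len(eligible)))
```

A fixed seed is worth noting for a regulatory study: it makes the random draw reproducible for audit while remaining statistically random.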
Procedia PDF Downloads 350
25141 Automatic MC/DC Test Data Generation from Software Module Description
Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau
Abstract:
Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that is highly recommended or required for safety-critical software coverage. Therefore, many testing standards include this criterion and require it to be satisfied at a particular level of testing (e.g., validation and unit levels). However, a significant amount of time is needed to meet those requirements. In this paper, we propose to automate MC/DC test data generation. Thus, we present an approach to automatically generate MC/DC test data from a software module description written in a dedicated language. We introduce a new merging approach that provides high MC/DC coverage for the description with only a small number of test cases.
Keywords: domain-specific language, MC/DC, test data generation, safety-critical software coverage
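To make the MC/DC criterion concrete: for each condition in a decision, MC/DC requires a pair of test cases that differ only in that condition and flip the decision outcome. A brute-force sketch of finding such independence pairs is shown below; this is a generic illustration of the criterion itself, not the paper's merging approach, which operates on a dedicated module-description language:

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition index, find one pair of test vectors that differ
    only in that condition yet flip the decision outcome (MC/DC independence)."""
    rows = list(product([False, True], repeat=n_conditions))
    pairs = {}
    for c in range(n_conditions):
        for row in rows:
            flipped = list(row)
            flipped[c] = not flipped[c]
            flipped = tuple(flipped)
            if decision(*row) != decision(*flipped):
                pairs[c] = (row, flipped)
                break  # one independence pair per condition is enough
    return pairs

# Decision 'a and (b or c)': each condition must be shown to act independently.
pairs = mcdc_pairs(lambda a, b, c: a and (b or c), 3)
```

Exhaustive enumeration is exponential in the number of conditions, which is precisely why generating a small MC/DC test set automatically, as the paper proposes, matters for safety-critical code.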
Procedia PDF Downloads 441
25140 Blockchain-Based Approach on Security Enhancement of Distributed System in Healthcare Sector
Authors: Loong Qing Zhe, Foo Jing Heng
Abstract:
A variety of data files are now available on the internet due to the advancement of technology across the globe today. As more and more data are being uploaded on the internet, people are becoming more concerned that their private data, particularly medical health records, are being compromised and sold to others for money. Hence, the accessibility and confidentiality of patients' medical records have to be protected through electronic means. Blockchain technology is introduced to offer patients security against adversaries or unauthorised parties. In the blockchain network, only authorised personnel or organisations that have been validated as nodes may share information and data. For any change within the network, including adding a new block or modifying existing information about the block, a majority of two-thirds of the vote is required to confirm its legitimacy. Additionally, a consortium permission blockchain will connect all the entities within the same community. Consequently, all medical data in the network can be safely shared with all authorised entities. Also, synchronization can be performed within the cloud since the data is real-time. This paper discusses an efficient method for storing and sharing electronic health records (EHRs). It also examines the framework of roles within the blockchain and proposes a new approach to maintain EHRs with keyword indexes to search for patients' medical records while ensuring data privacy.
Keywords: healthcare sectors, distributed system, blockchain, electronic health records (EHR)
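The confirmation rule described above, where a change is legitimate only with at least a two-thirds majority of validated nodes, can be sketched as a single check. The function name and framing are an illustrative reading of the abstract, not a specific blockchain implementation:

```python
def change_confirmed(votes_for: int, total_nodes: int) -> bool:
    """A proposed block addition or modification is accepted only when
    at least two-thirds of the validated nodes vote for it."""
    if total_nodes <= 0:
        raise ValueError("network must contain at least one validated node")
    # Integer comparison avoids floating-point rounding at the exact boundary.
    return 3 * votes_for >= 2 * total_nodes
```

Using the integer form `3 * votes_for >= 2 * total_nodes` rather than `votes_for / total_nodes >= 2 / 3` keeps the boundary case (exactly two-thirds) deterministic.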
Procedia PDF Downloads 191
25139 Demographic Factors Influencing Employees’ Salary Expectations and Labor Turnover
Authors: M. Osipova
Abstract:
Thanks to the development of information technologies, every sphere of the economy is becoming more and more data-centric, as people generate huge datasets containing information on every aspect of their lives. Applying research on such data to human resources management allows getting scarce statistics on the state of the labor market, including salary expectations and potential employees' typical career behavior, and this information can become a reliable basis for management decisions. The following article presents the results of career behavior research based on freely accessible resume data. The information used for the study is much wider than what is usually used in human resources surveys. That is why there is enough data for statistically significant results, even for subgroup analysis.
Keywords: human resources management, salary expectations, statistics, turnover
Procedia PDF Downloads 349
25138 Effect of Supply Frequency on Pre-Breakdown and Breakdown Phenomena in Unbridged Vacuum Gaps
Authors: T.C. Balachandra, Habibuddin Shaik
Abstract:
This paper presents experimental results leading towards a better understanding of the pre-breakdown and breakdown behavior of vacuum gaps under variable-frequency alternating excitations. The frequency is varied in the range of 30 to 300 Hz in steps of 10 Hz for a fixed gap spacing of 0.5 mm. The results indicate that the pre-breakdown currents show an inverse relation with the breakdown voltage in general, though erratic behavior was observed over a certain range of frequencies. A breakdown voltage peak was observed at 130 Hz. This was pronounced when the electrode pair was of stainless steel and less pronounced when copper and aluminum electrodes were used. The experimental results are explained based on F-N emission, I-F emission, and thermal interaction due to a quasi-continuous shower of anode micro-particles. Further, it is speculated that the ostensible cause of the time delay between voltage and current peaks is the presence of neutral molecules in the gap.
Keywords: anode hot-spots, F-N emission, I-F emission, microparticle, neutral molecules, pre-breakdown conduction, vacuum breakdown
Procedia PDF Downloads 162
25137 Effect of Rehabilitative Nursing Program on Pain Intensity and Functional Status among Patients with Discectomy
Authors: Amal Shehata
Abstract:
Low back pain related to disc prolapse is localized in the lumbar area and may radiate to the lower extremities, starting from neurons near or around the spinal canal. Much of the population may be affected by disc prolapse within their lifetime, leading to lost productivity, disability, and loss of function. The purpose of the study was to examine the effect of a rehabilitative nursing program on pain intensity and functional status among patients with discectomy. Design: A quasi-experimental design was utilized. Setting: The study was carried out at the neurosurgery department and outpatient clinics of Menoufia University and Teaching hospitals in Menoufia governorate, Egypt. Instruments of the study: Five instruments were used for data collection: a structured interviewing questionnaire, a functional assessment instrument, an observational checklist, a numeric rating scale, and the Oswestry low back pain disability questionnaire. Results: There was an improvement in the mean total knowledge score about the disease process, discectomy, and the rehabilitation program in the study group (25.32%) compared with the control group (7.32%). There was a highly statistically significant improvement in lumbar flexibility in the study group (80%) compared with the control group (30%) after the rehabilitation program. There was also a decrease in pain score in the study group (58% no pain) compared with the control group (28% no pain) after the rehabilitation program, and an improvement in the total disability score of the study group (0%) regarding the effect of pain on activities of daily living, compared with the control group (16%). Conclusion: Application of a rehabilitative nursing program for patients with discectomy has proven to have a positive effect on knowledge score, pain reduction, activities of daily living, and functional abilities. Recommendation: A continuous rehabilitative nursing program should be carried out for all patients immediately after discectomy surgery on a regular basis.
A colored, illustrated booklet about the rehabilitation program should also be available and distributed to all patients before surgery.
Keywords: discectomy, rehabilitative nursing program, pain intensity, functional status
Procedia PDF Downloads 141
25136 Exploring Electroactive Polymers for Dynamic Data Physicalization
Authors: Joanna Dauner, Jan Friedrich, Linda Elsner, Kora Kimpel
Abstract:
Active materials such as Electroactive Polymers (EAPs) are promising for the development of novel shape-changing interfaces. This paper explores the potential of EAPs in a multilayer unimorph structure from a design perspective to investigate the visual qualities of the material for dynamic data visualization and data physicalization. We discuss various concepts of how the material can be used for this purpose. Multilayer unimorph EAPs are of particular interest to designers because they can be easily prototyped using everyday materials and tools. By changing the structure and geometry of the EAPs, their movement and behavior can be modified. We present the results of our preliminary user testing, where we evaluated different movement patterns. As a result, we introduce a prototype display built with EAPs for dynamic data physicalization. Finally, we discuss the potential and drawbacks and identify further open research questions for the design discipline.
Keywords: electroactive polymer, shape-changing interfaces, smart material interfaces, data physicalization
Procedia PDF Downloads 99
25135 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, thus negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, to recognize the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics
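The final step of the pipeline, using fixed meta-data to sanity-check noisy OCR output, can be sketched as follows. The field specification, the glyph-confusion fixes, and the spindle-temperature example are illustrative assumptions, not the authors' actual implementation:

```python
import re

def clean_ocr_value(raw, spec):
    """Validate a raw OCR string against meta-data for one HMI field:
    an expected unit glyph and a plausible numeric range."""
    # Common OCR confusions on segmented industrial displays.
    fixed = raw.replace("O", "0").replace("l", "1")
    if spec["unit"] not in fixed:
        return None  # unit glyph missing: likely a bad frame, re-capture
    match = re.search(r"-?\d+(?:\.\d+)?", fixed)
    if not match:
        return None  # nothing numeric recovered
    value = float(match.group())
    low, high = spec["range"]
    return value if low <= value <= high else None

# Hypothetical meta-data for a spindle-temperature readout.
temp_spec = {"unit": "C", "range": (0.0, 200.0)}
```

Readings rejected by the range check would be dropped or re-captured rather than fed to the downstream AI application, which is the accuracy-improvement role the abstract assigns to the meta-data.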
Procedia PDF Downloads 109
25134 Generating 3D Anisotropic Centroidal Voronoi Tessellations
Authors: Alexandre Marin, Alexandra Bac, Laurent Astart
Abstract:
New numerical methods for PDE resolution (such as Finite Volumes (FV) or Virtual Elements Method (VEM)) open new needs in terms of meshing of domains of interest, and in particular, polyhedral meshes have many advantages. One way to build such meshes consists of constructing Restricted Voronoi Diagrams (RVDs) whose boundaries respect the domain of interest. By minimizing a function defined for RVDs, the shapes of cells can be controlled, e.g., elongated according to user-defined directions or adjusted to comply with given aspect ratios (anisotropy) and density variations. In this paper, our contribution is threefold: First, we introduce a new gradient formula for the Voronoi tessellation energy under a continuous anisotropy field. Second, we describe a meshing algorithm based on the optimisation of this function that we validate against state-of-the-art approaches. Finally, we propose a hierarchical approach to speed up our meshing algorithm.
Keywords: anisotropic Voronoi diagrams, meshes for numerical simulations, optimisation, volumic polyhedral meshing
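For context, the energy minimized in centroidal Voronoi tessellation is commonly written as below. This is the standard anisotropic form, with a density $\rho(x)$ and a metric field $M(x)$ encoding direction and aspect-ratio control; the paper's exact energy and its new gradient formula may differ:

```latex
F\bigl(\{x_i\},\{V_i\}\bigr) \;=\; \sum_{i=1}^{n} \int_{V_i} \rho(x)\,(x - x_i)^{\top} M(x)\,(x - x_i)\,\mathrm{d}x
```

Setting $M(x) \equiv I$ recovers the classical isotropic CVT energy, whose minimizers place each site $x_i$ at the centroid of its cell $V_i$.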
Procedia PDF Downloads 116
25133 Impact of Digitized Monitoring & Evaluation System in Technical Vocational Education and Training
Authors: Abdul Ghani Rajput
Abstract:
The monitoring and evaluation (M&E) concept is adopted by Technical Vocational Education and Training (TVET) organizations to track progress over continuous intervals of time based on planned interventions and, subsequently, to evaluate it for impact, quality assurance, and sustainability. In a digital world, TVET providers prefer to have real-time information for monitoring training activities. Identifying the benefits and challenges of a digitized monitoring and evaluation real-time information system has not been sufficiently tackled to date. This research paper looks at the impact of digitized M&E in the TVET sector by analyzing two case studies and describes the benefits and challenges of using a digitized M&E system. Finally, digitized M&E is identified as a carrier of high potential for the TVET sector.
Keywords: digitized M&E, innovation, quality assurance, TVET
Procedia PDF Downloads 230
25132 Research and Implementation of Cross-domain Data Sharing System in Net-centric Environment
Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan
Abstract:
With the rapid development of network and communication technology, a great deal of data has been generated in different domains of a network. These data show a trend of increasing scale and more complex structure. Therefore, an effective and flexible cross-domain data-sharing system is needed. The Cross-domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides data exchange service through publish-subscribe technology that supports asynchronism and multi-to-multi communication, which adapts to the needs of the dynamic and large-scale distributed computing environment. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model various data attributes such as subject, object, permission, and environment, which effectively monitors the activities of users accessing resources and ensures that legitimate users get effective access rights within a legal time. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains in the process of interactive disclosure of digital certificates and access control policies, through trust policy management and negotiation algorithms, which provides an effective means for cross-domain trust relationship establishment and access control in a distributed environment. The CDSS's asynchronous, multi-to-multi, and loosely-coupled communication features can adapt well to data exchange and sharing in dynamic, distributed, and large-scale network environments. Next, we will give the CDSS new features to support the mobile computing environment.
Keywords: data sharing, cross-domain, data exchange, publish-subscribe
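An ABAC decision of the kind described, combining subject, object, permission, and environment attributes with a time window bounding legitimate access, can be sketched as follows. The attribute names and the single sample policy are illustrative assumptions, not the CDSS's actual policy model:

```python
from datetime import time

def abac_allow(subject, obj, action, env, policies):
    """Grant access only if some policy's attribute requirements are all met."""
    for policy in policies:
        if (subject.get("role") in policy["roles"]
                and obj.get("domain") in policy["domains"]
                and action in policy["actions"]
                and policy["start"] <= env["time"] <= policy["end"]):
            return True
    return False  # default deny

# One hypothetical policy: analysts may read data from domain 'sensor-net'
# during working hours only (the "legal time" constraint).
policies = [{"roles": {"analyst"}, "domains": {"sensor-net"},
             "actions": {"read"}, "start": time(8, 0), "end": time(18, 0)}]
```

Default-deny with explicit allow policies is the usual ABAC posture; the environment attribute (here, the time of day) is what distinguishes ABAC from plain role-based checks.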
Procedia PDF Downloads 124
25131 Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System
Authors: Zhou Mo, Dennis Chow
Abstract:
In the dynamic positioning system (DPS) for vessels, reliable information transmission between each node basically relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technology based on a sleep scheduling mechanism and remaining energy in the network layer is proposed, which applies the sleep scheduling mechanism to the routing protocols, considering the remaining energy of nodes and location information when selecting the cluster head. The problem of uneven distribution of nodes in each cluster is solved by an equilibrium mechanism. At the same time, a classified forwarding mechanism as well as a redelivery policy are adopted to avoid congestion in the transmission of huge amounts of data, reduce the delay in data delivery, and enhance real-time response. In this paper, a simulation test is conducted to improve the routing protocols, which turns out to reduce the energy consumption of nodes and increase the efficiency of data delivery.
Keywords: DPS for vessel, wireless sensor network, data fusion, routing protocols
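Cluster-head selection weighing remaining energy against location, as described above, can be sketched as a scored choice over awake nodes. The weights, node fields, and distance-to-sink criterion are illustrative assumptions rather than the protocol's actual formula:

```python
import math

def pick_cluster_head(nodes, sink, w_energy=0.7, w_dist=0.3):
    """Choose the awake node with the best trade-off between remaining
    energy (higher is better) and distance to the sink (lower is better)."""
    def score(node):
        dist = math.hypot(node["x"] - sink["x"], node["y"] - sink["y"])
        return w_energy * node["energy"] - w_dist * dist
    # Nodes currently in their sleep-scheduling phase are not candidates.
    awake = [n for n in nodes if not n.get("sleeping", False)]
    return max(awake, key=score)

nodes = [
    {"id": "a", "x": 0.0, "y": 0.0, "energy": 90.0},
    {"id": "b", "x": 50.0, "y": 0.0, "energy": 95.0},
    {"id": "c", "x": 1.0, "y": 1.0, "energy": 99.0, "sleeping": True},
]
head = pick_cluster_head(nodes, sink={"x": 0.0, "y": 0.0})
```

Note how node "b" has the most energy among awake nodes but loses to "a" on the combined score because of its distance to the sink, the kind of trade-off the abstract's selection rule encodes.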
Procedia PDF Downloads 524
25130 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production
Authors: Deepak Singh, Rail Kuliev
Abstract:
This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.
Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring
Procedia PDF Downloads 86
25129 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors
Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin
Abstract:
IoT devices are the basic building blocks of an IoT network and generate an enormous volume of real-time, high-speed data to help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge quantity of devices, along with the huge volume of data, is very challenging. IoT devices are battery-powered and resource-constrained, and to provide energy-efficient communication, they go to sleep or wake up periodically and a-periodically depending on the traffic load, to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of a device from the sink node becomes greater than required, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces bad-quality data. Due to this dynamic nature of IoT devices, we do not know the actual reason for abnormal data. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before bringing it to use in IoT applications. In the past, many researchers tried to estimate data quality and provided several Machine Learning (ML), stochastic, and statistical methods to perform analysis on stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how they impact data quality.
This research comprehensively reviews the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality; DBSCAN clustering and weather sensors are used to build it. An extensive study has been done on the relationship between the data of weather sensors and of sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis is presented of the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness, identifies patterns of missing values, and checks the accuracy of the data with the help of cluster positions. Finally, statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the Coefficient of Variation (CoV).
Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)
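Two of the model's quality dimensions, outlier removal via DBSCAN noise points and consistency via the Coefficient of Variation, can be sketched for a 1-D stream of readings. This minimal DBSCAN and the sample values are illustrative, not the authors' exact configuration:

```python
from statistics import mean, pstdev

def dbscan_1d(values, eps, min_pts):
    """Minimal DBSCAN over scalar readings; returns one cluster label per
    point, with -1 marking noise (treated here as outliers)."""
    def neighbours(i):
        return [j for j in range(len(values)) if abs(values[j] - values[i]) <= eps]
    labels = [None] * len(values)
    cluster = 0
    for i in range(len(values)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # noise; may be reclassified as a border point later
            continue
        labels[i] = cluster
        k = 0
        while k < len(seeds):
            j = seeds[k]
            k += 1
            if labels[j] == -1:
                labels[j] = cluster  # border point joins the cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbours(j)
            if len(more) >= min_pts:
                seeds.extend(more)  # core point: keep expanding the cluster
        cluster += 1
    return labels

def coefficient_of_variation(values):
    """Consistency check: population standard deviation over the mean."""
    return pstdev(values) / mean(values)

readings = [7.1, 7.2, 7.0, 7.3, 12.9]  # pH-like values with one spike
labels = dbscan_1d(readings, eps=0.5, min_pts=2)
clean = [v for v, l in zip(readings, labels) if l != -1]
```

The spike at 12.9 has no neighbours within `eps` and is labelled noise, so the CoV is then computed on the consistent cluster only, mirroring the order of the model's dimensions (outlier removal before the consistency check).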
Procedia PDF Downloads 139
25128 New Security Approach of Confidential Resources in Hybrid Clouds
Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel
Abstract:
Nowadays, cloud environments are becoming a need for companies. This new technology gives the opportunity to access data anywhere and anytime, as well as optimized and secured access to resources, and gives more security to the data stored in the platform. However, some companies do not trust cloud providers: in their view, providers can access and modify confidential data such as bank accounts. Many works have been done in this context; they conclude that encryption methods applied by providers ensure confidentiality, although they overlook that cloud providers can decrypt the confidential resources. The best solution here is to apply some modifications to the data before sending them to the cloud, with the objective of making them unreadable. This work aims at enhancing the quality of service of providers and improving the trust of customers.
Keywords: cloud, confidentiality, cryptography, security issues, trust issues
Procedia PDF Downloads 378
25127 What Factors Contributed to the Adaptation Gap during School Transition in Japan?
Authors: Tadaaki Tomiie, Hiroki Shinkawa
Abstract:
The present study aimed to examine the structure of children’s adaptation during school transition and to identify commonalities and dissimilarities between elementary and junior high school. 1,983 students in the 6th grade and 2,051 students in the 7th grade were extracted by stratified two-stage random sampling and completed the ASSESS, which evaluated school adaptation from the viewpoints of ‘general satisfaction’, ‘teachers’ support’, ‘friends’ support’, ‘anti-bullying relationship’, ‘prosocial skills’, and ‘academic adaptation’. The 7th graders tended to show worse adaptation than the 6th graders. Structural equation modeling showed goodness of fit for each grade. Both models were very similar, but the 7th graders’ model showed a lower coefficient on the path from ‘teachers’ support’ to ‘friends’ support’; the role of teachers’ support in maintaining good relations decreased in junior high school. We also discuss how to provide continuous assistance to prevent the 7th graders’ gap.
Keywords: school transition, social support, psychological adaptation, K-12
Procedia PDF Downloads 385
25126 Estimation of Chronic Kidney Disease Using Artificial Neural Network
Authors: Ilker Ali Ozkan
Abstract:
In this study, an artificial neural network model has been developed to estimate chronic kidney failure, which is a common disease. The patients’ age, their blood and biochemical values, and various chronic diseases form the 24 input variables used for the estimation process. The input data were subjected to preprocessing because they contain both missing values and nominal values. The 147 patient records obtained from preprocessing were divided into 70% training and 30% testing data. As a result of the study, the artificial neural network model with 25 neurons in the hidden layer was found to be the model with the lowest error value. Chronic kidney failure could be estimated accurately at a rate of 99.3% using this artificial neural network model. The developed artificial neural network has been found successful for the estimation of chronic kidney failure using clinical data.
Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis
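The preprocessing step described, handling both missing and nominal values before a 70/30 split, can be sketched as follows. The feature layout, mean imputation, and one-hot encoding choices are illustrative assumptions, not the study's exact pipeline:

```python
import random

def preprocess(records, numeric_keys, nominal_keys):
    """Mean-impute missing numeric values and one-hot encode nominal ones."""
    means = {k: (lambda vs: sum(vs) / len(vs))(
                 [r[k] for r in records if r[k] is not None])
             for k in numeric_keys}
    categories = {k: sorted({r[k] for r in records}) for k in nominal_keys}
    rows = []
    for r in records:
        row = [r[k] if r[k] is not None else means[k] for k in numeric_keys]
        for k in nominal_keys:
            row += [1.0 if r[k] == c else 0.0 for c in categories[k]]
        rows.append(row)
    return rows

def split_70_30(rows, seed=0):
    """Shuffle reproducibly, then split into training and testing sets."""
    shuffled = rows[:]
    random.Random(seed).shuffle(shuffled)
    cut = round(len(shuffled) * 0.7)
    return shuffled[:cut], shuffled[cut:]

records = [{"age": 60, "creatinine": 1.2, "diabetes": "yes"},
           {"age": None, "creatinine": 2.4, "diabetes": "no"},
           {"age": 45, "creatinine": None, "diabetes": "yes"},
           {"age": 70, "creatinine": 3.1, "diabetes": "no"}]
rows = preprocess(records, ["age", "creatinine"], ["diabetes"])
train, test = split_70_30(rows)
```

Applied to the study's 147 records, the same split would yield roughly 103 training and 44 testing rows; the network training itself is out of scope for this sketch.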
Procedia PDF Downloads 447
25125 Impact of a Professional Learning Community on the Continuous Professional Development of Teacher Educators in Myanmar
Authors: Moet Moet Myint lay
Abstract:
Professional learning communities provide ongoing professional development for teachers, where they become learning leaders and actively participate in school improvement. The development of professional knowledge requires a significant focus on professional competence in the work of teachers, and a solid foundation of professional knowledge and skills is necessary for teachers to serve as intelligent members of society. Continuing professional development (CPD) plays a vital role in improving educational outcomes, as its importance has been proven over the years. This article explores the need for CPD for teachers in Myanmar and the utility of professional learning communities in improving teacher quality. This study aims to develop a comprehensive understanding of professional learning communities to support the continuing professional development of teacher educators in improving the quality of education. The research questions are: (1) How do teacher educators in Myanmar understand the concept of professional learning communities for continuing professional development? (2) What CPD training is required for all teachers in teachers' colleges? Quantitative research methods were used in this study. Survey data were collected from 50 participants (teacher trainers) from five educational institutions. The analysis shows that professional learning communities, when done well, can have a lasting impact on teacher quality. Furthermore, the creation of professional learning communities is the best indicator of professional development in existing education systems. Some research suggests that teacher professional development is closely related to teachers' professional skills and school improvement. As a result of the collective learning process, teachers gain a deeper understanding of the subject matter, increase their knowledge, and develop their professional teaching skills. This will help improve student performance and school quality in the future.
A lack of clear understanding and knowledge about PLCs among school leaders leads teachers to believe that PLC activities are not beneficial. Lack of time, teacher accountability, leadership skills, and the negative attitudes of participating teachers were the most frequently cited challenges in implementing PLCs. Educators and stakeholders can use these findings to implement professional learning communities.
Keywords: professional learning communities, continuing professional development, teacher education, competence, school improvement
Procedia PDF Downloads 59
25124 Remote Patient Monitoring for Covid-19
Authors: Launcelot McGrath
Abstract:
The Coronavirus disease 2019 (COVID-19) has spread rapidly around the world, resulting in high mortality rates and very large numbers of people requiring treatment in ICUs. Management of patient hospitalisation is critical to controlling this disease and reducing chaos in healthcare systems. Remote monitoring provides a solution for protecting vulnerable and elderly high-risk patients. Continuous remote monitoring of oxygen saturation, respiratory rate, heart rate, temperature, etc., provides medical systems with up-to-the-minute information about their patients' status. Remote monitoring also limits the spread of infection by reducing hospital overcrowding. This paper examines the potential of remote monitoring for COVID-19 to assist in the rapid identification of patients at risk, facilitate the detection of patient deterioration, and enable early interventions.
Keywords: remote monitoring, patient care, oxygen saturation, Covid-19, hospital management
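The monitoring logic the abstract describes can be pictured as a simple out-of-range check over the listed vitals. A minimal Python sketch, with threshold values that are illustrative only and not clinical guidance:

```python
# Minimal sketch of a vital-sign alert rule for remote monitoring.
# Threshold values below are illustrative, not clinical guidance.

THRESHOLDS = {
    "spo2":        (94, 100),    # oxygen saturation, %
    "resp_rate":   (12, 20),     # breaths per minute
    "heart_rate":  (60, 100),    # beats per minute
    "temperature": (36.1, 37.8), # degrees Celsius
}

def check_vitals(reading):
    """Return the list of vitals outside their normal range."""
    alerts = []
    for vital, (low, high) in THRESHOLDS.items():
        value = reading.get(vital)
        if value is not None and not (low <= value <= high):
            alerts.append(vital)
    return alerts

reading = {"spo2": 91, "resp_rate": 24, "heart_rate": 88, "temperature": 38.2}
print(check_vitals(reading))
```

In a real deployment, such a rule would run continuously on streamed sensor data and escalate alerts to clinical staff rather than print them.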
Procedia PDF Downloads 108
25123 Translanguaging as a Decolonial Move in South African Bilingual Classrooms
Authors: Malephole Philomena Sefotho
Abstract:
Nowadays, the majority of people worldwide are bilingual rather than monolingual, owing to the surge of globalisation and mobility. Consequently, bilingual education is a topical issue of discussion among researchers. Several studies on the topic have highlighted the importance of, and need for, incorporating learners' linguistic repertoires in multilingual classrooms and moving away from the colonial approach of a monolingual bias, one language at a time. Researchers have pointed out that a systematic approach involving the concurrent use of languages, rather than a separation of languages, must be implemented in bilingual classroom settings. Translanguaging emerged as such a systematic approach: it assists learners in making meaning of their world by allowing them to utilize all their linguistic resources in the classroom. The South African language policy also makes room for the use of diverse languages in bi/multilingual classrooms. This study, therefore, sought to explore how teachers apply translanguaging in bilingual classrooms to incorporate learners' linguistic repertoires. It further establishes teachers' perspectives on the use of more than one language in teaching and learning. The participants in this study were language teachers at bilingual primary schools in Johannesburg, South Africa. Semi-structured interviews were conducted to establish their perceptions of the concurrent use of languages. A qualitative research design was followed in analysing the data. The findings showed that teachers were reluctant to allow translanguaging in their classrooms even though they realise its importance. Not allowing bilingual learners to use their linguistic repertoires has resulted in learners' negative attitudes towards their languages and contributed to learners' loss of identity.
This article thus recommends a drastic change towards decolonised approaches to teaching and learning in multilingual settings, with translanguaging as a decolonial move in which learners are allowed to translanguage freely in their classroom settings for better comprehension and meaning-making of concepts and related ideas. It further proposes that continuous conversations be encouraged to bring imminent cultural and linguistic genocide to a halt.
Keywords: bilingualism, decolonisation, linguistic repertoires, translanguaging
Procedia PDF Downloads 179
25122 Further Analysis of Global Robust Stability of Neural Networks with Multiple Time Delays
Authors: Sabri Arik
Abstract:
In this paper, we study the global asymptotic robust stability of delayed neural networks with norm-bounded uncertainties. By employing Lyapunov stability theory and the homeomorphic mapping theorem, we derive some new types of sufficient conditions ensuring the existence, uniqueness and global asymptotic stability of the equilibrium point for the class of neural networks with discrete time delays under parameter uncertainties and with respect to continuous and slope-bounded activation functions. An important aspect of our results is their low computational complexity, as the reported results can be verified by checking some properties of symmetric matrices associated with the uncertainty sets of network parameters. The obtained results are shown to be generalizations of some previously published corresponding results. Some comparative numerical examples are also constructed to compare our results with closely related results from the existing literature.
Keywords: neural networks, delayed systems, Lyapunov functionals, stability analysis
Procedia PDF Downloads 528
25121 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformation processes such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of data, the geometry of spatial features such as length, sinuosity, orientation, perimeter and area is altered. This is worst in the case of small scale maps, since the cartographer does not have enough space to represent all the features on the map. In practice, when GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility and violate the basic cartographic rule that it is not suitable to create a larger scale map from a smaller scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales, 1:10000, 1:50000 and 1:250000, prepared by the Survey Department of Sri Lanka, the National Mapping Agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was done by repeating the data with different combinations. Road, river and land use data sets were used for the study. A simple model, to find the best place for a wildlife park, was used to identify the effects.
The results show remarkable effects at different degrees of generalization: different locations with different geometries were obtained as outputs of the analysis. The study suggests that there should be reasonable methods to overcome this effect. As a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.
Keywords: generalization, GIS, scales, spatial analysis
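The scale-dependent geometry changes discussed above can be illustrated with one common generalization operator, line simplification. The sketch below applies the Douglas-Peucker algorithm (a standard simplification method, not necessarily the one used by the Survey Department) to made-up coordinates, showing how a feature's measured length shrinks as the tolerance coarsens:

```python
# Sketch of how line simplification (one generalization operator)
# changes a feature's measured length. Coordinates are illustrative.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def length(line):
    return sum(dist(a, b) for a, b in zip(line, line[1:]))

def point_seg_dist(p, a, b):
    # Distance from point p to segment ab.
    if a == b:
        return dist(p, a)
    t = ((p[0]-a[0])*(b[0]-a[0]) + (p[1]-a[1])*(b[1]-a[1])) / dist(a, b)**2
    t = max(0.0, min(1.0, t))
    proj = (a[0] + t*(b[0]-a[0]), a[1] + t*(b[1]-a[1]))
    return dist(p, proj)

def douglas_peucker(line, tol):
    # Classic recursive simplification used in map generalization.
    if len(line) < 3:
        return list(line)
    i, dmax = max(((i, point_seg_dist(line[i], line[0], line[-1]))
                   for i in range(1, len(line) - 1)), key=lambda x: x[1])
    if dmax <= tol:
        return [line[0], line[-1]]
    left = douglas_peucker(line[:i + 1], tol)
    return left[:-1] + douglas_peucker(line[i:], tol)

river = [(0, 0), (1, 2), (2, -1), (3, 3), (4, 0), (5, 1), (6, 0)]
for tol in (0.5, 1.5, 3.0):   # coarser tolerance ~ smaller map scale
    simplified = douglas_peucker(river, tol)
    print(tol, len(simplified), round(length(simplified), 2))
```

Running it shows the same "river" getting shorter and straighter at each coarser tolerance, which is exactly the distortion an overlay analysis inherits when maps of different generalization degrees are mixed.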
Procedia PDF Downloads 328
25120 Identity Verification Based on Multimodal Machine Learning on Red Green Blue (RGB) Red Green Blue-Depth (RGB-D) Voice Data
Authors: LuoJiaoyang, Yu Hongyang
Abstract:
In this paper, we experimented with a new approach to multimodal identification using RGB, RGB-D and voice data. The multimodal combination of RGB and voice data has been applied in tasks such as emotion recognition and has shown good results and stability, and the same holds for identity recognition tasks. We believe that data from different modalities can enhance the effect of the model through mutual reinforcement. We extend the dual-modality setup to three modalities and try to improve the effectiveness of the network by increasing the number of modalities. We also implemented each single-modal identification system separately, tested the data of these different modalities under clean and noisy conditions, and compared the performance with the multimodal model. In the process of designing the multimodal model, we tried a variety of different fusion strategies and finally chose the fusion method with the best performance. The experimental results show that the performance of the multimodal system is better than that of any single modality, especially in dealing with noise, and the multimodal system achieves an average improvement of 5%.
Keywords: multimodal, three modalities, RGB-D, identity verification
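One common score-level ("late") fusion strategy illustrates how modalities can reinforce each other. The paper compares several fusion strategies without naming them here, so this is only one plausible example; the weights and scores below are fabricated for illustration:

```python
# Illustrative score-level ("late") fusion of per-modality identity scores.
# Weights and scores are made up; the paper evaluates several fusion
# strategies, of which weighted averaging is only one common example.

def fuse_scores(modality_scores, weights):
    """Weighted average of per-identity score lists from each modality."""
    n_ids = len(next(iter(modality_scores.values())))
    total_w = sum(weights[m] for m in modality_scores)
    fused = [0.0] * n_ids
    for m, scores in modality_scores.items():
        w = weights[m] / total_w
        for i, s in enumerate(scores):
            fused[i] += w * s
    return fused

def identify(modality_scores, weights):
    """Return the index of the identity with the highest fused score."""
    fused = fuse_scores(modality_scores, weights)
    return max(range(len(fused)), key=fused.__getitem__)

scores = {
    "rgb":   [0.70, 0.20, 0.10],  # similarity to identities 0..2
    "depth": [0.40, 0.35, 0.25],
    "voice": [0.15, 0.60, 0.25],  # noisy audio favours the wrong identity
}
weights = {"rgb": 0.5, "depth": 0.3, "voice": 0.2}
print(identify(scores, weights))
```

Here the noisy voice channel alone would pick the wrong identity, but the fused decision recovers the correct one, which is the mutual-reinforcement effect the abstract describes.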
Procedia PDF Downloads 70
25119 The Importance of SEEQ in Teaching Evaluation of Undergraduate Engineering Education in India
Authors: Aabha Chaubey, Bani Bhattacharya
Abstract:
Evaluation of the quality of teaching in engineering education in India needs to be conducted on a continuous basis to achieve the best teaching quality in technical education. Quality teaching is an influential factor in technical education and has a large impact on students' learning outcomes. The present study is not exclusively theory-driven, but draws on various specific concepts and constructs in the domain of technical education. These include teaching and learning in higher education, teacher effectiveness, and teacher evaluation and performance management in higher education. Student Evaluation of Education Quality (SEEQ) was proposed as one of the instruments for evaluating teaching quality in engineering education. SEEQ is a popular, standard instrument widely utilized all over the world, with established validity and reliability in the educational literature. The present study was designed to evaluate teaching quality through SEEQ in the context of technical education in India, including its validity and reliability based on the collected data. The multidimensionality of SEEQ, reflecting every teaching and learning process, makes it well suited to collecting students' feedback on the quality of instruction and the instructor. SEEQ comprises nine original constructs, i.e., learning value, teacher enthusiasm, organization, group interaction, individual rapport, breadth of coverage, assessment, assignments, and an overall rating of the particular course and instructor, with a total of 33 items. In the present study, a total of 350 first-year undergraduate students from the Indian Institute of Technology, Kharagpur (IIT Kharagpur, India) were included in the evaluation. They belonged to four different courses from different streams of engineering studies. The assessment of the validity and reliability of SEEQ was based on the collected data.
The analysis further employed Confirmatory Factor Analysis (CFA) using Analysis of Moment Structures (AMOS), together with Cronbach's alpha computed in SPSS to examine the internal consistency of a scaled instrument like SEEQ. The effectiveness of SEEQ in the CFA was assessed on the basis of fit indices such as CMIN/df, CFI, GFI, AGFI and RMSEA. The major findings of this study showed fit indices of ChiSq = 993.664, df = 390, ChiSq/df = 2.548, GFI = 0.782, AGFI = 0.736, CFI = 0.848, RMSEA = 0.062, TLI = 0.945, RMR = 0.029, PCLOSE = 0.006. The final analysis of the fit indices indicated positive construct validity and stability; a high reliability was also observed, indicating internal consistency. Thus, the study suggests the effectiveness of SEEQ as a quality evaluation instrument for the teaching-learning process in engineering education in India. It is therefore expected that, with the continuation of this research, the quality of technical education in India can be improved. It is also expected that this study will provide an empirical and theoretical basis for locating the construct or factor related to teaching that has the greatest impact on the teaching and learning process in a particular course or stream in engineering education.
Keywords: confirmatory factor analysis, engineering education, SEEQ, teaching and learning process
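The internal-consistency check mentioned above rests on Cronbach's alpha, which can be computed directly from an item-response matrix. A small self-contained sketch with a fabricated response matrix (the actual SEEQ data are not reproduced here):

```python
# Cronbach's alpha for the internal consistency of a scaled instrument
# such as SEEQ. The response matrix is fabricated for illustration;
# rows are respondents, columns are items.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    k = len(rows[0])  # number of items
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
]
print(round(cronbach_alpha(responses), 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; in practice one would run this over the full 33-item SEEQ responses rather than a toy matrix.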
Procedia PDF Downloads 421
25118 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers analyze additive noise models with either linearity, nonlinearity or Gaussian noise. We fill the gap by providing a nonlinear, non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
Procedia PDF Downloads 151
25117 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect these damages using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the Random Forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of various models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the road's future health condition. The proposed model will help to automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pot-hole
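Before training classifiers such as SVM or Random Forest, the core signal-processing idea can be seen in a much simpler form: a road anomaly shows up as a spike against the local vibration baseline. A sketch using a rolling z-score on a synthetic accelerometer trace (the signal, window, and threshold are stand-ins, not the paper's actual pipeline):

```python
# Sketch of detecting road anomalies (e.g. potholes) from a vibration
# signal via a rolling z-score threshold. The signal and parameters are
# synthetic stand-ins for the accelerometer data described above.
import math
import random

def rolling_zscore_anomalies(signal, window=20, z_thresh=4.0):
    """Indices where a sample deviates strongly from the local baseline."""
    anomalies = []
    for i in range(window, len(signal)):
        ref = signal[i - window:i]
        mean = sum(ref) / window
        std = math.sqrt(sum((x - mean) ** 2 for x in ref) / window) or 1e-9
        if abs(signal[i] - mean) / std > z_thresh:
            anomalies.append(i)
    return anomalies

random.seed(0)
signal = [random.gauss(0.0, 0.1) for _ in range(200)]  # smooth road
signal[120] += 3.0                                     # simulated pothole spike
print(rolling_zscore_anomalies(signal))
```

A learned model replaces the fixed threshold with a decision boundary fitted to labelled data, which is what makes the approach robust to varying road and vehicle conditions.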
Procedia PDF Downloads 86
25116 General Architecture for Automation of Machine Learning Practices
Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain
Abstract:
Data collection, data preparation, model training, model evaluation, and deployment are all stages in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by computing metrics like accuracy, precision, and recall on a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial stages of the machine learning (ML) workflow, must be carried out. Moreover, various machine learning algorithms can be employed with every approach to data pre-processing, generating a large set of combinations to choose from. For example: for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large "combination set of pre-processing steps and algorithms" into an automated workflow, which simplifies the task of carrying out all possibilities.
Keywords: machine learning, automation, AutoML, architecture, operator pool, configuration, scheduler
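The "combination set of pre-processing steps and algorithms" can be enumerated mechanically, which is essentially what the proposed workflow organises and schedules. A sketch with illustrative operator names (the actual operator pool of the architecture is not specified in the abstract):

```python
# Sketch of enumerating the combination set that the architecture
# organises: every pairing of pre-processing choices with a learning
# algorithm is one candidate pipeline. Operator names are illustrative.
from itertools import product

operator_pool = {
    "missing_values": ["drop_rows", "impute_mean", "impute_median"],
    "scaling":        ["none", "min_max", "standard"],
    "algorithm":      ["logistic_regression", "random_forest", "svm"],
}

def enumerate_pipelines(pool):
    """Yield one configuration dict per combination of operators."""
    keys = list(pool)
    for choice in product(*(pool[k] for k in keys)):
        yield dict(zip(keys, choice))

pipelines = list(enumerate_pipelines(operator_pool))
print(len(pipelines))   # one candidate workflow per combination
print(pipelines[0])
```

A scheduler component would then hand each configuration dict to a worker, train and evaluate it, and keep the best-scoring pipeline, rather than a practitioner repeating the combinations by hand.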
Procedia PDF Downloads 58
25115 Formation of Protective Silicide-Aluminide Coating on Gamma-TiAl Advanced Material
Authors: S. Nouri
Abstract:
In this study, a Si-aluminide coating was prepared on gamma-TiAl [Ti-45Al-2Nb-2Mn-1B (at. %)] via a liquid-phase slurry procedure. The high temperature oxidation resistance of this diffusion coating was evaluated at 1100 °C for 400 hours. The results of the isothermal oxidation showed that the formation of the Si-aluminide coating can remarkably improve the high temperature oxidation resistance of the bare gamma-TiAl alloy. Characterization of the oxide scale microstructure showed that the formation of a protective Al2O3+SiO2 mixed oxide scale, along with a continuous, compact and uniform layer of Ti5Si3 beneath the surface oxide scale, can act as an oxygen diffusion barrier during high temperature oxidation. Other possible mechanisms related to the formation of the Si-aluminide coating and oxide scales are also discussed.
Keywords: Gamma-TiAl alloy, high temperature oxidation, Si-aluminide coating, slurry procedure
Procedia PDF Downloads 178