Search results for: data warehouse
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24545

20015 A Scalable Model of Fair Socioeconomic Relations Based on Blockchain and Machine Learning Algorithms-1: On Hyperinteraction and Intuition

Authors: Merey M. Sarsengeldin, Alexandr S. Kolokhmatov, Galiya Seidaliyeva, Alexandr Ozerov, Sanim T. Imatayeva

Abstract:

This series of interdisciplinary studies is an attempt to investigate and develop a scalable model of fair socioeconomic relations based on blockchain, using positive psychology techniques and machine learning algorithms for data analytics. In this particular study, we use a hyperinteraction approach and intuition to investigate their influence on the 'wisdom of crowds' via a mobile application created for the purpose of this research. Along with the public blockchain and a private Decentralized Autonomous Organization (DAO), which we built on the Ethereum blockchain, a model of fair financial relations among DAO members was developed. We developed a smart contract, the so-called Fair Price Protocol, and used it to implement the model. The data obtained from the mobile application were analyzed by machine learning algorithms. The model was tested on football matches.
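The keywords below name a Naïve Bayes algorithm as the analytics component. As a minimal illustrative sketch (not the authors' implementation), a Gaussian Naïve Bayes classifier over two hypothetical crowd features could look like this; all feature names and values are invented for illustration:

```python
import math

# Hypothetical training rows: (crowd_confidence, intuition_score),
# label 1 = predicted home win, 0 = not. Values are invented.
X = [(0.9, 0.8), (0.8, 0.9), (0.7, 0.7), (0.2, 0.3), (0.1, 0.2), (0.3, 0.1)]
y = [1, 1, 1, 0, 0, 0]

def fit_gaussian_nb(X, y):
    """Estimate per-class feature means/variances and class priors."""
    model = {}
    for cls in set(y):
        rows = [x for x, lab in zip(X, y) if lab == cls]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        varis = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                 for col, m in zip(zip(*rows), means)]
        model[cls] = (means, varis, len(rows) / len(y))
    return model

def predict(model, x):
    """Pick the class with the highest log-posterior."""
    best, best_lp = None, -math.inf
    for cls, (means, varis, prior) in model.items():
        lp = math.log(prior)
        for v, m, var in zip(x, means, varis):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

model = fit_gaussian_nb(X, y)
print(predict(model, (0.85, 0.75)))  # high confidence and intuition
```

In practice a library implementation would be used; the sketch only shows the mechanics of class-conditional likelihoods and priors.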

Keywords: blockchain, Naïve Bayes algorithm, hyperinteraction, intuition, wisdom of crowd, decentralized autonomous organization

Procedia PDF Downloads 157
20014 Teaching Behaviours of Effective Secondary Mathematics Teachers: A Study in Dhaka, Bangladesh

Authors: Asadullah Sheikh, Kerry Barnett, Paul Ayres

Abstract:

Despite significant progress in access, equity and public examination success, poor student performance in mathematics in secondary schools has become a major concern in Bangladesh. A substantial body of research has emphasised the important contribution of teaching practices to student achievement. However, this has not been investigated in Bangladesh. Therefore, the study sought to determine the effectiveness of mathematics teaching practices as a means of improving secondary school mathematics in the Dhaka Municipality City (DMC) area, Bangladesh. The purpose of this study was twofold: first, to identify the 20 highest performing secondary schools in mathematics in DMC, and second, to investigate the teaching practices of mathematics teachers in these schools. A two-phase mixed-method approach was adopted. In the first phase, secondary source data were obtained from the Board of Intermediate and Secondary Education (BISE), Dhaka, and value-added measures were used to identify the 20 highest performing secondary schools in mathematics. In the second phase, a concurrent mixed-method design, in which qualitative methods were embedded within a dominant quantitative approach, was utilised. A purposive sampling strategy was used to select fifteen teachers from the 20 highest performing secondary schools. The main sources of data were classroom teaching observations and teacher interviews. The data from teacher observations were analysed with descriptive and nonparametric statistics. The interview data were analysed qualitatively. The main findings showed that teachers adopt a direct teaching approach incorporating orientation, structuring, modelling, practice, questioning and teacher-student interaction, which creates an individualistic learning environment. The variation in developmental levels of teaching skill indicates that teachers do not necessarily use the qualitative (i.e., focus, stage, quality and differentiation) aspects of teaching behaviours effectively.
This is the first study to investigate the teaching behaviours of effective secondary mathematics teachers within Dhaka, Bangladesh. It contributes an international dimension to the field of educational effectiveness and raises questions about existing constructivist approaches. Further, it contributes important insights about teaching behaviours that can be used to inform the development of evidence-based policy and practice on quality teaching in Bangladesh.

Keywords: effective teaching, mathematics, secondary schools, student achievement, value-added measures

Procedia PDF Downloads 225
20013 Comparison of Computed Tomography Dose Index, Dose Length Product and Effective Dose Among Male and Female Patients From Contrast Enhanced Computed Tomography Pancreatitis Protocol

Authors: Babina Aryal

Abstract:

Background: The diagnosis of pancreatitis is generally based on clinical and laboratory findings; however, Computed Tomography (CT) is the imaging technique of choice. In particular, Contrast Enhanced Computed Tomography (CECT), performed with administration of an appropriate contrast medium, shows the characteristic morphological findings that allow the diagnosis of pancreatitis to be established and the extent of disease severity to be determined. The purpose of this study was to compare Computed Tomography Dose Index (CTDI), Dose Length Product (DLP) and Effective Dose (ED) between male and female patients undergoing the Contrast Enhanced Computed Tomography (CECT) pancreatitis protocol. Methods: This retrospective study involved data collection from patients with a clinical/laboratory/ultrasonography diagnosis of pancreatitis who had undergone the CECT abdomen pancreatitis protocol. Data collection involved detailed information about each patient's age and gender, clinical history, individual Computed Tomography Dose Index, Dose Length Product and effective dose. Results: We retrospectively collected dose data from 150 patients, of whom 127 were males and 23 were females. The values obtained from the CT console display were measured, calculated and compared to determine whether the CTDI, DLP and ED values were similar. The mean CTDI differed between the sexes (32.2087 vs. 37.1609), with females receiving the higher value. DLP values and effective dose for the two genders did not show significant differences. Conclusion: This study concluded that there were no significant differences in the DLP and ED values between the genders; however, we noticed that female patients had a higher CTDI than males.
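A between-group comparison of dose metrics such as the one above is typically made with a two-sample test. As a minimal sketch (not the study's actual analysis), Welch's t statistic for two small, invented CTDI samples can be computed as follows; the numbers are illustrative only, not the study's data:

```python
import math
import statistics

# Hypothetical CTDI values (mGy) for two patient groups; the numbers are
# invented for illustration, not taken from the study.
males = [30.0, 32.0, 34.0]
females = [36.0, 38.0, 40.0]

def welch_t(a, b):
    """Welch's t statistic for two independent samples, unequal variances."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(b) - statistics.mean(a)) / se

t = welch_t(males, females)
print(f"Welch t = {t:.2f}")  # positive => second group higher on average
```

The statistic would then be compared against the t distribution (with Welch-Satterthwaite degrees of freedom) to judge significance.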

Keywords: computed tomography, contrast enhanced computed tomography, computed tomography dose index, dose length product, effective dose

Procedia PDF Downloads 96
20012 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning

Authors: Akeel A. Shah, Tong Zhang

Abstract:

Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine learning assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of examples on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions - many applications require high accuracy, for example band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT).
The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and the learned correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs the result can be an order of magnitude or more in speed-up. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types and properties of atoms; the types of bonds; and bond angles. The key to the accuracy of multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn the high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on 4 different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment and HOMO/LUMO.
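The Δ-ML strategy described above can be sketched in miniature: fit a map from the low-fidelity output to the high-minus-low correction, then predict high-fidelity values as low-fidelity value plus learned correction. A least-squares line stands in for the GCN here, and all energy values are synthetic illustration numbers, not data from the study:

```python
# Δ-ML sketch: learn the high-minus-low fidelity correction from a few
# paired calculations, then predict high-fidelity values from cheap ones.
# The "energies" below are invented for illustration.

low  = [1.0, 2.0, 3.0, 4.0]   # cheap (e.g. DFT-like) outputs
high = [1.5, 2.7, 3.9, 5.1]   # expensive reference outputs

def fit_line(x, y):
    """Ordinary least squares for y = a*x + b (the correction map)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

delta = [h - l for h, l in zip(high, low)]  # the correction to be learned
a, b = fit_line(low, delta)

def predict_high(l):
    """High-fidelity estimate = low-fidelity value + learned correction."""
    return l + (a * l + b)

print(predict_high(5.0))
```

The same structure holds when the correction map is a graph network over molecular features rather than a line over a scalar.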

Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning

Procedia PDF Downloads 9
20011 Education Quality Assurance Administration of Suan Sunandha Rajabhat University

Authors: Nopadol Burananuth, Tawatpupisit Pattaradapa

Abstract:

The objective of this research is to study the opinions of staff responsible for Quality Assurance. The research sample comprised 50 staff at Suan Sunandha Rajabhat University involved in Quality Assurance work from each faculty and organization within the university. Data were analyzed using statistical software. The statistics used in data analysis were frequency, percentage, mean and standard deviation. The results reveal that most of the respondents were female (92%) and aged between 31-40 years (44%). Most of them had been working on Quality Assurance for 1-3 years (44%). The staff opinion survey showed that the operation received the highest score. In terms of Planning, committee appointment and job descriptions received the highest mean score. For Checking, acknowledging the results and reviewing quality in education received the highest mean score. For Acting, participating in meetings in order to revise the approach to Quality Assurance received the highest mean score. For Doing, planning an internal quality assurance by assigning period, budget and responsibilities received the highest mean score.

Keywords: education quality assurance, administration, staff, Suan Sunandha Rajabhat University

Procedia PDF Downloads 380
20010 The Place of Instructional Materials in Quality Education at Primary School Level in Katsina State, Nigeria

Authors: Murtala Sale

Abstract:

The use of instructional materials is an indispensable tool that enhances qualitative teaching and learning, especially at the primary level. Instructional materials are used to facilitate learners' comprehension of ideas as well as to ensure long-term retention of the ideas and topics taught to pupils. This study examined the relevance of using instructional materials in primary schools in Katsina State, Nigeria. It employed a survey design using a cluster sampling technique. A questionnaire was used to gather data for analysis, and statistical and frequency tables were used to analyze the data gathered. The results show that teachers and students alike have realized the effectiveness of modern instructional materials in teaching and learning for the attainment of the set objectives of the basic primary education policy. The study also found that reluctance to use instructional materials would hamper the achievement of qualitative primary education. The study therefore suggests that adequate and up-to-date instructional materials be provided to all primary schools in Katsina State for an effective teaching and learning process.

Keywords: instructional materials, effective teaching, learning quality, indispensable aspect

Procedia PDF Downloads 235
20009 Disablism in Saudi Mainstream Schools: Disabled Teachers’ Experiences and Perspectives

Authors: Ali Aldakhil

Abstract:

This paper explores the many faces of the barriers and exclusionary attitudes and practices that disabled teachers and students experience in the school where they teach or study. Critical disability studies and inclusive education theory were used to conceptualise this inquiry and ground it in the literature. These theories were used because they magnify and expose the problems of disability/disablism as located within society instead of within the individual. Similarly, disability-first language was used in this study because it seeks to expose the social oppression and discrimination of disabled people. Data were generated through in-depth semi-structured interviews with six disabled teachers who teach disabled children in a Saudi mainstream school. Thematic analysis of the data concludes that the school is fettered by disabling barriers, attitudes, and practices, which reflect the dominant culture of disablism that disabled people encounter in Saudi society on a daily basis. This leads to the conclusion that an overall deconstruction and reformation of Saudi mainstream schools is needed, encompassing non-disabled people's attitudes, policy, spaces, and the overall arrangements of teaching and learning.

Keywords: disablism, disability studies, mainstream schools, Saudi Arabia

Procedia PDF Downloads 141
20008 Sensitivity Analysis of Oil Spills Modeling with ADIOS II for Iranian Fields in Persian Gulf

Authors: Farzingohar Mehrnaz, Yasemi Mehran, Esmaili Zinat, Baharlouian Maedeh

Abstract:

Aboozar (Ardeshir) and Bahregansar are two important Iranian oilfields in Persian Gulf waters. Operational activities there can create spills that impact the marine environment. Assumed spills are modeled with ADIOS II (Automated Data Inquiry for Oil Spills), NOAA's oil-weathering software. Various atmospheric and marine data with different oil types were used for the modeling. Numerous scenarios for 100 bbls, with mean daily air temperature and wind speed, were input for 5 days. To find the model sensitivity to each setting, one parameter was changed while the others were held constant. In both fields, the evaporated and dispersed output values increased, hence the remaining fraction was reduced. The results clarified that wind speed first, air temperature second, and finally oil type were the most effective factors in the oil weathering process. The obtained results can help emergency systems predict the floating (dispersed and remaining) spill volume in order to select suitable cleanup tools and methods.
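The one-at-a-time scheme described above (perturb a single input, hold the rest constant, compare outputs) can be sketched generically. The toy evaporation model below is a stand-in for ADIOS II, not its real equations, and every coefficient is an invented illustration value:

```python
# One-at-a-time (OAT) sensitivity sketch: vary one input while holding the
# others constant and measure the change in model output.

def evaporated_fraction(wind_ms, air_temp_c, volatility):
    """Toy response: evaporation rises with wind, temperature, volatility.
    This is an invented placeholder, not the ADIOS II weathering model."""
    return min(1.0, 0.02 * wind_ms + 0.01 * air_temp_c + 0.3 * volatility)

base = dict(wind_ms=5.0, air_temp_c=30.0, volatility=0.5)

def sensitivity(param, delta):
    """Output change when one parameter is perturbed by `delta`."""
    changed = dict(base, **{param: base[param] + delta})
    return evaporated_fraction(**changed) - evaporated_fraction(**base)

for p, d in [("wind_ms", 1.0), ("air_temp_c", 1.0), ("volatility", 0.1)]:
    print(p, round(sensitivity(p, d), 3))
```

Ranking the resulting output changes is what identifies the most influential factor, as the study does for wind speed, temperature, and oil type.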

Keywords: ADIOS, modeling, oil spill, sensitivity analysis

Procedia PDF Downloads 285
20007 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings

Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian

Abstract:

Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emission. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaption of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform’s capability for real-time situational awareness and cyber-physical control of the HVAC in the flexible research platforms within the Oak Ridge National Laboratory (ORNL) main campus. 
Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.

Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM

Procedia PDF Downloads 85
20006 The Role of Brand Loyalty in Generating Positive Word of Mouth among Malaysian Hypermarket Customers

Authors: S. R. Nikhashemi, Laily Haj Paim, Ali Khatibi

Abstract:

Structural Equation Modeling (SEM) was used to test a hypothesized model explaining the effects of Malaysian hypermarket customers' perceptions of brand trust (BT), customer perceived value (CPV) and perceived service quality (PSQ) on building their brand loyalty (CBL) and generating positive word-of-mouth communication (WOM). Self-administered questionnaires were used to collect data from 374 Malaysian hypermarket customers of Mydin, Tesco, Aeon Big and Giant in Kuala Lumpur, a metropolitan city of Malaysia. The data strongly supported the model, showing that BT, CPV and PSQ are prerequisite factors in building customer brand loyalty, with PSQ having the strongest effect on the prediction of customer brand loyalty compared to the other factors. In addition, the present study suggests that the effect of the aforementioned factors, mediated by customer brand loyalty, strongly contributes to generating positive word-of-mouth communication.
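The mediation idea at the core of the model (antecedents act on word of mouth via brand loyalty) can be sketched far more simply than full SEM: estimate the path from an antecedent to the mediator and from the mediator to the outcome, and multiply the slopes for the indirect effect. The data below are synthetic illustration values, and this two-regression sketch is a simplification, not the study's SEM analysis:

```python
# Simple mediation sketch (not full SEM): indirect effect of perceived
# service quality (PSQ) on word of mouth (WOM) via brand loyalty.
# All values are synthetic, constructed so the paths are exact.

psq     = [1.0, 2.0, 3.0, 4.0, 5.0]
loyalty = [0.5 * x + 1.0 for x in psq]      # PSQ -> loyalty (slope a)
wom     = [2.0 * m + 0.5 for m in loyalty]  # loyalty -> WOM (slope b)

def slope(x, y):
    """Least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

a = slope(psq, loyalty)   # path: antecedent -> mediator
b = slope(loyalty, wom)   # path: mediator -> outcome
print("indirect effect a*b =", a * b)
```

SEM estimates all such paths simultaneously with latent variables and fit indices; the product-of-paths logic shown here is the underlying intuition.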

Keywords: brand trust, perceived value, perceived service quality, brand loyalty, positive word of mouth communication

Procedia PDF Downloads 468
20005 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts. Thus, complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes in operations in computer-aided process planning (CAPP) to make CAPP manageable for adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by finding objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, such decisions depend on the experience-based knowledge of humans (e.g. process planners) and are therefore subjective – leading to variability in workpiece quality and potential failure in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining, and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes. Especially in applications like the aerospace industry, standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, will be used for modeling and execution of inspection workflows.
Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, will be carried out by function blocks. One advantage of this approach is its flexibility to design workflows and to adapt algorithms specific to the application domain. In general, within the specified tolerance range it will be checked if a geometrical adaption is possible. The development of particular function blocks is predicated on workpiece specific information e.g. design data. Furthermore, for different product lifecycle phases, appropriate logics and decision criteria have to be considered. For example, tolerances for geometric deviations are different in type and size for new-part production compared to repair processes. In addition to function blocks, appropriate referencing systems are important. They need to support exact determination of position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades containing a new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
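The function-block workflow described above (positioning, geometric checks, decision) can be sketched as a chain of blocks, each consuming and enriching an event. All block names, fields, and tolerance values below are hypothetical illustrations, not the paper's implementation:

```python
# Workflow-of-function-blocks sketch: each block takes an event dict and
# passes an enriched one downstream, mimicking event-driven inspection steps.

def position_measurement(event):
    """Align measured points to the reference frame (here: subtract offset)."""
    offset = event["offset"]
    event["aligned"] = [p - offset for p in event["points"]]
    return event

def check_tolerance(event):
    """Flag whether all aligned deviations fall inside the tolerance band."""
    tol = event["tolerance"]
    event["within_tolerance"] = all(abs(p) <= tol for p in event["aligned"])
    return event

def decide(event):
    """Objective decision on adaptive manufacturability."""
    event["decision"] = "adapt" if event["within_tolerance"] else "reject"
    return event

workflow = [position_measurement, check_tolerance, decide]

event = {"points": [10.02, 9.98, 10.05], "offset": 10.0, "tolerance": 0.1}
for block in workflow:
    event = block(event)
print(event["decision"])
```

Swapping, reordering, or adding blocks changes the workflow without touching the individual algorithms, which is the flexibility the paper attributes to the function-block approach.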

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 286
20004 Trends of Seasonal and Annual Rainfall in the South-Central Climatic Zone of Bangladesh Using Mann-Kendall Trend Test

Authors: M. T. Islam, S. H. Shakif, R. Hasan, S. H. Kobi

Abstract:

Investigation of rainfall trends is crucial considering climate change, food security, and the economy of a particular region. This research aims to study seasonal and annual precipitation trends and their abrupt changes over time in the south-central climatic zone of Bangladesh using monthly time series data covering 50 years (1970-2019). A trend-free pre-whitening method has been employed to make the necessary adjustments for autocorrelation in the rainfall data. Trends in rainfall and their intensity have been assessed using the non-parametric Mann-Kendall test and the Theil-Sen estimator. Significant changes and fluctuation points in the data series have been detected using the sequential Mann-Kendall test at the 95% confidence limit. The study findings show that most of the rainfall stations in the study area have a decreasing precipitation pattern throughout all seasons. The maximum decline in rainfall intensity has been found for the Tangail station (-8.24 mm/year) during the monsoon. Madaripur and Chandpur stations have shown slight positive trends in post-monsoon rainfall. In terms of annual precipitation, a negative rainfall pattern has been identified at each station, with a maximum decrease of 14.48 mm/year at Chandpur. However, all the trends are statistically non-significant within the 95% confidence interval, and their monotonic association with time ranges from very weak to weak. From the sequential Mann-Kendall test, the change points for annual and seasonal downward precipitation trends occur mostly after the 1990s for the Dhaka and Barishal stations. For Chandpur, the fluctuation points arrive after the mid-1970s in most cases.
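The two core statistics named above can be sketched compactly: the Mann-Kendall S statistic counts concordant minus discordant pairs, and the Theil-Sen estimator takes the median of all pairwise slopes. This minimal version omits the pre-whitening, variance correction, and significance testing the study applies, and the rainfall series is synthetic:

```python
# Minimal Mann-Kendall S statistic and Theil-Sen slope.

def mann_kendall_s(series):
    """S > 0 suggests an upward trend, S < 0 a downward trend."""
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

def theil_sen_slope(series):
    """Median of all pairwise slopes: a robust trend magnitude per step."""
    slopes = sorted((series[j] - series[i]) / (j - i)
                    for i in range(len(series) - 1)
                    for j in range(i + 1, len(series)))
    mid = len(slopes) // 2
    return slopes[mid] if len(slopes) % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])

# Synthetic annual rainfall (mm) with a mild decline, for illustration only.
rain = [1400, 1380, 1390, 1360, 1350, 1340]
print(mann_kendall_s(rain), theil_sen_slope(rain))
```

A negative S together with a negative median slope is the pattern the study reports for most stations; in practice the S statistic is converted to a Z score against its variance to judge significance at the 95% level.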

Keywords: trend analysis, Mann-Kendall test, Theil-Sen estimator, sequential Mann-Kendall test, rainfall trend

Procedia PDF Downloads 65
20003 Effectiveness of Blended Learning in Public School During Covid-19: A Way Forward

Authors: Sumaira Taj

Abstract:

Blended learning has emerged as a prerequisite approach to teaching in all schools since the outbreak of the COVID-19 pandemic. However, how ready public elementary and secondary schools in Pakistan are to adopt this approach, and what should be done to prepare schools and students for blended learning, are the questions that this paper attempts to answer. A mixed-method research methodology was used to collect data from 40 teachers, 500 students, and 10 mothers. Descriptive statistics were used to analyze the quantitative data. As far as readiness is concerned, schools lack the resources for blended/virtual/online classes, from infrastructure to skills; parents' literacy levels hindered students' learning process; and gaps in teachers' skills presented challenges to a smooth and swift shift of schools from face-to-face to blended learning. It is recommended to establish a conducive environment in schools by providing all required resources and skills. Special training should be organized for parents with low literacy levels. Multiple approaches should be adopted to benefit all students.

Keywords: blended learning, challenges in online classes, education in COVID-19, public schools in Pakistan

Procedia PDF Downloads 153
20002 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique

Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki

Abstract:

Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From these data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (0.5-2.5 minutes SPECT image minus 5-10 minutes SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5-10 minutes SPECT image minus liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study. The visualization of the inferior myocardium was improved. In past reports, overlap of the high liver accumulation rendered the adjacent myocardium un-diagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
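The time-subtraction step can be sketched on a toy 1-D "image": an early, liver-dominated frame is scaled and subtracted from a late frame containing both liver and myocardium, leaving a myocardium-only estimate. The pixel values and scale factor below are invented for illustration; real SPECT data are 3-D, noisy, and require careful scaling of the liver component:

```python
# Time-subtraction sketch on toy 1-D pixel rows.
liver_only_early = [0.0, 8.0, 8.0, 0.0, 0.0]   # early frame: liver signal
late_frame       = [0.0, 4.0, 4.0, 5.0, 5.0]   # late: liver (reduced) + heart

def subtract_liver(late, early, scale):
    """Remove the scaled liver component, clipping negatives to zero."""
    return [max(0.0, l - scale * e) for l, e in zip(late, early)]

myocardium = subtract_liver(late_frame, liver_only_early, scale=0.5)
print(myocardium)  # liver pixels suppressed, heart pixels preserved
```

The clipping step matters because noise and imperfect scaling can otherwise push subtracted voxels below zero.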

Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector

Procedia PDF Downloads 316
20001 Coconut Shells as the Alternative Equipment for Foot Reflexology

Authors: Nichanant Sermsri, Chananchida Yuktirat

Abstract:

This research was experimental. Its purpose was to find out how coconut shells can be adapted into equipment for foot and calf reflexology. The sample group was 58 female street vendors in Thewet Market, Dusit District, Bangkok, selected by inclusion criteria and voluntary participation. The data collecting tool in this research was the Visual Analogue Scale. The massaging tool made from coconut shells (designed and produced by the research team) was the key equipment for this research. The duration of the research was 1 month. The research team assessed the level of exhaustion and heart rate in the sample group before and after the massage, then analyzed the data by mean, standard deviation and paired sample t-test. We found from the research that 1) the level of exhaustion decreased by 4.529 levels after the massage (standard deviation 1.6195), and heart rates went down by 11.67 beats/minute (standard deviation 6.742); 2) the decreases in the level of exhaustion and heart rate after the massage were statistically significant at the 0.01 level.
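The paired-sample t-test used above compares each subject's before and after scores through their differences. As a minimal sketch with six invented exhaustion scores (not the study's data):

```python
import math
import statistics

# Paired-sample t statistic for before/after scores. The six scores per
# condition below are invented for illustration.
before = [8.0, 7.0, 9.0, 6.0, 8.0, 7.0]
after  = [4.0, 3.0, 4.0, 2.0, 3.0, 3.0]

def paired_t(a, b):
    """t = mean(d) / (sd(d) / sqrt(n)) on the per-subject differences d."""
    d = [x - y for x, y in zip(a, b)]
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(len(d)))

t = paired_t(before, after)
print(f"t = {t:.2f} with {len(before) - 1} degrees of freedom")
```

The resulting t value is then compared to the t distribution with n-1 degrees of freedom to check significance at the chosen level (0.01 in the study).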

Keywords: foot reflexology, massaging plate, coconut shells, ecological sciences

Procedia PDF Downloads 177
20000 Securing Healthcare IoT Devices and Enabling SIEM Integration: Addressing

Authors: Mubarak Saadu Nabunkari, Abdullahi Abdu Ibrahim, Muhammad Ilyas

Abstract:

This study looks at how Internet of Things (IoT) devices are used in healthcare to monitor and treat patients better. However, using these devices in healthcare comes with security problems. The research explores using Security Information and Event Management (SIEM) systems with healthcare IoT devices to solve these security challenges. Reviewing existing literature shows the current state of IoT security and emphasizes the need for better protection. The main worry is that healthcare IoT devices can be easily hacked, putting patient data and device functionality at risk. To address this, the research suggests a detailed security framework designed for these devices. This framework, based on literature and best practices, includes important security measures like authentication, data encryption, access controls, and anomaly detection. Adding SIEM systems to this framework helps detect threats in real time and respond quickly to incidents, making healthcare IoT devices more secure. The study highlights the importance of this integration and offers guidance for implementing healthcare IoT securely, efficiently, and effectively.

Keywords: cyber security, threat intelligence, forensics, health care

Procedia PDF Downloads 43
19999 Enhancing Nursing Teams' Learning: The Role of Team Accountability and Team Resources

Authors: Sarit Rashkovits, Anat Drach- Zahavy

Abstract:

The research considers the unresolved question regarding the link between nursing team accountability and team learning, and the resulting team performance in nursing teams. Empirical findings reveal disappointing evidence regarding improvement in healthcare safety and quality. Therefore, there is a need to advance managerial knowledge regarding the factors that enhance healthcare teams' constant proactive improvement efforts, i.e., team learning. We first aim to identify the organizational resources that are needed for team learning in nursing teams; second, to test the moderating role of nursing teams' learning resources in the team accountability-team learning link; and third, to test the moderated mediation model suggesting that nursing teams' accountability affects team performance by enhancing team learning when relevant resources are available to the team. We point to the intervening role of three team learning resources, namely time availability, team autonomy and performance data, in the relation between team accountability and team learning, and test the proposed moderated mediation model on 44 nursing teams (462 nurses and 44 nursing managers). The results showed that, as expected, there was a positive significant link between team accountability and team learning, and subsequently team performance, when time availability and team autonomy were high rather than low. Nevertheless, the positive team accountability-team learning link was significant when team performance feedback was low rather than high. Accordingly, there was a positive mediated effect of team accountability on team performance via team learning when either time availability or team autonomy was high and the availability of team performance data was low. Nevertheless, this mediated effect was negative when time availability and team autonomy were low and the availability of team performance data was high.
We conclude that nurturing team accountability is not enough to achieve nursing teams' learning and the subsequent improved team performance. Rather, nursing teams need adequate time and autonomy, and managers should be cautious with performance feedback, as the latter may motivate nursing teams to repeat routine work strategies rather than explore improved ones.
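The moderated mediation logic above can be sketched with simulated team-level data: team learning mediates the accountability-performance link, and the first-stage slope shifts with the moderator (autonomy, here). All numbers and effect sizes below are hypothetical illustrations, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 44  # number of teams, as in the study

# Hypothetical simulated team-level variables (illustrative only)
accountability = rng.normal(0, 1, n)
autonomy = rng.normal(0, 1, n)  # moderator of the first path
# Team learning depends on accountability more strongly when autonomy is high
learning = 0.4 * accountability + 0.5 * accountability * autonomy + rng.normal(0, 0.3, n)
performance = 0.6 * learning + rng.normal(0, 0.3, n)

def ols(columns, y):
    """Ordinary least squares with an intercept; returns coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# First stage: learning ~ accountability * autonomy (interaction term)
b = ols([accountability, autonomy, accountability * autonomy], learning)
a_low = b[1] + b[3] * (-1.0)   # simple slope at autonomy = -1 SD
a_high = b[1] + b[3] * (+1.0)  # simple slope at autonomy = +1 SD

# Second stage: performance ~ learning, controlling for accountability
c = ols([learning, accountability], performance)

# Conditional indirect effects of accountability on performance via learning
print(round(a_low * c[1], 2), round(a_high * c[1], 2))
```

With the simulated positive moderation, the indirect effect is larger when autonomy is high, mirroring the pattern the abstract reports for time availability and autonomy.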

Keywords: nursing teams' accountability, nursing teams' learning, performance feedback, teams' autonomy

Procedia PDF Downloads 249
19998 Modeling Floodplain Vegetation Response to Groundwater Variability Using ArcSWAT Hydrological Model, Moderate Resolution Imaging Spectroradiometer - Normalised Difference Vegetation Index Data, and Machine Learning

Authors: Newton Muhury, Armando A. Apan, Tek Maraseni

Abstract:

This study modelled the relationships between vegetation response and water available below the soil surface, using Normalised Difference Vegetation Index (NDVI) data generated by Terra's Moderate Resolution Imaging Spectroradiometer (MODIS) together with soil water content (SWC) data. The Soil & Water Assessment Tool (SWAT) interface known as ArcSWAT was used in ArcGIS for the groundwater analysis. The SWAT model was calibrated and validated in SWAT-CUP software using 10 years (2001-2010) of monthly streamflow data. The average Nash-Sutcliffe Efficiency during calibration and validation was 0.54 and 0.51, respectively, indicating good model performance. Twenty years (2001-2020) of monthly MODIS NDVI data for three vegetation types (forest, shrub, and grass) and soil water content for 43 sub-basins were analysed using the WEKA machine learning tool with two supervised machine learning algorithms, support vector machine (SVM) and random forest (RF). The modelling results show that the responses of different vegetation types to soil water content vary between the dry and wet seasons. For example, the model produced strong positive relationships (r = 0.76, 0.73, and 0.81) between the measured and predicted NDVI values of all vegetation in the study area against groundwater flow (GW), soil water content (SWC), and the combination of these two variables, respectively, during the dry season. However, these relationships weakened by 36.8% (r = 0.48) and 13.6% (r = 0.63) against GW and SWC, respectively, in the wet season. On the other hand, the model predicted a moderate positive relationship (r = 0.63) between shrub vegetation and soil water content during the dry season, which weakened by 31.7% (r = 0.43) during the wet season. Our models also predicted that vegetation in the upper part of the sub-basin is highly responsive to GW and SWC (r = 0.78 and 0.70) during the dry season.
The results of this study indicate that the study region is suitable for seasonal crop production in the dry season. Moreover, the results predicted that the growth of vegetation in the upper part of the sub-basin is highly dependent on groundwater flow in both dry and wet seasons, and any instability or long-term drought can negatively affect these floodplain vegetation communities. This study has enriched our knowledge of vegetation responses to groundwater in each season, which will facilitate better floodplain vegetation management.
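The machine learning step can be sketched as follows: a random forest regressor is trained on groundwater and soil-water predictors, and the Pearson r between measured and predicted NDVI is computed, as in the abstract. The data here are synthetic stand-ins, not the MODIS or SWAT outputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500  # hypothetical monthly sub-basin observations

# Illustrative synthetic predictors: groundwater flow (GW), soil water content (SWC)
gw = rng.uniform(0, 1, n)
swc = rng.uniform(0, 1, n)
# Synthetic NDVI responding to both predictors, plus observation noise
ndvi = 0.2 + 0.3 * gw + 0.4 * swc + rng.normal(0, 0.05, n)

X = np.column_stack([gw, swc])
X_tr, X_te, y_tr, y_te = train_test_split(X, ndvi, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)

# Pearson r between measured and predicted NDVI, the statistic the study reports
r = np.corrcoef(y_te, pred)[0, 1]
print(round(r, 2))
```

The same pipeline could be run with `sklearn.svm.SVR` in place of the random forest to mirror the study's SVM comparison.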

Keywords: ArcSWAT, machine learning, floodplain vegetation, MODIS NDVI, groundwater

Procedia PDF Downloads 101
19997 Shock and Particle Velocity Determination from Microwave Interrogation

Authors: Benoit Rougier, Alexandre Lefrancois, Herve Aubert

Abstract:

Microwave interrogation in the range 10-100 GHz is identified as an advanced technique for measuring shock and particle velocities simultaneously. However, it requires an understanding of electromagnetic wave propagation in multi-layered moving media. Existing models either limit their approach to waveguides or evaluate the velocities with a fitting method, restricting the domain of validity and the precision of the results. Moreover, few permittivity data for high explosives at these frequencies under dynamic compression have been reported. In this paper, shock and particle velocities are computed concurrently for steady and unsteady shocks in various inert and reactive materials, via a propagation model based on Doppler shifts and signal amplitude. The refractive index of the material under compression is also calculated. From experimental data processing, it is demonstrated that the Hugoniot curve can be evaluated. Comparison with published results proves the accuracy of the proposed method. This microwave interrogation technique seems promising for studies of shock and detonation waves.
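The basic Doppler relation underlying such measurements can be sketched as follows; the carrier frequency, Doppler shift, and refractive index below are illustrative values, not numbers from the paper.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def velocity_from_doppler(f_doppler_hz, f_carrier_hz, refractive_index=1.0):
    """Line-of-sight velocity of a reflecting front from the measured Doppler
    shift, assuming a medium of constant refractive index (the standard
    microwave-interferometry relation v = f_d * c / (2 * f0 * n))."""
    return f_doppler_hz * C / (2.0 * f_carrier_hz * refractive_index)

# Illustrative numbers: a 94 GHz interrogation signal and a 5 MHz Doppler
# shift from a front moving through a medium with refractive index 1.6
v = velocity_from_doppler(5e6, 94e9, refractive_index=1.6)
print(round(v), "m/s")
```

In a real experiment the refractive index of the compressed material is itself an unknown, which is why the paper solves for velocities and refractive index jointly from the full signal.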

Keywords: electromagnetic propagation, experimental setup, Hugoniot measurement, shock propagation

Procedia PDF Downloads 202
19996 A Method to Evaluate and Compare Web Information Extractors

Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman

Abstract:

Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined are the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components typically configured by means of rules tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents; b) it provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather surveys the many features to take into account; c) it provides a novel method to compute the performance measures for unsupervised proposals, which would otherwise require the intervention of a user to compute them using the annotations on the evaluation sets and the information extracted. Our contributions will help researchers in this area make sure that they have advanced the state of the art not only conceptually but also empirically; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas so that we can spread them, improve the evaluation of information extraction proposals, and gather valuable feedback from other researchers.
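The performance measures that feed such statistical tests are typically precision, recall, and F1 against a gold-standard annotation; a minimal sketch, with a hypothetical extraction run, follows.

```python
def extraction_scores(extracted, gold):
    """Precision, recall and F1 for a set of extracted records against a
    gold-standard annotation; records are compared as exact matches."""
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)  # true positives: correctly extracted records
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical run of one extractor on one test document:
# it finds two of the three gold records plus one spurious price
gold = {("title", "Cheap laptop"), ("price", "$299"), ("seller", "ACME")}
extracted = {("title", "Cheap laptop"), ("price", "$299"), ("price", "$19")}
p, r, f = extraction_scores(extracted, gold)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.67 0.67 0.67
```

Scores computed per document and per extractor in this way can then be compared across proposals with statistically sound tests, as the method prescribes.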

Keywords: web information extractors, information extraction evaluation method, Google Scholar, web

Procedia PDF Downloads 237
19995 The Development of Supported Employment in Malaysia

Authors: Chu Shi Wei

Abstract:

Supported employment in Malaysia is in the early stages of development. It is an important step towards the inclusion of individuals with disabilities who have previously lacked the necessary support for employment in the open labour market, having been confined to sheltered workshops. There is a paradigm shift from sheltered to supported employment: the sheltered workshop is based on the medical model of disability, which focuses on the individual's disability and on segregated training institutions, whereas supported employment reflects the social model of disability, which emphasizes the abilities of the individual and the removal of barriers in the environment through the provision of support. This study explores the development of supported employment using a mixed methods approach, collecting quantitative data through a survey and qualitative data through interviews. Job coaches from six employment sectors participated in the survey and interviews. The findings indicate that the role of job coaches is integral to the development of supported employment; it includes job matching, on-the-job training, and developing natural supports to foster greater diversity and inclusion in the workplace.

Keywords: supported employment, disabilities, diversity, development

Procedia PDF Downloads 50
19994 Gender Bias in Natural Language Processing: Machines Reflect Misogyny in Society

Authors: Irene Yi

Abstract:

Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Machines have been trained on millions of human books, only to find that over the course of human history, derogatory and sexist adjectives are used significantly more frequently when describing females than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines take on more responsibilities, it is crucial to ensure that they do not carry forward historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. The results are significant in showing the intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and to reflect gender equality. Further, this paper deals with non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. It also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. The progression of society goes hand in hand not only with its language but with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
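One common way to quantify such bias in trained models is to project word vectors onto a gender direction; a toy sketch with hand-made 4-dimensional vectors follows (a real audit would load embeddings trained on a large corpus, e.g. word2vec or GloVe, and the example words are hypothetical).

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d "embeddings" for illustration only; in a trained model these
# coordinates would be learned from corpus co-occurrence statistics
emb = {
    "he":         np.array([ 1.0, 0.1, 0.2, 0.0]),
    "she":        np.array([-1.0, 0.1, 0.2, 0.0]),
    "brilliant":  np.array([ 0.6, 0.8, 0.1, 0.0]),
    "hysterical": np.array([-0.7, 0.6, 0.2, 0.0]),
}

gender_direction = emb["he"] - emb["she"]  # axis from "female" to "male" pole

# Positive scores lean toward the "male" pole, negative toward the "female" pole
scores = {w: cosine(emb[w], gender_direction) for w in ("brilliant", "hysterical")}
for word, score in scores.items():
    print(word, round(score, 2))
```

Systematically skewed scores for adjective classes across the two poles are the kind of evidence of encoded bias the paper discusses.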

Keywords: gendered grammar, misogynistic language, natural language processing, neural networks

Procedia PDF Downloads 100
19993 The Role of Smartphones on Iranian Couples' Relationship: An Analysis

Authors: Niloofar Hooman

Abstract:

The present study aims to investigate the positive and negative effects of smartphone use on couples' committed relationships. Although many couples may benefit from the positive aspects of smartphones, it is not clear how their feelings of trust, intimacy, and connection in their relationships are affected. This is important because it highlights the ambivalent influence of smartphones on couples' relationships: on the one hand, smartphones can enhance their social and emotional interactions; on the other hand, they can cause mistrust and isolation between them. Trust, intimacy, and honesty are important factors from which a stable relationship can be constructed. Nevertheless, some characteristics of smartphones, such as being fluid and personalized, can harm the relationship and consequently destroy it. Thus, it is necessary to investigate how Iranian couples in committed relationships use smartphones to manage their relationship, and how couples feel smartphones have enhanced or detracted from a sense of trust, intimacy, and connection with their partner. In the first phase of the study, in-depth interviews will be conducted with 30 couples, and the data will be analyzed using NVivo software. In the next phase, 1,500 participants aged 20 and above will be selected based on cluster sampling. Data will be analyzed both qualitatively and quantitatively.

Keywords: couple, family, internet, intimacy, Smartphone, trust

Procedia PDF Downloads 373
19992 Dynamic Evaluation of Shallow Lake Habitat Quality Based on InVEST Model: A Case in Baiyangdian Lake

Authors: Shengjun Yan, Xuan Wang

Abstract:

Water level changes in a shallow lake introduce dramatic changes in land patterns. To achieve sustainable ecosystem services, it is necessary to evaluate habitat quality dynamics and their spatio-temporal variation resulting from water level changes, which can provide a scientific basis for the protection of biodiversity and the planning of wetland ecological systems. Landsat data from spring were chosen to obtain landscape data at different times corresponding to the high, moderate, and low water levels of Baiyangdian Shallow Lake. We used the InVEST model to evaluate habitat quality, habitat degradation, and habitat scarcity. The results showed that: 1) as the water level of the shallow lake falls from high to low, the landscape pattern changes markedly and habitat degrades; 2) the greatest changes occurred in the northwest and southwest of Baiyangdian Shallow Lake, where 21 percent of suitable habitat and 42 percent of moderately suitable habitat were lost. Our findings show that water level changes in the shallow lake are strongly related to habitat quality.
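InVEST's habitat quality module discounts habitat suitability by a half-saturation function of a degradation score; a minimal sketch of that relation follows (the suitability and degradation values and the half-saturation constant k are illustrative, and z = 2.5 is the model's customary scaling default).

```python
def habitat_quality(suitability, degradation, k=0.5, z=2.5):
    """InVEST-style habitat quality for one cell: suitability H scaled down
    by the degradation score D through a half-saturation function, where k
    is the half-saturation constant and z a scaling exponent."""
    d_z = degradation ** z
    return suitability * (1.0 - d_z / (d_z + k ** z))

# Illustrative cells with equal suitability and rising degradation,
# e.g. as exposed lakebed replaces wetland when the water level falls
for d in (0.0, 0.25, 0.5):
    print(round(habitat_quality(1.0, d), 2))  # prints 1.0, 0.85, 0.5
```

Mapping this score per cell for the high-, moderate-, and low-water landscapes is what allows the spatio-temporal comparison the abstract describes.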

Keywords: habitat quality, habitat degradation, water level changes, shallow lake

Procedia PDF Downloads 234
19991 Evaluation of Cultural Landscape Perception in Waterfront Historic Districts Based on Multi-source Data - Taking Venice and Suzhou as Examples

Authors: Shuyu Zhang

Abstract:

The waterfront historic district, a type of historic district located at the edge of waters such as seas, lakes, and rivers, has a relatively special urban form. In past efforts to preserve and renew traditional historic districts, discussion has concentrated on the landward area, while waterfront and marginal spaces are easily overlooked. However, the waterfront space of a historic district, as a cultural landscape heritage combining historic buildings and landscape elements, has strong ecological and sustainability value. At the same time, Suzhou and Venice, as sister water cities in history, offer many waterfront spaces that can be compared in urban form and at other levels. Therefore, this paper focuses on the waterfront historic districts of Venice and Suzhou, establishes quantitative evaluation indicators for environmental perception, compares the two cities, and aims to promote the renewal and activation of entire historic districts by improving the spatial quality and vitality of the waterfront area. The paper draws on multi-source data: the Baidu Maps and Google Maps APIs are used to crawl street views of the waterfront historic districts; machine learning algorithms are used to analyse the proportion of cultural landscape elements, such as the green view ratio, in the street view pictures; and space syntax software is used for quantitative analysis of spatial selectivity. Together, these establish environmental perception evaluation indicators for waterfront historic districts. Finally, by comparing the waterfront historic districts of Venice and Suzhou, the paper reveals their similarities, differences, and characteristics, and hopes to provide a reference for the heritage preservation and renewal of other waterfront historic districts.
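The green view ratio step can be sketched as a pixel count over a semantic-segmentation label map; the label map below is randomly generated for illustration, standing in for the output of a trained segmentation model on a real street-view image.

```python
import numpy as np

# Hypothetical class ids for a segmentation label map of one street-view
# image: 0 = sky, 1 = building, 2 = vegetation, 3 = water
VEGETATION, WATER = 2, 3

rng = np.random.default_rng(1)
# Stand-in label map; a real pipeline would produce this with a trained
# semantic segmentation network applied to a crawled street-view photo
label_map = rng.choice([0, 1, 2, 3], size=(256, 256), p=[0.3, 0.35, 0.25, 0.1])

def class_ratio(labels, class_id):
    """Share of image pixels belonging to one landscape element class."""
    return float(np.mean(labels == class_id))

green_view_ratio = class_ratio(label_map, VEGETATION)
water_view_ratio = class_ratio(label_map, WATER)
print(round(green_view_ratio, 2), round(water_view_ratio, 2))
```

Averaging such per-image ratios along each waterfront street yields the element-proportion indicators that can then be compared between the two cities.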

Keywords: waterfront historical district, cultural landscape, perception, multi-source data

Procedia PDF Downloads 179
19990 Testing the Moderating Effect of Sub Ethnic on Household Investment Behaviour

Authors: Widayat Widayat

Abstract:

Household investment behaviour is a lively topic in the modern investment era. The development of modern investment, marked by the emergence of a variety of instruments such as stocks, bonds, and various forms of derivatives, has increased the complexity of choosing an investment, especially for traditional societies. Various studies show that more than one factor acts as an antecedent of the decision to choose an investment instrument. One factor that contributes to determining the investment option is ethnicity: members of a particular sub-culture tend to prefer particular instruments, because they hold different values and norms and live in different social environments. This article is designed to test the Osing-Javanese sub-ethnic distinction as a moderator of investment behaviour. The study was conducted in Banyuwangi, East Java Province, Indonesia. Data were collected using questionnaires administered to household heads selected as the sample; households were selected by a multistage sampling method. The collected data were processed using SmartPLS software, and moderating effects were tested using a grouped-sample test. The results showed that sub-ethnicity plays a significant role in determining investment.

Keywords: investment behaviour, household, moderating, sub ethnic

Procedia PDF Downloads 356
19989 Adsorption of Cd2+ from Aqueous Solutions Using Chitosan Obtained from a Mixture of Littorina littorea and Achatinoidea Shells

Authors: E. D. Paul, O. F. Paul, J. E. Toryila, A. J. Salifu, C. E. Gimba

Abstract:

Adsorption of Cd2+ ions from aqueous solution by chitosan, a natural polymer obtained from a mixture of the exoskeletons of Littorina littorea (periwinkle) and Achatinoidea (snail), was studied at varying adsorbent dose, contact time, metal ion concentration, temperature, and pH using the batch adsorption method. The equilibrium adsorption isotherms were determined between 298 K and 345 K. The adsorption data were fitted to the Langmuir, Freundlich, and pseudo-second-order kinetic models. The Langmuir isotherm model best fitted the experimental data, with a maximum monolayer adsorption of 35.1 mgkg⁻¹ at 308 K. The entropy and enthalpy of adsorption were -0.1121 kJmol⁻¹K⁻¹ and -11.43 kJmol⁻¹, respectively. The Freundlich adsorption model gave Kf and n values consistent with good adsorption. The pseudo-second-order reaction model gave a straight-line plot with a rate constant of 1.291 × 10⁻³ kgmg⁻¹min⁻¹. The qe value was 21.98 mgkg⁻¹, indicating that the adsorption of cadmium ions by the chitosan composite followed pseudo-second-order kinetics.
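Fitting the Langmuir isotherm to equilibrium data can be sketched as a nonlinear least-squares fit; the Ce/qe values below are hypothetical, generated to mimic a monolayer capacity near the reported 35.1 mgkg⁻¹, and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce), where qmax is
    the monolayer capacity and KL the Langmuir affinity constant."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Hypothetical equilibrium data (Ce in mg/L, qe in mg/kg) with small
# perturbations standing in for experimental scatter
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = langmuir(ce, 35.1, 0.12) + np.array([0.3, -0.2, 0.4, -0.3, 0.2, -0.1])

(qmax_fit, kl_fit), _ = curve_fit(langmuir, ce, qe, p0=[30.0, 0.1])
print(round(qmax_fit, 1))  # fitted monolayer capacity, mg/kg
```

The Freundlich and pseudo-second-order models can be fitted the same way, and goodness-of-fit statistics then decide which model best describes the data, as the abstract reports for Langmuir.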

Keywords: adsorption, chitosan, littorina littorea, achatinoidea, natural polymer

Procedia PDF Downloads 386
19988 Personalize E-Learning System Based on Clustering and Sequence Pattern Mining Approach

Authors: H. S. Saini, K. Vijayalakshmi, Rishi Sayal

Abstract:

Network-based education has been growing rapidly in size and quality. Knowledge clustering is becoming more important in personalized information retrieval for web learning. A personalized learning service can be offered once learners' knowledge has been classified through clustering. Through automatic analysis of learners' behaviours, partitions of learners with similar knowledge levels and interests may be discovered, so that learners can be provided with content that best matches their educational needs for collaborative learning. We present a specific mining tool and a recommender engine that we have integrated into online learning in order to help the teacher carry out the whole e-learning process. We propose to use sequential pattern mining algorithms to discover the paths most used by students, and from this information we can automatically recommend links to new students while they browse the course. We have developed a specific authoring tool to help the teacher apply the whole data mining process. We report on experiments with real data to demonstrate the value of using clustering and sequential pattern mining algorithms together for building personalized e-learning systems.

Keywords: e-learning, cluster, personalization, sequence, pattern

Procedia PDF Downloads 411
19987 Factors Constraining the Utilization of Risk Management Strategies in the Execution of Public Construction Projects in North East Nigeria

Authors: S. U. Kunya, S. A. Mohammad

Abstract:

Construction projects in Nigeria are characterized by risks emanating from delays and accompanying cost overruns. The aim of the study was to identify and assess the factors constraining the utilization of risk management strategies in the execution of public construction projects in North-East Nigeria. Data were collected with the aid of a well-structured questionnaire administered to three identified projects in the North-East. The data collected were analysed using the severity index. Findings revealed political involvement, selection of inexperienced contractors, and the lack of a coordinated public sector strategy as the most severe factors constraining the utilization of risk management strategies. The study recommended: the formulation of laws to prevent negative political meddling in construction projects; the selection of experienced, risk-informed contractors; and comprehensive risk assessment and planning on all public construction projects.
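A severity index for such rankings is commonly computed as the ratio of summed ratings to the maximum possible score; the abstract does not give the exact formula, so the sketch below uses the usual form from construction-management questionnaire studies, with hypothetical ratings.

```python
def severity_index(responses, max_rating=4):
    """Severity index for one constraining factor:
    SI = (sum of ratings) / (max rating * number of respondents) * 100.
    Higher values indicate a more severe constraint."""
    return sum(responses) / (max_rating * len(responses)) * 100.0

# Hypothetical 4-point Likert ratings from ten respondents for one factor,
# e.g. "political involvement"
ratings = [4, 4, 3, 4, 3, 4, 4, 3, 4, 4]
print(round(severity_index(ratings), 1))
```

Computing the index per factor and sorting descending yields the severity ranking the study reports.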

Keywords: factors, Nigeria, north-east, public projects, risk management, strategies, utilization

Procedia PDF Downloads 507
19986 Forecasting Regional Data Using Spatial VARs

Authors: Taisiia Gorshkova

Abstract:

Since the 1980s, spatial correlation models have been used increasingly often to model regional indicators. An increasingly popular approach is modelling that takes into account spatial relationships between objects belonging to the same economic zone. In the 2000s, a new class of models, spatial vector autoregressions, was developed. The main difference between standard and spatial vector autoregressions is that in a spatial VAR (SpVAR), the values of indicators at time t may depend on the values of explanatory variables at the same time t in neighboring regions and on the values of explanatory variables at time t-k in neighboring regions. Thus, a VAR is a special case of an SpVAR in the absence of spatial lags, and the spatial panel data model is a special case of a spatial VAR in the absence of time lags. Two specifications of SpVAR were applied to Russian regional data for 2000-2017. The values of GRP and regional CPI are used as endogenous variables. The lags of GRP, CPI, and the unemployment rate were used as explanatory variables. For comparison purposes, a standard VAR without spatial correlation was used as a 'naïve' model. In the first specification of the SpVAR, the unemployment rate and the values of the dependent variables, GRP and CPI, in neighboring regions at the same moment of time t were included in the equations for GRP and CPI, respectively. To account for the values of indicators in neighboring regions, an adjacency weight matrix is used, in which regions with a common sea or land border are assigned a value of 1, and the rest 0. In the second specification, the values of the dependent variables in neighboring regions at time t were replaced by their values at the previous time t-1.
According to the results obtained, when the inflation and GRP of neighbors are added to the model, both inflation and GRP are significantly affected by their own previous values; inflation is also positively affected by an increase in unemployment in the previous period and negatively affected by an increase in GRP in the previous period, which corresponds to economic theory. GRP is affected by neither the inflation lag nor the unemployment lag. When the model takes into account lagged values of GRP and inflation in neighboring regions, the results of the inflation modelling are practically unchanged: all indicators except the unemployment lag are significant at the 5% level. For GRP, in turn, GRP lags in neighboring regions also become significant at the 5% level. RMSEs were calculated for both the spatial and 'naïve' VARs; the minimum RMSE is obtained with the SpVAR with lagged explanatory variables. Thus, according to the results of the study, it can be concluded that SpVARs can accurately model both the actual values of macro indicators (particularly CPI and GRP) and the general situation in the regions.
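A univariate SpVAR(1) of the kind described, with an adjacency weight matrix and both an own lag and a spatial lag, can be sketched on simulated data (three toy regions, all coefficients and the border structure illustrative, not the Russian data):

```python
import numpy as np

rng = np.random.default_rng(0)
T, R = 120, 3  # time periods, regions

# Row-normalised adjacency weight matrix: regions 0-1 and 1-2 share a border
W = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

# Simulate a univariate SpVAR(1): y_t = a*y_{t-1} + b*W*y_{t-1} + noise
a_true, b_true = 0.5, 0.3
y = np.zeros((T, R))
for t in range(1, T):
    y[t] = a_true * y[t - 1] + b_true * (W @ y[t - 1]) + rng.normal(0, 0.1, R)

# Stack all regions and estimate (a, b) jointly by pooled least squares
own_lag = y[:-1].ravel()             # y_{t-1} for each region
spatial_lag = (y[:-1] @ W.T).ravel() # W*y_{t-1}: neighbors' lagged values
X = np.column_stack([own_lag, spatial_lag])
coef, *_ = np.linalg.lstsq(X, y[1:].ravel(), rcond=None)
print(coef.round(1))  # estimates of (a, b)
```

Setting b = 0 recovers the 'naïve' VAR benchmark; the study's second specification corresponds to exactly this lagged-spatial-term form, which also produced the lowest RMSE.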

Keywords: forecasting, regional data, spatial econometrics, vector autoregression

Procedia PDF Downloads 129