Search results for: multiple data
26611 The Impact of Professional Development in the Area of Technology Enhanced Learning on Higher Education Teaching Practices Across Atlantic Technological University – Research Methodology and Preliminary Findings
Authors: Annette Cosgrove
Abstract:
The objective of this research study is to examine the impact of professional development in Technology Enhanced Learning (TEL) and the digitisation of learning in teaching communities across multiple higher education sites in the Atlantic Technological University (ATU) (2020-2025), including the proposal of an evidence-based digital teaching model for use in a future pandemic. The research strategy undertaken for this PhD study is a multi-site study using mixed methods: qualitative and quantitative methods are being used to collect data. A pilot study was carried out initially, feedback was collected, and the research instrument was edited to reflect this feedback before being administered. The purpose of the staff questionnaire is to evaluate the impact of professional development in the area of TEL and to capture practitioners' views on its perceived impact on their teaching practice in the higher education sector across ATU (West of Ireland, five higher education locations). The phenomenon being explored is 'the impact of professional development in the area of technology enhanced learning on teaching practice in a higher education institution.' The research methodology chosen for this study is action-based research; the researcher has chosen this approach as it is a prime strategy for developing educational theory and enhancing educational practice. The study uses quantitative and qualitative methods to elicit data that will quantify the impact that continuous professional development in digital teaching practice and technologies has on practitioners' teaching practice in higher education. The research instruments and data collection tools for this study include a lecturer survey with a targeted TEL practice group (pre- and post-COVID experience) and semi-structured interviews with lecturers. The research is currently being conducted across the ATU multi-site campus and targets higher education lecturers who have completed formal CPD in the area of digital teaching. ATU, a university in the west of Ireland, is the focus of the study. The research questionnaire has been deployed, with 75 respondents to date across the ATU; the primary questionnaire and semi-structured interviews are ongoing, the purpose being to evaluate the impact of formal professional development in TEL and its perceived impact on practitioners' teaching practice in digital teaching and learning. This paper will present initial findings, reflections and data from this ongoing research study. Keywords: TEL, DTL, digital teaching, digital assessment
Procedia PDF Downloads 70
26610 Education in Personality Development and Grooming for Airline Business Program's Students of International College, Suan Sunandha Rajabhat University
Authors: Taksina Bunbut
Abstract:
Personality and grooming are vital for creating a professional and safe image for all staff in the airline industry. The Airline Business Program also aims to educate students through the subject Personality Development and Grooming in order to raise the quality of students to meet the standard requirements of the airline industry. However, students agree that there are many difficulties that lead to unsuccessful learning experiences in this subject. This research studies the problems that prevent students from getting good results in the classroom. Furthermore, exploring possible solutions to overcome these challenges is also included in this study. The research sample consists of 140 students who attended the class of Personality Development and Grooming. The research instrument employed is a questionnaire, and the statistics used for data analysis are the t-test and multiple regression analysis. The results found that although students are satisfied with the teaching and learning of this subject, they considered teaching in English and teaching topics on social etiquette in different cultures difficult to understand. Keywords: personality development, grooming, Airline Business Program, soft skill
Procedia PDF Downloads 238
26609 Inviscid Steady Flow Simulation Around a Wing Configuration Using MB_CNS
Authors: Muhammad Umar Kiani, Muhammad Shahbaz, Hassan Akbar
Abstract:
Simulation of a high speed inviscid steady ideal air flow around a 2D/axial-symmetry body was carried out by the use of mb_cns code. mb_cns is a program for the time-integration of the Navier-Stokes equations for two-dimensional compressible flows on a multiple-block structured mesh. The flow geometry may be either planar or axisymmetric and multiply-connected domains can be modeled by patching together several blocks. The main simulation code is accompanied by a set of pre and post-processing programs. The pre-processing programs scriptit and mb_prep start with a short script describing the geometry, initial flow state and boundary conditions and produce a discretized version of the initial flow state. The main flow simulation program (or solver as it is sometimes called) is mb_cns. It takes the files prepared by scriptit and mb_prep, integrates the discrete form of the gas flow equations in time and writes the evolved flow data to a set of output files. This output data may consist of the flow state (over the whole domain) at a number of instants in time. After integration in time, the post-processing programs mb_post and mb_cont can be used to reformat the flow state data and produce GIF or postscript plots of flow quantities such as pressure, temperature and Mach number. The current problem is an example of supersonic inviscid flow. The flow domain for the current problem (strake configuration wing) is discretized by a structured grid and a finite-volume approach is used to discretize the conservation equations. The flow field is recorded as cell-average values at cell centers and explicit time stepping is used to update conserved quantities. MUSCL-type interpolation and one of three flux calculation methods (Riemann solver, AUSMDV flux splitting and the Equilibrium Flux Method, EFM) are used to calculate inviscid fluxes across cell faces.Keywords: steady flow simulation, processing programs, simulation code, inviscid flux
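As a rough illustration of the cell-centered finite-volume update with explicit time stepping described above, the following Python sketch advances 1D cell-average values using a simple Rusanov-type flux. It is a schematic stand-in under stated assumptions (linear advection instead of the compressible gas equations, assumed CFL number and function names), not the mb_cns implementation itself.

```python
import numpy as np

def rusanov_flux(uL, uR, flux, wave_speed):
    """Simple Rusanov (local Lax-Friedrichs) flux across one cell face."""
    smax = max(wave_speed(uL), wave_speed(uR))
    return 0.5 * (flux(uL) + flux(uR)) - 0.5 * smax * (uR - uL)

def advance(u, dx, dt, flux, wave_speed):
    """One explicit time step of a 1D cell-centered finite-volume scheme."""
    u_ext = np.concatenate(([u[0]], u, [u[-1]]))       # simple outflow ghost cells
    f = np.array([rusanov_flux(u_ext[i], u_ext[i + 1], flux, wave_speed)
                  for i in range(len(u_ext) - 1)])     # fluxes at the cell faces
    return u - dt / dx * (f[1:] - f[:-1])              # conservative update of cell averages

# Toy example: linear advection at unit speed (a stand-in for the gas-dynamics fluxes)
flux = lambda u: u
wave_speed = lambda u: 1.0
x = np.linspace(0.0, 1.0, 101)
u = np.exp(-200 * (x - 0.3) ** 2)                      # initial flow state
dx, dt = x[1] - x[0], 0.4 * (x[1] - x[0])              # CFL-limited time step (assumed CFL = 0.4)
for _ in range(100):
    u = advance(u, dx, dt, flux, wave_speed)
```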
Procedia PDF Downloads 429
26608 The Relationship between First-Day Body Temperature and Mortality in Traumatic Patients
Authors: Neda Valizadeh, Mani Mofidi, Sama Haghighi, Ali Hashemaghaee, Soudabeh Shafiee Ardestani
Abstract:
Background: There are many systems and parameters to evaluate trauma patients in the emergency department. Most of these evaluations aim to distinguish patients in worse condition so that care systems can better predict outcomes and provide better care. The purpose of this study is to determine the relationship between axillary body temperature and mortality in patients with multiple traumas hospitalized in the intensive care unit (ICU), and its relationship with other clinical and para-clinical factors. Methods: All patients between 16 and 75 years old with multiple traumas who were admitted to the Emergency Department and then hospitalized in the ICU were included in our study. The axillary temperature on the first and second days of admission, Glasgow Coma Scale (GCS), systolic blood pressure, serum glucose level, and white blood cell count of all patients on the admission day were recorded, and their relationship with mortality was analyzed in SPSS with suitable statistical tests. Results: Axillary body temperatures on the first and second days were statistically lower in traumatic patients who died (p=0.001 and p<0.001, respectively). Patients with lower GCS had a significantly lower first-day temperature and a significantly higher mortality (p=0.006 and p=0.006, respectively). Furthermore, the first-day axillary temperature was significantly lower in patients with a lower first-day systolic blood pressure (p=0.014). Conclusion: Our results showed that a lower axillary body temperature on the first day is associated with higher mortality, lower GCS, and lower systolic blood pressure. Thus, it could be used as a predictor of mortality in the evaluation of traumatic patients in emergency settings. Keywords: fever, trauma, mortality, emergency
Procedia PDF Downloads 376
26607 Analyzing On-Line Process Data for Industrial Production Quality Control
Authors: Hyun-Woo Cho
Abstract:
Monitoring of industrial production quality has to be implemented to give early warning of unusual operating conditions. Furthermore, identification of their assignable causes is necessary for quality control purposes. For such tasks, many multivariate statistical techniques have been applied and shown to be quite effective tools. This work presents a process data-based monitoring scheme for production processes. For more reliable results, additional steps of noise filtering and preprocessing are considered; these may lead to enhanced performance by eliminating unwanted variation in the data. The performance evaluation is executed using data sets from test processes. The proposed method is shown to provide reliable quality control results and is thus more effective for quality monitoring in the example. For practical implementation of the method, an on-line data system must be available to gather historical and on-line data. Recently, large amounts of data have been collected on-line in most processes, so implementation of the current scheme is feasible and does not place additional burdens on users. Keywords: detection, filtering, monitoring, process data
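The abstract does not name the specific multivariate statistic used, so the sketch below assumes a common choice: a PCA model of normal operation monitored with Hotelling's T², preceded by scaling and a simple moving-average noise filter. All variable names, data and the control limit are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical historical (in-control) and on-line process data: rows = samples, columns = variables
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 8))
X_new = rng.normal(size=(50, 8)) + np.array([0, 0, 0, 0, 0, 0, 0, 2.5])   # one drifted variable

# Preprocessing: scaling plus a simple moving-average noise filter
scaler = StandardScaler().fit(X_train)
def moving_average(X, w=3):
    kernel = np.ones(w) / w
    return np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, X)

Xt = moving_average(scaler.transform(X_train))
Xn = moving_average(scaler.transform(X_new))

# PCA model of normal operation and Hotelling's T^2 statistic
pca = PCA(n_components=3).fit(Xt)
def hotelling_t2(X):
    scores = pca.transform(X)
    return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

limit = np.percentile(hotelling_t2(Xt), 99)     # empirical 99% control limit from historical data
alarms = hotelling_t2(Xn) > limit               # flag unusual operating conditions
print(f"{alarms.sum()} of {len(alarms)} new samples exceed the control limit")
```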
Procedia PDF Downloads 559
26606 A Review of Travel Data Collection Methods
Authors: Muhammad Awais Shafique, Eiji Hato
Abstract:
Household trip data is of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. It also provides the basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed with the passage of time, starting with conventional face-to-face or paper-and-pencil interviews and reaching the recent approach of employing smartphones. This study summarizes the step-wise evolution of travel data collection methods. It provides a comprehensive review of the topic for readers interested in the changing trends in the data collection field. Keywords: computer, smartphone, telephone, travel survey
Procedia PDF Downloads 313
26605 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks
Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel
Abstract:
The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without reducing fairness of allocation is becoming a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem rigorously. As a result, we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic process, RR does not take radio conditions into account and consequently affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), proposes a solution to this issue. MORRA not only exploits the concept of an opportunistic scheduler but also takes other parameters into account in the allocation process. The first parameter is called the courtesy coefficient (CC), and the second is called buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required. Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy
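The exact way MORRA combines channel state, the courtesy coefficient (CC) and buffer occupancy (BO) is not given in the abstract, so the following Python sketch shows only an illustrative per-subcarrier allocation metric with assumed update rules, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_subcarriers = 4, 16
snr = rng.rayleigh(scale=2.0, size=(n_users, n_subcarriers))  # per-user, per-subcarrier channel quality
buffer_occupancy = rng.uniform(0.1, 1.0, size=n_users)        # BO: fraction of each user's queue filled

def morra_like_allocation(snr, buffer_occupancy):
    """Illustrative opportunistic allocation mixing channel state, CC and BO
    (the real MORRA weighting is not specified in the abstract)."""
    cc = np.ones(snr.shape[0])                        # CC: starts equal for all users
    allocation = {}
    for k in range(snr.shape[1]):                     # assign subcarriers one by one
        metric = snr[:, k] * cc * buffer_occupancy    # higher metric -> more deserving user
        user = int(np.argmax(metric))
        allocation[k] = user
        cc[user] *= 0.9                               # the served user's courtesy decreases
        others = np.arange(len(cc)) != user
        cc[others] = np.minimum(cc[others] + 0.05, 1.5)  # unserved users regain courtesy (fairness)
    return allocation

alloc = morra_like_allocation(snr, buffer_occupancy)
print("Subcarriers per user:", np.bincount(list(alloc.values()), minlength=n_users))
```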
Procedia PDF Downloads 300
26604 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain
Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami
Abstract:
To promote Industry 4.0, Society 5.0 and similar initiatives, it is important to connect and share data so that every member can trust it. Blockchain (BC) technology is currently attracting attention as the most advanced tool and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain that handle sensitive data such as product quality and manufacturing conditions. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first reason is that manufacturing information is top secret and a source of profit for companies; it is difficult to disclose data even between companies with transactions in the supply chain. In blockchain mechanisms such as Bitcoin that use PKI (Public Key Infrastructure), the plaintext must be shared between the companies in order to confirm the identity of the company that sent the data. Another reason is that the merits (scenarios) of data collaboration between companies are not concretely specified in the industrial supply chain. To address these problems, this paper proposes a business-to-business (B2B) collaboration system using homomorphic encryption and BC techniques. Using the proposed system, each company in the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which has been difficult to share because it is top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can capture changes in quality while hiding the numerical values of the quality data. Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption
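As a minimal sketch of homomorphic computation on encrypted quality data (not the paper's actual comparison protocol), the example below uses the python-paillier (`phe`) library, an assumed choice, to let a partner operate on ciphertexts while only the key holder can decrypt the agreed result.

```python
# Minimal sketch of additively homomorphic computation on encrypted quality data,
# using the python-paillier library (an assumed choice; not the system from the paper).
from phe import paillier

# The supplier generates a key pair and encrypts its secret quality measurements.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)
quality_last_month = 93.4    # hypothetical quality values
quality_this_month = 91.1
enc_last = public_key.encrypt(quality_last_month)
enc_this = public_key.encrypt(quality_this_month)

# A partner (or an off-chain worker attached to a blockchain record) operates only on
# ciphertexts: here it forms the encrypted month-over-month change.
enc_change = enc_this - enc_last     # homomorphic subtraction of two ciphertexts
enc_scaled = enc_change * 10         # homomorphic multiplication by a plaintext constant

# Only the supplier, holding the private key, can decrypt the agreed result,
# so the raw quality values are never revealed to the partner.
print(private_key.decrypt(enc_change))   # approximately -2.3
print(private_key.decrypt(enc_scaled))   # approximately -23.0
```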
Procedia PDF Downloads 136
26603 Predictors and Prevention of Sports' Injuries among Male Professional Footballers in Nigeria
Authors: Timothy A. Oloyede
Abstract:
The study assessed the influence of playing field, climatic conditions, rate of exposure to matches, skill level and competition level on the occurrence and severity of football injuries. The prospective outline of the study was as follows: after a baseline examination and measurements were performed to ascertain possible predictors of injury, all players were followed up weekly for one year to register subsequent injuries and complaints. Four hundred and thirty-five out of 455 subjects completed the weekly follow-ups over one year. Multiple regression analysis was employed to analyse the data collected. Results showed that playing field, climatic conditions, rate of exposure to matches, skill level and competition level were predictors of injuries among the professional footballers. Playing on natural grass, acclimatization and reduction of physical overload, among others, were strategies postulated for preventing injuries. Keywords: sports' injuries, predictors of sports' injuries, intrinsic risk factors, extrinsic risk factors, injury mechanism, professional footballer
Procedia PDF Downloads 253
26602 Study on the Stability of Large Space Expandable Parabolic Cylindrical Antenna
Authors: Chuanzhi Chen, Wenjing Yu
Abstract:
The parabolic cylindrical deployable antenna has the characteristics of wide cutting width, strong directivity, high gain, and easy automatic beam scanning. However, due to its large size, high flexibility, and strong coupling, the deployment process of the parabolic cylindrical deployable antenna presents problems such as unsynchronized deployment speed, large local deformation, and discontinuous switching of the deployment state. A large deployable parabolic cylindrical antenna is taken as the research object, and the instability of the antenna's unfolding process, which is caused by multiple factors such as multiple closed loops, elastic deformation, motion friction, and gap collision, is studied in this paper. Firstly, the multi-flexible system dynamics model of the large-scale parabolic cylindrical antenna is established to study the influence of friction and elastic deformation on the stability of the large multi-closed-loop antenna. Secondly, the evaluation method for antenna deployment stability is studied, and a quantitative index for antenna configuration design is proposed to provide a theoretical basis for improving the overall performance of the antenna. Finally, through simulation analysis and experiment, the deployment dynamics and stability of large-scale parabolic cylindrical antennas are verified by in-depth analysis, and the principles for improving the stability of antenna deployment are summarized. Keywords: multibody dynamics, expandable parabolic cylindrical antenna, stability, flexible deformation
Procedia PDF Downloads 146
26601 Digimesh Wireless Sensor Network-Based Real-Time Monitoring of ECG Signal
Authors: Sahraoui Halima, Dahani Ameur, Tigrine Abedelkader
Abstract:
DigiMesh technology represents a pioneering advancement in wireless networking, offering cost-effective and energy-efficient capabilities. Its inherent simplicity and adaptability facilitate the seamless transfer of data between network nodes, extending the range and ensuring robust connectivity through autonomous self-healing mechanisms. In light of these advantages, this study introduces a medical platform harnessed with DigiMesh wireless network technology characterized by low power consumption, immunity to interference, and user-friendly operation. The primary application of this platform is the real-time, long-distance monitoring of Electrocardiogram (ECG) signals, with the added capacity for simultaneous monitoring of ECG signals from multiple patients. The experimental setup comprises key components such as Raspberry Pi, E-Health Sensor Shield, and Xbee DigiMesh modules. The platform is composed of multiple ECG acquisition devices labeled as Sensor Node 1 and Sensor Node 2, with a Raspberry Pi serving as the central hub (Sink Node). Two communication approaches are proposed: Single-hop and multi-hop. In the Single-hop approach, ECG signals are directly transmitted from a sensor node to the sink node through the XBee3 DigiMesh RF Module, establishing peer-to-peer connections. This approach was tested in the first experiment to assess the feasibility of deploying wireless sensor networks (WSN). In the multi-hop approach, two sensor nodes communicate with the server (Sink Node) in a star configuration. This setup was tested in the second experiment. The primary objective of this research is to evaluate the performance of both Single-hop and multi-hop approaches in diverse scenarios, including open areas and obstructed environments. Experimental results indicate the DigiMesh network's effectiveness in Single-hop mode, with reliable communication over distances of approximately 300 meters in open areas. In the multi-hop configuration, the network demonstrated robust performance across approximately three floors, even in the presence of obstacles, without the need for additional router devices. This study offers valuable insights into the capabilities of DigiMesh wireless technology for real-time ECG monitoring in healthcare applications, demonstrating its potential for use in diverse medical scenarios.Keywords: DigiMesh protocol, ECG signal, real-time monitoring, medical platform
Procedia PDF Downloads 79
26600 A Cross-Cultural Investigation of Self-Compassion in Adolescents Across Gender
Authors: H. N. Cheung
Abstract:
Self-compassion encourages one to accept oneself, reduce self-criticism and self-judgment, and see one’s shortcomings and setbacks in a balanced view. Adolescent self-compassion is a crucial protective factor against mental illness. It is, however, affected by gender. Given the scarcity of self-compassion scales for adolescents, the current study evaluates the Self-Compassion Scale for Youth (SCS-Y) in a large cross-cultural sample and investigates how the subscales of SCS-Y relate to the dimensions of depressive symptoms across gender. Through the internet-based Qualtrics, a total of 2881 teenagers aged 12 to 18 years were recruited from Hong Kong (HK), China, and the United Kingdom. A Multiple Indicator Multiple Cause (MIMIC) model was used to evaluate measurement invariance of the SCS-Y, and differential item functioning (DIF) was checked across gender. Upon the establishment of the best model, a multigroup structural equation model (SEM) was built between factors of SCS-Y and Multidimensional depression assessment scale (MDAS) which assesses four dimensions of depressive symptoms (emotional, cognitive, somatic and interpersonal). The SCS-Y was shown to have good reliability and validity. The MIMIC model produced a good model fit for a hypothetical six-factor model (CFI = 0.980; TLI = 0.974; RMSEA = 0.038) and no item was flagged for DIF across gender. A gender difference was observed between SCS-Y factors and depression dimensions. Conclusions: The SCS-Y exhibits good psychometric characteristics, including measurement invariance across gender. The study also highlights the gender difference between self-compassion factors and depression dimensions.Keywords: self compassion, gender, depression, structural equation modelling, MIMIC model
Procedia PDF Downloads 71
26599 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without data quality awareness negatively impacts financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ depending on the purpose of each data project. This is especially true for big data projects, which may involve many datasets and stakeholders and take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators to describe the overall data quality of each dataset for quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we define standard data quality expectations. Second, we identify indicators that can be measured directly against the data within the datasets. Third, the indicators are aggregated into dimensions using factor analysis. Next, the indicators and dimensions are weighted by the effort required in the data preparation process and by usability. Finally, the dimensions are aggregated into the composite indicator. The results of these analyses showed that: (1) the developed indicators and measurements comprised ten indicators; (2) for the data quality dimensions based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the developed composite indicator, the SDQI, can describe the overall quality of each dataset and can separate datasets into three levels: good quality, acceptable quality, and poor quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. The SDQI can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by using the SDQI for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in the next sprint. Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
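The ten indicators and their weights are not listed in the abstract, so the following Python sketch only illustrates the pipeline shape with assumed values: normalised indicators, factor analysis into dimensions, weighted aggregation into a composite score, and a three-level classification.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical indicator matrix: one row per dataset, one column per quality indicator
# (e.g. completeness, uniqueness, validity, ...). The real ten indicators are not listed.
rng = np.random.default_rng(42)
indicators = rng.uniform(0, 1, size=(500, 10))

# 1) Normalise indicators so they are comparable
X = MinMaxScaler().fit_transform(indicators)

# 2) Aggregate indicators into a small number of dimensions via factor analysis
fa = FactorAnalysis(n_components=4, random_state=0)
dimensions = fa.fit_transform(X)

# 3) Weight the dimensions (effort/usability weights assumed for illustration)
weights = np.array([0.4, 0.3, 0.2, 0.1])
sdqi = MinMaxScaler().fit_transform(dimensions) @ weights   # composite indicator in [0, 1]

# 4) Map the composite score to three quality levels (thresholds assumed)
levels = np.select([sdqi >= 0.66, sdqi >= 0.33], ["Good", "Acceptable"], default="Poor")
print(dict(zip(*np.unique(levels, return_counts=True))))
```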
Procedia PDF Downloads 139
26598 Why Is the Recurrence Rate of Residual or Recurrent Disease Following Endoscopic Mucosal Resection (EMR) of Oesophageal Dysplasias and T1 Tumours Higher in the Greater Midlands Cancer Network?
Authors: Harshadkumar Rajgor, Jeff Butterworth
Abstract:
Background: Barrett's oesophagus increases the risk of developing oesophageal adenocarcinoma. Over the last 40 years, there has been a 6-fold increase in the incidence of oesophageal adenocarcinoma in the western world, and incidence rates are increasing at a greater rate than cancers of the colon, breast and lung. Endoscopic mucosal resection (EMR) is a relatively new technique being used by two centres in the Greater Midlands Cancer Network. EMR can be used for curative or staging purposes for high-grade dysplasias and T1 tumours of the oesophagus. EMR is also suitable for those who are deemed high risk for oesophagectomy. EMR has a recurrence rate of 21% according to the Wiesbaden data. Method: A retrospective study of prospectively collected data was carried out involving 24 patients who had EMR for curative or staging purposes. Complications of residual or recurrent disease following EMR that required further treatment were investigated. Results: In 54% of cases, residual or recurrent disease was suspected. 96% of patients were given clear and concise information regarding their diagnosis of high-grade dysplasia or T1 tumours. All 24 patients consulted the same specialist healthcare team. Conclusion: EMR is a safe and effective treatment for patients who have high-grade dysplasia and T1N0 tumours. In 54% of cases, residual or recurrent disease was suspected. Initially, only single resections were undertaken; multiple resections are now being carried out to reduce the risk of recurrence. Complications from EMR remain low in this series and consisted of a single episode of post-procedural bleeding. Keywords: endoscopic mucosal resection, oesophageal dysplasia, T1 tumours, cancer network
Procedia PDF Downloads 316
26597 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when dealing with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and then a version of the extended algorithm is defined in order to make it applicable to huge quantities of data. Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
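The core of CART that such an extension would parallelise is the exhaustive split search; the following Python sketch shows that search with Gini impurity on a toy dataset. It is an illustration of standard CART, not the authors' extended, distributed version.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustive CART-style split search; in a big-data extension, the loops over
    features and candidate thresholds are what would be distributed across workers."""
    n, d = X.shape
    best = (None, None, np.inf)                       # (feature, threshold, weighted impurity)
    for j in range(d):                                # candidate feature
        for t in np.unique(X[:, j]):                  # candidate threshold
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best

# Tiny illustration with assumed data
X = np.array([[2.0, 1.0], [3.0, 1.5], [10.0, 1.2], [11.0, 0.8]])
y = np.array([0, 0, 1, 1])
print(best_split(X, y))   # expected to split on feature 0 around the value 3.0
```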
Procedia PDF Downloads 142
26596 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome
Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler
Abstract:
Deep neural network (DNN) models are being explored in the clinical domain, following the recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to the multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardio vascular procedures (MCVPs), using temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by Veteran Affairs Information and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP. A temporal image was created based on these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called impact score, based on the presence/value of clinical conditions’ impact on the predicted outcome. Like (log) odds ratio reported by the logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The prediction of the DNN model achieved an area under the curve (AUC) of 78.5% while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman’s rho = 0.74), which helped validate our explanation.Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model
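The paper's exact impact-score formula is not given in the abstract, so the sketch below uses an assumed perturbation-style formulation (the average change in predicted log-odds when a feature is switched off) and compares it against logistic-regression coefficients with a Spearman correlation, mirroring the validation described above. The data and model sizes are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from scipy.stats import spearmanr

# Hypothetical binary clinical features (diagnoses/procedures present or absent) and outcome
rng = np.random.default_rng(7)
X = rng.integers(0, 2, size=(2000, 10)).astype(float)
logit = X @ np.array([1.5, -1.0, 0.8, 0.0, 0.3, -0.5, 0.0, 1.2, -0.2, 0.1]) - 1.0
y = (rng.uniform(size=2000) < 1 / (1 + np.exp(-logit))).astype(int)

dnn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0).fit(X, y)
lr = LogisticRegression(max_iter=1000).fit(X, y)

def impact_scores(model, X):
    """Perturbation-style impact score (an assumed formulation, not the paper's exact one):
    average change in predicted log-odds when a feature is switched from present to absent."""
    p = model.predict_proba(X)
    base = np.log(p[:, 1] / p[:, 0])
    scores = []
    for j in range(X.shape[1]):
        Xz = X.copy()
        Xz[:, j] = 0.0
        pz = model.predict_proba(Xz)
        alt = np.log(pz[:, 1] / pz[:, 0])
        scores.append(np.mean((base - alt)[X[:, j] == 1]))   # only where the feature was present
    return np.array(scores)

scores = impact_scores(dnn, X)
rho, _ = spearmanr(scores, lr.coef_.ravel())
print("Spearman correlation with LR log-odds coefficients:", round(rho, 2))
```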
Procedia PDF Downloads 153
26595 Canopy Temperature Acquired from Daytime and Nighttime Aerial Data as an Indicator of Trees' Health Status
Authors: Agata Zakrzewska, Dominik Kopeć, Adrian Ochtyra
Abstract:
The growing number of new cameras, sensors, and research methods allow for a broader application of thermal data in remote sensing vegetation studies. The aim of this research was to check whether it is possible to use thermal infrared data with a spectral range (3.6-4.9 μm) obtained during the day and the night to assess the health condition of selected species of deciduous trees in an urban environment. For this purpose, research was carried out in the city center of Warsaw (Poland) in 2020. During the airborne data acquisition, thermal data, laser scanning, and orthophoto map images were collected. Synchronously with airborne data, ground reference data were obtained for 617 studied species (Acer platanoides, Acer pseudoplatanus, Aesculus hippocastanum, Tilia cordata, and Tilia × euchlora) in different health condition states. The results were as follows: (i) healthy trees are cooler than trees in poor condition and dying both in the daytime and nighttime data; (ii) the difference in the canopy temperatures between healthy and dying trees was 1.06oC of mean value on the nighttime data and 3.28oC of mean value on the daytime data; (iii) condition classes significantly differentiate on both daytime and nighttime thermal data, but only on daytime data all condition classes differed statistically significantly from each other. In conclusion, the aerial thermal data can be considered as an alternative to hyperspectral data, a method of assessing the health condition of trees in an urban environment. Especially data obtained during the day, which can differentiate condition classes better than data obtained at night. The method based on thermal infrared and laser scanning data fusion could be a quick and efficient solution for identifying trees in poor health that should be visually checked in the field.Keywords: middle wave infrared, thermal imagery, tree discoloration, urban trees
Procedia PDF Downloads 115
26594 3D Human Face Reconstruction in Unstable Conditions
Authors: Xiaoyuan Suo
Abstract:
3D object reconstruction is a broad research area within the computer vision field involving many stages and still-open problems. One of the existing challenges in this field lies with micromotion, such as facial expressions on the human or animal face. Similar literature in this field focuses on 3D reconstruction in stable conditions, such as an existing image or photos taken in a rather static environment, while the purpose of this work is to discuss a flexible scanning system using multiple cameras that can correctly reconstruct stable and moving 3D objects, the human face with expression in particular. Further, a mathematical model is proposed at the end of this paper to automate the 3D object reconstruction process. The reconstruction proceeds in several stages. Firstly, a set of simple 2D lines is projected onto the object, and hence a set of uneven curvy lines is obtained, which represents the 3D numerical data of the surface. The lines and their shapes help to identify the object's 3D construction in pixels. With the two recorded angles and their distance from the camera, a simple mathematical calculation gives the resulting coordinates of each projected line in absolute 3D space. This proposed research will benefit many practical areas, including but not limited to biometric identification, authentication, cybersecurity, preservation of cultural heritage, and drama acting, especially performances with rapid and complex facial gestures. Specifically, this work will (i) provide a brief survey of comparable techniques existing in this field; (ii) discuss a set of specialized methodologies and algorithms for effective reconstruction of 3D objects; (iii) implement and test the developed methodologies; (iv) verify findings with data collected from experiments; and (v) conclude with lessons learned and final thoughts. Keywords: 3D photogrammetry, 3D object reconstruction, facial expression recognition, facial recognition
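The camera model is not specified in the abstract, so the following Python sketch shows only an assumed simplification of the described calculation: recovering a point's position from two recorded viewing angles and a known camera baseline.

```python
import numpy as np

def triangulate_point(baseline, angle_left, angle_right):
    """Recover the position (in the plane of the two cameras) of a point on a projected
    line from the two viewing angles, assuming cameras at (0, 0) and (baseline, 0)
    looking along +y. A simplified stand-in for the paper's mathematical model."""
    # Rays: y = tan(angle_left) * x from camera 1, and y = tan(angle_right) * (x - baseline)
    t1, t2 = np.tan(angle_left), np.tan(angle_right)
    x = baseline * t2 / (t2 - t1)
    y = t1 * x
    return np.array([x, y])

# Example: a point seen at 60 degrees from the left camera and 120 degrees from the right camera
p = triangulate_point(baseline=0.5, angle_left=np.deg2rad(60), angle_right=np.deg2rad(120))
print(p)   # symmetric geometry -> x = 0.25, y = 0.25 * tan(60 deg) ~= 0.433

# The third coordinate along each projected stripe can then be added from the stripe's
# vertical image coordinate, giving a full 3D point cloud for every captured line.
```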
Procedia PDF Downloads 150
26593 Email Phishing Detection Using Natural Language Processing and Convolutional Neural Network
Abstract:
Phishing is one of the oldest and best-known scams on the Internet. It can be defined as any type of telecommunications fraud that uses social engineering tricks to obtain confidential data from its victims. It is a cybercrime aimed at stealing sensitive information. Phishing is generally done via private email, where scammers impersonate large companies or other trusted entities to encourage victims to voluntarily provide information such as login credentials or, worse yet, credit card numbers. The COVID-19 theme is used by cybercriminals in multiple malicious campaigns, including phishing. In this environment, message filtering solutions have become essential to protect devices that are now used outside of the secure perimeter. Despite constantly updated methods to avoid these cyberattacks, the results are currently insufficient. Many researchers are looking for optimal solutions to filter phishing emails, but good results are still needed. In this work, we concentrate on the problem of detecting phishing emails using the different steps of NLP preprocessing, and we propose and train a model using a one-dimensional CNN. Our results show that the model obtained an accuracy of 99.99%, which demonstrates how well it performs. Keywords: phishing, e-mail, NLP preprocessing, CNN, e-mail filtering
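Beyond "NLP preprocessing" and a "one-dimensional CNN", the abstract does not fix the architecture, so the following Keras sketch (an assumed framework, with assumed layer sizes and a toy corpus) only illustrates the general approach of tokenising email text and training a Conv1D binary classifier.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Tiny illustrative corpus (real work would use a labelled phishing/ham email dataset)
emails = ["verify your account now or it will be suspended",
          "meeting notes attached for tomorrow's review",
          "your credit card has been locked click this link",
          "lunch at noon with the project team"]
labels = np.array([1, 0, 1, 0])           # 1 = phishing, 0 = legitimate

# NLP preprocessing: tokenisation and padding to a fixed sequence length
tokenizer = Tokenizer(num_words=5000, oov_token="<unk>")
tokenizer.fit_on_texts(emails)
X = pad_sequences(tokenizer.texts_to_sequences(emails), maxlen=50)

# One-dimensional CNN classifier (layer sizes are assumptions for illustration)
model = tf.keras.Sequential([
    layers.Embedding(input_dim=5000, output_dim=32),
    layers.Conv1D(filters=64, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=5, verbose=0)
print(model.predict(X, verbose=0).round(2))
```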
Procedia PDF Downloads 126
26592 Youth Friendly Health Services for Rural Thai Teenagers
Authors: C. Sridawruang
Abstract:
Young people today have sexual activities that differ from those of earlier generations, in that teenagers are likely to have multiple partners and are frequently in short-term relationships or with partners who are not well known to them. The proportion of teenage mothers in Thailand has increased. Young people were not specifically addressed during the overall very successful HIV-prevention campaigns; because of this missed opportunity, they are still unaware of the risks of unsafe sexual behavior. Aims: To describe reproductive health care services from the perspectives of rural Thai teenagers. Methods: This survey was one part of a mixed-methods approach using a survey and focus groups with 439 teenagers aged 12-18 years in five villages in Udon Thani, Thailand. A standard questionnaire was used for collecting data. The numeric data were checked and analyzed using descriptive statistics. Results: Most teenage respondents stated that they did not know where sexual and reproductive health services were provided for them. Most teenagers found it difficult to access and talk with health staff about sex-related issues; they stated that discussing or consulting with health providers might not be safe, and so teenagers might lose opportunities to access and get advice from health care services. The mean knowledge score for contraception and condom use was 6.34 out of a total score of 11. Most teenagers, especially girls, expressed a need for counseling services and reported a need for telephone services. Conclusions: Appropriate information focusing on sexual relationships and contraception should be designed to help young people make wise decisions, and health care services for Thai teenagers should be set up so that teenagers can access them easily. Health care providers need to be trained to improve their knowledge, attitudes and skills in reproductive health care practices for Thai teenagers. Keywords: youth friendly health services, rural, Thai, teenagers
Procedia PDF Downloads 340
26591 CPU Architecture Based on Static Hardware Scheduler Engine and Multiple Pipeline Registers
Authors: Ionel Zagan, Vasile Gheorghita Gaitan
Abstract:
The development of CPUs, and of the real-time systems based on them, has made it possible to use time at increasingly fine resolutions. Together with scheduling methods and algorithms, the organization of time has been improved so as to respond to the need for optimization and to the way in which the CPU is used. This paper contains both a detailed theoretical description and the results obtained from research on improving the performance of the nMPRA (Multi Pipeline Register Architecture) processor by implementing specific functions in hardware. The proposed CPU architecture has been developed, simulated and validated using an FPGA Virtex-7 circuit, via an SoC project. Although the nMPRA processor hardware structure with five pipeline stages is very complex, the present paper presents and analyzes the tests dedicated to the implementation of the CPU and of the on-chip memory for instructions and data. In order to practically implement and test the entire SoC project, various tests have been performed, in particular to verify the drivers for peripherals and the boot module, named Bootloader. Keywords: hardware scheduler, nMPRA processor, real-time systems, scheduling methods
Procedia PDF Downloads 267
26590 Hierarchical Clustering Algorithms in Data Mining
Authors: Z. Abdullah, A. R. Hamdan
Abstract:
Clustering is a process of grouping objects and data into clusters so that data objects in the same cluster are similar to each other. Clustering algorithms are one of the areas of data mining, and they can be classified into partitioning, hierarchical, density-based, and grid-based methods. In this paper, we therefore survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems, as well as in deriving more robust and scalable clustering algorithms. Keywords: clustering, unsupervised learning, algorithms, hierarchical
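CURE, ROCK and CHAMELEON are not part of the standard Python scientific stack, so as a baseline illustration of hierarchical (agglomerative) clustering itself, the sketch below uses SciPy's linkage/fcluster, with BIRCH, one of the surveyed algorithms, available in scikit-learn for comparison. The data and cluster counts are assumed.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import Birch

# Toy 2D data with three groups
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in ([0, 0], [3, 3], [0, 4])])

# Agglomerative hierarchical clustering (bottom-up merging of the closest clusters)
Z = linkage(X, method="ward")             # the dendrogram is encoded in Z
labels_hier = fcluster(Z, t=3, criterion="maxclust")

# BIRCH, one of the surveyed algorithms, is available in scikit-learn
labels_birch = Birch(n_clusters=3).fit_predict(X)

print(np.bincount(labels_hier)[1:])       # cluster sizes from the hierarchical cut
print(np.bincount(labels_birch))          # cluster sizes from BIRCH
```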
Procedia PDF Downloads 885
26589 Nondestructive Evaluation of Hidden Delamination in Glass Fiber Composite Using Terahertz Spectroscopy
Authors: Chung-Hyeon Ryu, Do-Hyoung Kim, Hak-Sung Kim
Abstract:
As the use of composites has increased, methods for detecting hidden damage that affects composite performance have become important. Terahertz (THz) spectroscopy has been assessed as one of the new, powerful nondestructive evaluation (NDE) techniques for fiber-reinforced composite structures because it has many advantages that can overcome the limitations of conventional NDE techniques such as X-rays or ultrasound. The THz wave offers noninvasive, noncontact and nonionizing evaluation of composite damage, and it also gives a broad range of information about material properties. In addition, it enables the detection of multiple delaminations in various nonmetallic materials. In this study, a pulse-type THz spectroscopy imaging system was devised and used for detecting and evaluating hidden delaminations in glass fiber reinforced plastic (GFRP) composite laminates. The interaction between THz waves and the GFRP composite was analyzed with respect to the type of delamination, including thickness, size and the number of overlaps among multiple delaminations in the through-thickness direction. Both transmission and reflection configurations were used for the evaluation of hidden delaminations, and THz wave propagation through the delaminations was also discussed. From these results, various hidden delaminations inside the GFRP composite were successfully detected using the time-domain THz spectroscopy imaging system and were also compared to the results of C-scan inspection. It is expected that the THz NDE technique will be widely used to evaluate the reliability of composite structures. Keywords: terahertz, delamination, glass fiber reinforced plastic composites, terahertz spectroscopy
Procedia PDF Downloads 592
26588 Development of Gully Erosion Prediction Model in Sokoto State, Nigeria, Using Remote Sensing and Geographical Information System Techniques
Authors: Nathaniel Bayode Eniolorunda, Murtala Abubakar Gada, Sheikh Danjuma Abubakar
Abstract:
The challenge of erosion in the study area is persistent, suggesting the need for a better understanding of the mechanisms that drive it. Thus, the study developed a predictive erosion model (RUSLE_Sok), deploying Remote Sensing (RS) and Geographical Information System (GIS) tools. The nature and pattern of the factors of erosion were characterized, while soil losses were quantified. The impacts of the factors were also measured, and the morphometry of gullies was described. Data on the five factors of RUSLE and the distances to settlements, rivers and roads (K, R, LS, P, C, DS, DRd and DRv) were combined and processed following standard RS and GIS algorithms. The Harmonized World Soil Database (HWSD), a Shuttle Radar Topography Mission (SRTM) image, Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS), a Sentinel-2 image accessed and processed within the Google Earth Engine, the road network and settlements were the data combined and calibrated into the factors for erosion modeling. A gully morphometric study was conducted at purposively selected sites. Factors of soil erosion showed low, moderate and high patterns. Soil losses ranged from 0 to 32.81 tons/ha/year, classified into low (97.6%), moderate (0.2%), severe (1.1%) and very severe (1.05%) forms. The multiple regression analysis shows that the factors statistically significantly predicted soil loss, F(8, 153) = 55.663, p < .0005. Except for the C-factor, which had a negative coefficient, all other factors were positive, with contributions in the order LS>C>R>P>DRv>K>DS>DRd. Gullies generally range from less than 100 m to about 3 km in length. Average minimum and maximum depths at gully heads are 0.6 and 1.2 m, while those at mid-stream are 1 and 1.9 m, respectively. The minimum downstream depth is 1.3 m, while the maximum is 4.7 m. Deeper gullies exist in proximity to rivers. With minimum and maximum gully elevation values ranging between 229 and 338 m and an average slope of about 3.2%, the study area is relatively flat. The study concluded that the major erosion influencers in the study area are topography and vegetation cover, and that RUSLE_Sok predicted soil loss more effectively than the ordinary RUSLE. The adoption of conservation measures such as tree planting and contour ploughing on sloping farmlands was recommended. Keywords: RUSLE_Sok, Sokoto, google earth engine, sentinel-2, erosion
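RUSLE estimates annual soil loss per pixel as A = R · K · LS · C · P; how RUSLE_Sok incorporates the distance factors (DS, DRd, DRv) is not detailed in the abstract, so the following Python sketch shows only the standard RUSLE product on toy rasters with assumed values and class thresholds.

```python
import numpy as np

# Toy 4x4 factor rasters (real work would read co-registered raster layers,
# e.g. with rasterio, derived from CHIRPS, HWSD, SRTM and Sentinel-2 data)
R  = np.full((4, 4), 420.0)        # rainfall erosivity
K  = np.full((4, 4), 0.03)         # soil erodibility
LS = np.array([[0.2, 0.3, 0.5, 0.9],
               [0.2, 0.4, 0.8, 1.4],
               [0.3, 0.5, 1.1, 2.0],
               [0.3, 0.6, 1.3, 2.6]])   # slope length and steepness
C  = np.full((4, 4), 0.25)         # cover management
P  = np.full((4, 4), 0.8)          # support practice

A = R * K * LS * C * P             # standard RUSLE soil loss per pixel (tons/ha/year)

# Classify losses into severity classes like those in the abstract (thresholds assumed)
classes = np.digitize(A, bins=[5, 10, 20])   # 0 = low, 1 = moderate, 2 = severe, 3 = very severe
print(A.round(2))
print(classes)
```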
Procedia PDF Downloads 75
26587 Assessment of Work-Related Stress and Its Predictors in Ethiopian Federal Bureau of Investigation in Addis Ababa
Authors: Zelalem Markos Borko
Abstract:
Work-related stress is a reaction that occurs when the workload becomes excessive. Unless properly managed, stress leads to high employee turnover, decreased performance, illness and absenteeism. Yet little has been done to address work-related stress and its predictors in the study area. Therefore, the objective of this study was to assess the prevalence of stress and its predictors in the study area. To that effect, a cross-sectional study was conducted on 281 employees from the Ethiopian Federal Bureau of Investigation selected using stratified random sampling techniques. Survey questionnaire scales were employed to collect data. Data were analyzed using percentages, Pearson correlation coefficients, simple linear regression, multiple linear regression, independent t-tests and one-way ANOVA. In the present study, 13.9% of participants faced high stress, whereas 13.5% faced low stress and the remaining 72.6% experienced moderate stress. There was no significant group difference among workers due to age, gender, marital status, educational level, years of service or police rank. This study concludes that factors such as role conflict, performance over-utilization, role ambiguity, and qualitative and quantitative role overload together predict 39.6% of the variance in work-related stress. This indicates that 60.4% of the variation in stress is explained by other factors, so additional research should be done to identify further factors predicting stress. To prevent occupational stress among police, the Ethiopian Federal Bureau of Investigation should develop stress-reduction management strategies based on these factors. Keywords: work-related stress, Ethiopian federal bureau of investigation, predictors, Addis Ababa
Procedia PDF Downloads 70
26586 Prevalence and Factors Associated with Domestic Violence among Pregnant Women in Southwest Ethiopia
Authors: Bediru Abamecha
Abstract:
Background: Domestic violence is a global problem that occurs regardless of culture, ethnicity or socio-economic class. It is known to be responsible for numerous hospital visits by women. Violence against pregnant women is a health and social problem that poses particular risks to the woman and her unborn child. Objective: The objective of this study will be to assess the prevalence of domestic violence and its correlates among pregnant women in Manna Woreda of Jimma Zone. Methods: A simple random sampling technique will be used to select 12 kebeles (48% of the study area), and systematic sampling will be used to reach the households in the selected kebeles in Manna Woreda of Jimma Zone, southwest Ethiopia, from February 15-25, 2011. In-depth interviews will be conducted with the women's affairs office, the police office and nurses, and a minimum of four focus group discussions (FGDs) with 6-8 members will be held with pregnant women and selected males from the community. SPSS version 16.0 will be used to enter, clean and analyze the data. Descriptive statistics, such as the mean or median for continuous variables and percentages for categorical variables, will be computed. Bivariate analysis will be used to check the association between the independent variables and domestic violence. Variables found to be associated with domestic violence will be entered into multiple logistic regression to control for the possible effect of confounders, and finally, the variables with significant associations will be identified on the basis of ORs with 95% CIs. Statistical significance will be considered at p<0.05. The qualitative data will be summarized manually, thematic analysis will be performed, and finally both will be triangulated. Keywords: ante natal care, ethiopian demographic and health survey, domestic violence, statistical package for social science
Procedia PDF Downloads 518
26585 Simulation Studies of High-Intensity, Nanosecond Pulsed Electric Fields Induced Dynamic Membrane Electroporation
Authors: Jiahui Song
Abstract:
The application of an electric field can cause poration at cell membranes. This includes the outer plasma membrane, as well as the membranes of intracellular organelles. In order to analyze and predict such electroporation effects, it becomes necessary to first evaluate the electric fields and the transmembrane voltages. This information can then be used to assess changes in the pore formation energy that finally yields the pore distributions and their radii based on the Smolchowski equation. The dynamic pore model can be achieved by including a dynamic aspect and a dependence on the pore population density into the pore formation energy equation. These changes make the pore formation energy E(r) self-adjusting in response to pore formation without causing uncontrolled growth and expansion. By using dynamic membrane tension, membrane electroporation in response to a 180kV/cm trapezoidal pulse with a 10 ns on time and 1.5 ns rise- and fall-times is discussed. Poration is predicted to occur at times beyond the peak at around 9.2 ns. Modeling also yields time-dependent distributions of the membrane pore population after multiple pulses. It shows that the pore distribution shifts to larger values of the radius with multiple pulsing. Molecular dynamics (MD) simulations are also carried out for a fixed field of 0.5 V/nm to demonstrate nanopore formation from a microscopic point of view. The result shows that the pore is predicted to be about 0.9 nm in diameter and somewhat narrower at the central point.Keywords: high-intensity, nanosecond, dynamics, electroporation
Procedia PDF Downloads 159
26584 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering
Authors: K. Umbleja, M. Ichino
Abstract:
Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry specific details should remain hidden. Symbolic data mining is quickly gaining popularity as datasets in need of analyzing are becoming ever larger. One type of such symbolic data is a histogram, which enables to save huge amounts of information into a single variable with high-level of granularity. Other types of symbolic data can also be described in histograms, therefore making histogram a very important and general symbolic data type - a method developed for histograms - can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method, which allows to compare two histogram-valued variables and therefore find a dissimilarity between two histograms. Proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows to compare histograms with different number of bins and bin widths (so called general histogram). Proposed dissimilarity measure is then used as a measure for clustering. Furthermore, linkage method based on weighted averages is proposed with the concept of cluster compactness to measure the quality of clustering. The method is then validated with application on real datasets. As a result, the proposed dissimilarity measure is found producing adequate and comparable results with general histograms without the loss of detail or need to transform the data.Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis
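The Ichino-Yaguchi-based measure itself is not reproduced in the abstract, so the following Python sketch uses a simplified stand-in: two general histograms with different bins and widths are redistributed onto a common grid and compared with an L1 distance, and the pairwise dissimilarities are then fed to hierarchical clustering.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def rebin(edges, counts, grid):
    """Redistribute histogram mass onto a common grid, assuming uniform density in each bin."""
    density = np.zeros(len(grid) - 1)
    for (lo, hi), c in zip(zip(edges[:-1], edges[1:]), counts):
        overlap = np.clip(np.minimum(hi, grid[1:]) - np.maximum(lo, grid[:-1]), 0, None)
        density += c * overlap / (hi - lo)
    return density / density.sum()

def histogram_dissimilarity(h1, h2, grid):
    """Simplified dissimilarity between two general histograms (not the Ichino-Yaguchi form):
    half the L1 distance between their rebinned, normalised mass distributions."""
    return 0.5 * np.abs(rebin(*h1, grid) - rebin(*h2, grid)).sum()

# Three histogram-valued observations with different bin counts and widths (assumed data)
grid = np.linspace(0, 10, 21)
hists = [
    (np.array([0, 2, 5, 10]), np.array([5, 20, 3])),
    (np.array([0, 3, 6, 10]), np.array([6, 18, 4])),
    (np.array([0, 4, 8, 10]), np.array([1, 4, 25])),
]
D = np.array([[histogram_dissimilarity(a, b, grid) for b in hists] for a in hists])
Z = linkage(squareform(D, checks=False), method="average")
print(fcluster(Z, t=2, criterion="maxclust"))   # the first two histograms should cluster together
```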
Procedia PDF Downloads 162
26583 Enhancing Emotional Intelligence through Non-Verbal Communication Training in Higher Education Exchange Programs: A Longitudinal Study
Authors: Maciej Buczowski
Abstract:
This study investigates the impact of non-verbal communication training on enhancing the emotional intelligence (EI) of participants in higher education exchange programs. Recognizing the vital role EI plays in academic and professional success, particularly in multicultural environments, this research aims to explore the interplay between non-verbal cues and EI. Utilizing a longitudinal mixed-methods approach, the study will assess EI development over time among international students and faculty members. Participants will undergo a comprehensive non-verbal communication training program, covering modules on recognizing and interpreting emotional expressions, understanding cultural variations, and using non-verbal cues to manage interpersonal dynamics. EI levels will be measured using established instruments such as the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) and the Emotional Quotient Inventory (EQ-i), supplemented by qualitative data from interviews and focus groups. A control group will be included to validate the intervention's effectiveness. Data collection at multiple time points (pre-training, mid-training, post-training, and follow-up) will enable tracking of EI changes. The study hypothesizes significant improvements in participants' EI, particularly in emotional awareness, empathy, and relationship management, leading to better academic performance and increased satisfaction with the exchange experience. This research aims to provide insights into the relationship between non-verbal communication and EI, potentially influencing the design of exchange programs to include EI development components and enhancing the effectiveness of international education initiatives.Keywords: emotional intelligence, higher education exchange program, non-verbal communication, intercultural communication, cognitive linguistics
Procedia PDF Downloads 24
26582 WiFi Data Offloading: Bundling Method in a Canvas Business Model
Authors: Majid Mokhtarnia, Alireza Amini
Abstract:
Mobile operators deal with increasing data traffic as a critical issue. As a result, a vital responsibility of the operators is to deal with such a trend in order to create added value. This paper addresses a bundling method in a Canvas business model within a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are sold to subscribers, some of which include a given volume of WiFi-offloaded data free of charge. The paper at hand analyses this method from the viewpoints of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision-maker to make the best choice. Keywords: bundling, canvas business model, telecommunication, WiFi data offloading
Procedia PDF Downloads 200