Search results for: cardio data analysis
40529 Maker-Based Learning in Secondary Mathematics: Investigating Students’ Proportional Reasoning Understanding through Digital Making
Authors: Juan Torralba
Abstract:
Student digital artifacts were investigated, utilizing a qualitative exploratory research design to understand the ways in which students represented their knowledge of seventh-grade proportionality concepts as they participated in maker-based activities that culminated in the creation of digital 3-dimensional models of their dream homes. Representations of the geometric and numeric dimensions of proportionality were analyzed in the written, verbal, and visual data collected from the students. A directed content analysis approach was utilized in the data analysis, as this work aimed to build upon existing research in the field of maker-based STEAM Education. The results from this work show that students can represent their understanding of proportional reasoning through open-ended written responses more accurately than through verbal descriptions or digital artifacts. The geometric and numeric dimensions of proportionality, and their respective components of attributes-of-similarity representation and percents, rates, and ratios representations, were represented by the students more than any others across the data, suggesting that a maker-based instructional approach to teaching proportionality in the middle grades may be promising in helping students gain a solid foundation in those components. Recommendations for practice and research are discussed.
Keywords: learning through making, maker-based education, maker education in the middle grades, making in mathematics, the maker movement
Procedia PDF Downloads 71
40528 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.
Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129
40527 Qualitative Approaches to Mindfulness Meditation Practices in Higher Education
Authors: Patrizia Barroero, Saliha Yagoubi
Abstract:
Mindfulness meditation practices in the context of higher education are becoming more and more common. Some of the reported benefits of meditation interventions and workshops include improved focus, general well-being, diminished stress, and even increased resilience and grit. A series of workshops free to students, faculty, and staff was offered twice a week over two semesters at Hudson County Community College, New Jersey. The results of an exploratory study based on participants’ subjective reactions to these workshops will be presented. A qualitative approach was used to collect and analyze the data, and a hermeneutic phenomenological perspective served as a framework for the research design, data collection, and analysis. The data collected include three recorded videos of semi-structured interviews and several written surveys submitted by volunteer participants.
Keywords: mindfulness meditation practices, stress reduction, resilience, grit, higher education success, qualitative research
Procedia PDF Downloads 75
40526 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and It’s Application for Infrastructure Development
Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas
Abstract:
One of the most frightening phenomena of nature is the occurrence of earthquakes, as they have terrible and disastrous effects. Many earthquakes occur every day worldwide, and there is a need for knowledge regarding the trends in earthquake occurrence. The recording and interpretation of data obtained from the establishment of the worldwide network of seismological stations have made this possible. From the analysis of recorded earthquake data, the earthquake parameters and source parameters can be computed and earthquake catalogues can be prepared. These catalogues provide information on origin time, epicenter locations (in terms of latitude and longitude), focal depths, magnitudes, and other related details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data are tedious and time consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze, and display geographic information. The implementation of integrated GIS technology provides an approach that permits rapid evaluation of a complex inventory database under a variety of earthquake scenarios and allows the user to interactively view results almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavor has been made in the present study to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can be used easily for further analysis by earthquake engineers. The basic data on the time of occurrence, location, and size of earthquakes have been compiled for further querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region. The user interface also includes the seismic hazard information already worked out under the GSHAP program; the seismic hazard, in terms of probability of exceedance in definite return periods, is provided for the world. The seismic zones of the Indian region are included in the user interface from IS 1893-2002, the code on earthquake-resistant design of buildings. City-wise satellite images have been inserted in the map, and based on actual data the following information can be extracted in real time: • Analysis of soil parameters and their effects • Microzonation information • Seismic hazard and strong ground motion • Soil liquefaction and its effect on the surrounding area • Impacts of liquefaction on buildings and infrastructure • Occurrence of future earthquakes and their effect on existing soil • Propagation of earth vibration due to the occurrence of an earthquake. The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All of this information can be used for the development of infrastructure, i.e., multi-storey structures, irrigation dams and their components, hydropower, etc., in real time, for the present and the future.
Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development
Procedia PDF Downloads 316
40525 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time till death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of the gamma frailty model with concomitant variables in order to individualize the prognostic factors that influence liver cirrhosis patients’ survival times. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients’ survival in the presence of latent variables, the gamma frailty distribution was applied, considering parametric models such as the Exponential and Weibull distributions for survival time. Data analysis was performed using R software, and a significance level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was found to be hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, mean survival over the seven-year follow-up was 28.44 months; for deceased patients it was 19.33 months and for censored patients 31.79 months. Parametric Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, the gamma frailty model with parametric distributions seems desirable.
Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution
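For illustration, a minimal numpy sketch of the gamma frailty construction the abstract describes: a Weibull baseline hazard multiplied by a unit-mean gamma frailty, with administrative censoring at seven years. The Weibull parameters and frailty variance below are hypothetical, not the values fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 305                      # cohort size, as in the study
shape, scale = 1.3, 60.0     # hypothetical Weibull baseline (months)
theta = 0.5                  # hypothetical frailty variance

# Gamma frailty with mean 1 and variance theta multiplies the baseline hazard.
z = rng.gamma(shape=1 / theta, scale=theta, size=n)

# Inverse-transform sampling: conditional on z, S(t|z) = exp(-z * (t/scale)**shape)
u = rng.uniform(size=n)
t = scale * (-np.log(u) / z) ** (1 / shape)

# Administrative censoring at 84 months (seven-year follow-up, as in the study)
c = 84.0
event = t <= c
time = np.minimum(t, c)

print(f"deaths: {event.sum()}, censored: {(~event).sum()}")
print(f"mean observed time: {time.mean():.2f} months")
```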
Procedia PDF Downloads 261
40524 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems
Authors: Yong-Kyu Jung
Abstract:
The fast growth in information technology has led to demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and communication over the network, i.e., real-time and parallel operations in CPSs such as autonomous vehicles. Data processing is an important means of overcoming the issues confronting data management, reducing the gap between technological growth on the one hand and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart-connected devices, adaptively rescales digital contents (by 62.8% on average), as well as data processing/access time and energy and encryption/decryption overheads, in AI/ML applications such as facial ID and recognition.
Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity
Procedia PDF Downloads 78
40523 Exploring Causes of Homelessness and Shelter Entry: A Case Study Analysis of Shelter Data in New York
Authors: Lindsay Fink, Sarha Smith-Moyo, Leanne W. Charlesworth
Abstract:
In recent years, the number of individuals experiencing homelessness has increased in the United States. This paper analyzes 2019 data from 16 different emergency shelters in Monroe County, located in Upstate New York. The data were collected through the County’s Homeless Management Information System (HMIS), and individuals were de-identified and de-duplicated for analysis. The purpose of this study is to explore the basic characteristics of the homeless population in Monroe County, and the dynamics of shelter use. The results of this study showed gender as a significant factor when analyzing the relationship between demographic variables and recorded reasons for shelter entry. Results also indicated that age and ethnicity did not significantly influence odds of re-entering a shelter, but did significantly influence reasons for shelter entry. Overall, the most common recorded cause of shelter entry in 2019 in the examined county was eviction by primary tenant. Recommendations to better address recurrent shelter entry and potential chronic homelessness include more consideration for the diversity existing within the homeless population, and the dynamics leading to shelter stays, including enhanced funding and training for shelter staff, as well as expanded access to permanent supportive housing programs.
Keywords: chronic homelessness, homeless shelter stays, permanent supportive housing, shelter population dynamics
Procedia PDF Downloads 156
40522 The Use of Social Stories and Digital Technology as Interventions for Autistic Children; A State-Of-The-Art Review and Qualitative Data Analysis
Authors: S. Hussain, C. Grieco, M. Brosnan
Abstract:
Background and Aims: Autism is a complex neurobehavioural disorder, characterised by impairments in the development of language and communication skills. The study involved a state-of-the-art systematic review, in addition to qualitative data analysis, to establish the evidence for social stories as an intervention strategy for autistic children. An up-to-date review of the use of digital technologies in the delivery of interventions to autistic children was also carried out, to propose the efficacy of digital technologies and the use of social stories to improve intervention outcomes for autistic children. Methods: Two student researchers reviewed a range of randomised controlled trials and observational studies. The aim of the review was to establish whether there was adequate evidence to justify recommending social stories to autistic patients. The students devised their own search strategies to be used across a range of search engines, including Ovid-Medline, Google Scholar, and PubMed, and then critically appraised the generated literature. Additionally, qualitative data obtained from a comprehensive online questionnaire on social stories were thematically analysed. The thematic analysis was carried out independently by each researcher using a ‘bottom-up’ approach, meaning contributors read and analysed responses to questions and devised semantic themes from reading the responses to a given question. The researchers then placed each response into a semantic theme or sub-theme, and met to discuss the merging of their theme headings. The inter-rater reliability (IRR) was calculated before and after theme headings were merged, giving IRR pre- and post-discussion. Lastly, the thematic analysis was assessed by a third researcher, who is a professor of psychology and the director of the Centre for Applied Autism Research at the University of Bath. Results: The review of the literature, as well as the thematic analysis of qualitative data, found supporting evidence for social story use. The thematic analysis uncovered some interesting themes from the questionnaire responses, relating to the reasons why social stories were used and the factors influencing their effectiveness in each case. Overall, however, the evidence for digital technology interventions was limited, and the literature could not prove a causal link between better intervention outcomes for autistic children and the use of technologies, although it did offer valid proposed theories for the suitability of digital technologies for autistic children. Conclusions: Overall, the review concluded that there was adequate evidence to justify advising the use of social stories with autistic children. The role of digital technologies is clearly a fast-emerging field and appears to be a promising method of intervention for autistic children; however, it should not yet be considered an evidence-based approach. Using this research, the students developed ideas on social story interventions which aim to help autistic children.
Keywords: autistic children, digital technologies, intervention, social stories
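The abstract reports inter-rater reliability (IRR) before and after the merging of theme headings but does not name the statistic used; Cohen's kappa is one common choice for two coders. A minimal sketch with hypothetical theme labels:

```python
from sklearn.metrics import cohen_kappa_score

# Theme labels assigned independently by the two student researchers
# to the same ten questionnaire responses (hypothetical data).
coder_a = ["social", "social", "behaviour", "routine", "social",
           "routine", "behaviour", "social", "routine", "behaviour"]
coder_b = ["social", "behaviour", "behaviour", "routine", "social",
           "routine", "social", "social", "routine", "behaviour"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"inter-rater reliability (Cohen's kappa): {kappa:.2f}")
```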
Procedia PDF Downloads 121
40521 Case Study of Sexual Violence Victim Assessment in Semarang Regency
Authors: Sujana T, Kurniasari MD, Ayakeding AM
Abstract:
Background: Sexual violence is one of the forms of violence with a high incidence in Indonesia. Purpose: This research aims to describe the implementation of sexual violence victim assessment in Semarang Regency. Method: This research is a qualitative study with an embedded single case study design. Data are analyzed with two units of analysis. The first unit of analysis is the victim’s examiner, with a minimum of one year of work experience; a semi-structured interview method was used to obtain these data. The second unit of analysis is the related documents; these data were collected by observing the pathway and description of every document and how it supported each step of the assessment. Results: This study resulted in three themes. The first theme is that the assessment of sexual violence victims in Semarang Regency has been standardized. The laws of the Republic of Indonesia regulate the handling of victims of sexual violence in outline. Victims of sexual violence can be dealt with by the police, the Integrated Service Center for Women and Children Empowerment, and the Regional General Hospital. Each examination site has its own standard operating procedures for dealing with victims of sexual violence. Cooperation with family and witnesses is also required in the review process to obtain accurate results and evidence. The second theme is that there are inhibiting factors in the assessment process. Victims sometimes feel embarrassed and are reluctant to recount the chronological events when reporting; the examining officer should be able to approach the victim and build trust to convince them to cooperate. The third theme is that there are other things to consider in the process of assessing victims of sexual violence: ensuring implementation in accordance with the applicable standard operating procedures, providing exclusive examination rooms, offering counseling, and safeguarding the privacy of victims are all important considerations in the assessment.
Keywords: assessment, case study, Semarang regency, sexual violence
Procedia PDF Downloads 140
40520 Relationship between Gender and Performance with Respect to a Basic Math Skills Quiz in Statistics Courses in Lebanon
Authors: Hiba Naccache
Abstract:
The present research investigated whether gender differences affect performance in a simple math quiz in statistics courses. Participants of this study comprised a sample of 567 statistics students at two different universities in Lebanon. Data were collected through a simple math quiz. Analysis of the quantitative data indicated that there was no significant difference in math performance between males and females. The results suggest that improvements in student performance may depend on improved mastery of basic algebra, especially for females. The implications of these findings and further recommendations are discussed.
Keywords: gender, education, math, statistics
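A gender comparison of quiz scores of this kind is typically run as a two-sample t-test; the abstract does not specify the exact test, so the following sketch, with hypothetical scores, is illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical quiz scores (out of 20) for the two groups
males = rng.normal(13.1, 3.0, 300)
females = rng.normal(12.9, 3.0, 267)

# Welch's t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(males, females, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
```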
Procedia PDF Downloads 377
40519 The Documentary Analysis of Meta-Analysis Research in Violence of Media
Authors: Proud Arunrangsiwed
Abstract:
The “future direction” section of meta-analysis findings can provide valuable direction for conducting future studies. This study, “The Documentary Analysis of Meta-Analysis Research in Violence of Media”, draws “future directions” from 10 meta-analysis papers. The purpose of this research is to identify an appropriate research design or methodology for future research related to the topic of media violence. Future research should use longitudinal and experimental designs, and should give careful consideration to age effects, time-spent effects, enjoyment effects, and the ordinary lifestyle of each media consumer.
Keywords: aggressive, future direction, meta-analysis, media, violence
Procedia PDF Downloads 410
40518 Important Factors Affecting the Effectiveness of Quality Control Circles
Authors: Sogol Zarafshan
Abstract:
The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital, as well as rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, who were selected using a purposive sampling method. Also, a sample of 107 nurses was selected through a simple random sampling method using their employee codes and the random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed through the opinions of the experts and academic members who participated in the present study, as well as through confirmatory factor analysis. Its reliability was also verified (α=0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as VIKOR-GRA and IPA methods. The ranking of the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were related to ‘managers’ and supervisors’ support’ and ‘group leadership’, respectively. Also, the highest hospital performance was for factors such as ‘clear goals and objectives’ and ‘group cohesiveness and homogeneity’, and the lowest for ‘reward system’ and ‘feedback system’. The results showed that although ‘training the members’, ‘using the right tools’, and ‘reward system’ were factors of great importance, the organization’s performance on these factors was poor. Therefore, these factors should receive more attention from the studied hospital’s managers and should be improved as soon as possible.
Keywords: quality control circles, fuzzy VIKOR, grey relational analysis, importance-performance analysis
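As a sketch of the Grey Relational Analysis step (the fuzzy VIKOR part is omitted), the following computes grey relational grades for a few of the named factors against an ideal reference series; the expert ratings are hypothetical:

```python
import numpy as np

# Rows: factors; columns: hypothetical expert ratings on three criteria
X = np.array([
    [4.6, 4.2, 4.8],   # managers' and supervisors' support
    [3.1, 3.4, 2.9],   # reward system
    [3.8, 3.6, 3.5],   # training the members
    [2.9, 3.0, 3.2],   # group leadership
])

# Larger-is-better normalisation to [0, 1]
norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Grey relational coefficients against the ideal reference series (all ones)
zeta = 0.5                                   # distinguishing coefficient
delta = np.abs(1.0 - norm)
xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

grades = xi.mean(axis=1)                     # grey relational grade per factor
print(np.argsort(grades)[::-1])              # ranking, best factor first
```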
Procedia PDF Downloads 135
40517 Using Mixed Methods in Studying Classroom Social Network Dynamics
Authors: Nashrawan Naser Taha, Andrew M. Cox
Abstract:
In a multi-cultural learning context, where ties are weak and dynamic, combining qualitative with quantitative research methods may be more effective. Such a combination may also allow us to answer different types of question, such as questions about people’s perception of the network. In this study, the use of observation, interviews, and photos was explored as a way of enhancing data from social network questionnaires. Integrating all of these methods was found to enhance the quality and accuracy of the data collected, while also providing a richer story of the network dynamics and the factors that shaped these changes over time.
Keywords: mixed methods, social network analysis, multi-cultural learning, social network dynamics
Procedia PDF Downloads 510
40516 Nepal Himalaya: Status of Women, Politics, and Administration
Authors: Tulasi Acharya
Abstract:
The paper is a qualitative analysis of the status of women, and of women in politics and administration, in the Nepal Himalaya. The paper reviews data on women in the civil service and at administrative levels. Looking at Nepali politics and administration from a social constructivist perspective, the paper highlights social and cultural issues that have othered women as the “second sex.” As the country heads towards modernity, gender-friendly approaches are being instituted. Although the data reflect progress in women’s status and in women’s political and administrative participation, they are not enough to predict democratic gender practices at the political and administrative levels. The political and administrative culture of the Nepal Himalaya should be changed by promoting gender practices and deconstructing gender images in administrative culture through representative bureaucracy and by introducing democratic policies.
Keywords: politics, policy, administration, culture, women, Nepal, democracy
Procedia PDF Downloads 537
40515 Hidden Hot Spots: Identifying and Understanding the Spatial Distribution of Crime
Authors: Lauren C. Porter, Andrew Curtis, Eric Jefferis, Susanne Mitchell
Abstract:
A wealth of research has been generated examining the variation in crime across neighborhoods. However, there is also a striking degree of crime concentration within neighborhoods. A number of studies show that a small percentage of street segments, intersections, or addresses account for a large portion of crime. Not surprisingly, a focus on these crime hot spots can be an effective strategy for reducing community-level crime and related ills, such as health problems. However, research is also limited in an important respect. Studies tend to use official data to identify hot spots, such as 911 calls or calls for service. While the use of call data may be more representative of the actual level and distribution of crime than some other official measures (e.g., arrest data), call data still suffer from the 'dark figure of crime.' That is, there is most certainly a degree of error between crimes that occur and crimes that are reported to the police. In this study, we present an alternative method of identifying crime hot spots that does not rely on official data. In doing so, we highlight the potential utility of neighborhood insiders in identifying and understanding crime dynamics within geographic spaces. Specifically, we use spatial video and geo-narratives to record the crime insights of 36 police officers, ex-offenders, and residents of a high-crime neighborhood in northeast Ohio. Spatial mentions of crime are mapped to identify participant-identified hot spots, and these are juxtaposed with calls for service (CFS) data. While there are bound to be differences between these two sources of data, we find that one location in particular, a corner store, emerges as a hot spot for all three groups of participants. Yet it does not emerge when we examine CFS data. A closer examination of the space around this corner store and a qualitative analysis of narrative data reveal important clues as to why this store may indeed be a hot spot but not generate disproportionate calls to the police. In short, our results suggest that researchers who rely solely on official data to study crime hot spots may risk missing some of the most dangerous places.
Keywords: crime, narrative, video, neighborhood
Procedia PDF Downloads 238
40514 Numerical Analysis of Swirling Chamber Using Improved Delayed Detached Eddy Simulation Turbulence Model
Authors: Hamad M. Alhajeri
Abstract:
The swirling chamber is a promising cooling method for heavily thermally loaded parts such as turbine blades, due to the additional circumferential velocity and therefore improved turbulent mixing of the fluid. This paper investigates numerically the effect of the turbulence model on heat convection in the swirling chamber. A grid independence analysis is conducted to obtain the proper grid dimension. The work is validated with experimental data available in the literature. Flow analysis is carried out using the improved delayed detached eddy simulation turbulence model and the Reynolds-averaged Navier-Stokes k-ɛ turbulence model. The flow characteristics near the exit are reformed when the improved delayed detached eddy simulation model is used.
Keywords: gas turbine, Nusselt number, flow characteristics, heat transfer
Procedia PDF Downloads 201
40513 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, helping them determine the number of beds or doctors they require based on the number of patients they expect. This project used models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Algorithms such as k-means grouped patients, and association rule mining identified connections between treatments and patient responses. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions and treat patients more efficiently: it comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
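As one concrete example of the methods listed, a minimal k-means sketch that groups patients on hypothetical clinical features (the feature choices and cluster count are illustrative, not from the project):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical patient features: age, systolic BP, cholesterol, BMI
patients = rng.normal(loc=[55, 130, 200, 27], scale=[12, 15, 35, 4], size=(500, 4))

features = StandardScaler().fit_transform(patients)   # scale before clustering
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)

for k in range(3):
    print(f"cluster {k}: {np.sum(kmeans.labels_ == k)} patients")
```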
Procedia PDF Downloads 76
40512 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This in particular means the two-phase flow through the membrane, gas diffusion layers (GDL), and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility to analyse the processes within a PEMWE in a non-destructive, fast, and cheap in-situ way. This work describes the generation and analysis of AE data coming from a PEM water electrolyser, for the first time in the literature, to the best of our knowledge. Different experiments are carried out. Each experiment is designed so that only specific physical processes occur and AE solely related to one process can be measured. Therefore, a range of experimental conditions is used to induce different flow regimes within the flow channels and GDL. The resulting AE data are first separated into different events, which are defined by exceeding the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For all these acoustic events the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average intensity of a peak, and time till the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix. This can be used as an easy way of determining which criteria convey the most information on the acoustic data. In the following, the data are ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. Using the AE data produced beforehand allows a self-learning algorithm to be trained and an analytical tool to be developed to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
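A simplified sketch of the processing chain described above: threshold-based event extraction, per-event attribute vectors (a subset of the six attributes named), normalisation, and PCA to rank the criteria. The synthetic signal, injected bursts, and threshold are hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA

def extract_events(signal, threshold):
    """Return (start, end) index pairs where |signal| stays above the noise threshold."""
    above = np.abs(signal) > threshold
    padded = np.concatenate(([False], above, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    return list(zip(edges[::2], edges[1::2]))

def event_features(signal, start, end):
    """Simplified attribute vector for one acoustic event."""
    seg = np.abs(signal[start:end])
    peak = int(np.argmax(seg))
    return [seg.max(),                    # maximum peak amplitude
            end - start,                  # duration (samples)
            peak,                         # samples before the maximum
            seg.mean(),                   # average intensity
            peak / max(end - start, 1)]   # normalised time to maximum

rng = np.random.default_rng(3)
signal = rng.normal(0, 0.05, 100_000)     # synthetic noise floor
for i, amp in zip([10_000, 30_000, 55_000, 80_000], [2.0, 1.2, 3.1, 0.8]):
    signal[i:i + 400] += np.hanning(400) * amp   # injected bursts

events = extract_events(signal, threshold=0.3)
X = np.array([event_features(signal, s, e) for s, e in events])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # normalise each criterion

pca = PCA().fit(X)
print(pca.explained_variance_ratio_)  # which criteria carry the most information
```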
Procedia PDF Downloads 156
40511 Algorithm for Modelling Land Surface Temperature and Land Cover Classification and Their Interaction
Authors: Jigg Pelayo, Ricardo Villar, Einstine Opiso
Abstract:
The rampant and unintended spread of urban areas has increased the artificial component features in the land cover types of the countryside and brought forth the urban heat island (UHI). This has paved the way for a wide range of negative influences on human health and the environment, commonly related to air pollution, drought, higher energy demand, and water shortage. Land cover type also plays a relevant role in understanding the interaction between ground surfaces and the local temperature. At the moment, the depiction of land surface temperature (LST) at the city/municipality scale, particularly in certain areas of Misamis Oriental, Philippines, is inadequate to support efficient mitigation of and adaptation to the surface urban heat island (SUHI). Thus, this study applies Landsat 8 satellite data and low-density Light Detection and Ranging (LiDAR) products to map out a quality automated LST model and crop-level land cover classification at a local scale, through a theoretical and algorithm-based approach utilizing the principles of data analysis subjected to a multi-dimensional image object model. The paper also aims to explore the relationship between the derived LST and the land cover classification. The results of the presented model showed the ability of comprehensive data analysis and GIS functionalities, integrated with an object-based image analysis (OBIA) approach, to automate complex map production processes with considerable efficiency and high accuracy. The findings may potentially lead to expanded investigation of the temporal dynamics of land surface UHI. It is worthwhile to note that the environmental significance of these interactions, through the combined application of remote sensing, geographic information tools, mathematical morphology, and data analysis, can provide microclimate perception, awareness, and improved decision-making for land use planning and characterization at the local and neighborhood scale. As a result, it can aid in facilitating problem identification and support mitigation and adaptation more efficiently.
Keywords: LiDAR, OBIA, remote sensing, local scale
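For reference, the standard Landsat 8 Band 10 route from digital numbers to an emissivity-corrected LST, sketched in numpy. The rescaling and thermal constants below are the typical values found in Landsat 8 MTL metadata and should be verified against the actual scene; the pixel values and emissivities are toy inputs, and the emissivity correction is the common single-channel approximation:

```python
import numpy as np

# Typical Landsat 8 TIRS Band 10 metadata constants (from an MTL file);
# verify against the metadata of the actual scene.
ML, AL = 3.342e-4, 0.1            # radiance rescaling gain / offset
K1, K2 = 774.8853, 1321.0789      # thermal conversion constants

def land_surface_temperature(dn, emissivity):
    """DN -> TOA radiance -> brightness temperature -> emissivity-corrected LST (degC)."""
    radiance = ML * dn + AL
    bt = K2 / np.log(K1 / radiance + 1.0)   # brightness temperature (K)
    lam = 10.895e-6                         # Band 10 effective wavelength (m)
    rho = 1.438e-2                          # h*c/k_B (m K)
    lst = bt / (1.0 + (lam * bt / rho) * np.log(emissivity))
    return lst - 273.15

dn = np.array([[24000, 26500], [25000, 30000]])   # toy Band 10 pixel values
emis = np.array([[0.97, 0.95], [0.96, 0.92]])     # e.g., NDVI-derived emissivity
print(land_surface_temperature(dn, emis))
```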
Procedia PDF Downloads 282
40510 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data
Authors: Sašo Pečnik, Borut Žalik
Abstract:
This paper presents a real-time visualization and filtering technique for classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable the real-time presentation of layered LiDAR data. Real-time visualization is achieved with level-of-detail (LOD) optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps and is entirely executed on the GPU, implemented using programmable shaders.
Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization
Procedia PDF Downloads 308
40509 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Forensic DNA analysis has received much attention over the last three decades due to its incredible usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that does not correspond to an allele of a potential contributor and is considered an artefact, presumed to arise from miscopying or slippage during PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, which is calculated relative to the corresponding parent allele height. The analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient, reliable interpretations. Therefore, a sound methodology to distinguish between stutters and real alleles is essential for the accuracy of the interpretation, and any such method has to be able to focus on modelling stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools in the clustering and classification of data and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is reflected by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult, even though the calculations are relatively simple. Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of the number of components. The Chinese restaurant process (CRP), the stick-breaking process, and the Pólya urn scheme are frequently used as Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling the stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter
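A minimal sketch of the Chinese restaurant process prior that underlies such an infinite mixture: each observation joins an existing component with probability proportional to its size, or opens a new one with probability proportional to the concentration parameter. The per-component regression step is indicated only in comments:

```python
import numpy as np

def chinese_restaurant_process(n, alpha, rng):
    """Draw cluster assignments for n observations from a CRP with concentration alpha."""
    tables = []                          # number of customers at each table
    assignments = np.empty(n, dtype=int)
    for i in range(n):
        counts = np.array(tables + [alpha], dtype=float)
        probs = counts / (i + alpha)     # existing tables vs. a new table
        k = rng.choice(len(counts), p=probs)
        if k == len(tables):             # open a new table (new mixture component)
            tables.append(1)
        else:
            tables[k] += 1
        assignments[i] = k
    return assignments

rng = np.random.default_rng(7)
z = chinese_restaurant_process(n=200, alpha=1.0, rng=rng)
print(f"components discovered: {z.max() + 1}")

# In the model described above, each component k would carry its own simple
# linear regression for the stutter ratio, e.g.
#   ratio_i = a_k + b_k * covariate_i + noise,
# with (a_k, b_k) drawn from a base distribution; the CRP supplies the
# (unbounded) number of components.
```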
Procedia PDF Downloads 330
40508 Identifying the Faces of Colonialism: An Analysis of Gender Inequalities in Economic Participation in Pakistan through Postcolonial Feminist Lens
Authors: Umbreen Salim, Anila Noor
Abstract:
This paper analyses the influences and faces of colonialism in women’s participation in economic activity in postcolonial Pakistan, through a postcolonial feminist economic lens. It is an attempt to probe the shifts in gender inequalities across three stages: pre-colonial, colonial, and postcolonial times in the Indo-Pak subcontinent. It delves into an inquiry into the pre-colonial period, as it is imperative to understand the situation and context before colonisation in order to assess the deviations associated with its onset. Hence, in order to trace gender inequalities, this paper analyses the Mughal Era (1526-1757) that existed before British colonisation; then the gender inequalities that existed during British colonisation (1857-1947), and the associated dynamics and changes in women’s vulnerabilities to participate in the economy; and, finally, the postcolonial (1947 onwards) scenario of discrimination and oppression faced by women. As part of the research methodology, primary and secondary data analyses were conducted. Analysis of secondary data, including literary works and photographs, was carried out, followed by primary data collection using ethnographic approaches and participatory tools to understand the presence of coloniality and the gender inequalities embedded in the social structure through participants’ real-life stories. The data are analysed using feminist postcolonial analysis. Intersectionality has been a key tool of analysis, as the paper delves into gender inequalities through the lenses of class and caste, briefly touching on religion. It is imperative to mention the significance of the study and, very importantly, the practical challenges, as historical analysis of the 18th and 19th centuries is involved. Most of the available work on this history was produced by (a) men and (b) foreigners, mostly white authors. Since the historical analysis is mostly by men, the gender analysis presented misses many aspects of women’s issues, and since the authors have been mostly white Europeans, it carries what Mohanty calls an ‘under Western eyes’ perspective. The edge of this paper, by contrast, is the authors’ deep attachment and belongingness as lived reality and their work with women in Pakistan as postcolonial subjects: a better position from which to relate to the social reality and understand the phenomenon. The study produced some key results. Gender inequalities existed before colonisation, when women were the hidden wheel of a stable economy and remained completely invisible. During British colonisation, the vulnerabilities of women only increased, and their inferiority of status relative to men was further strengthened. Today, the postcolonial woman lives amid the deep-rooted effects of coloniality, divided by class and position within class, and she has to face gender inequalities within the household and in the market for economic participation. Gender inequalities have existed in pre-colonial, colonial, and postcolonial times in Pakistan, with varying dynamics, degrees, and intensities for women, whereby social class, caste, and religion have been key factors defining the extent of discrimination and oppression. Colonialism may have physically ended, but coloniality remains and has deep, broad, and wide effects in increasing gender inequalities in women’s participation in the economy in Pakistan.
Keywords: colonialism, economic participation, gender inequalities, women
Procedia PDF Downloads 208
40507 Considering Partially Developed Artifacts in Change Impact Analysis Implementation
Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim
Abstract:
It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes causes delays in completion and incurs additional cost. One type of information that helps in making this decision comes from change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed and use the class artifact as the source of analysis. However, these assumptions are impractical for impact analysis in the software development phase, as some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.
Keywords: software development, impact analysis, traceability, static analysis
Procedia PDF Downloads 608
40506 Life Cycle Assessment (LCA) Study of Shrimp Fishery in the South East Coast of Arabian Sea
Authors: Leela Edwin, Rithin Joseph, P. H. Dhiju Das, K. A. Sayana, P. S. Muhammed Sherief
Abstract:
The shrimp trawl fishery is considered one of the more valuable fisheries on the southeast coast of the Arabian Sea. Inventory data for the shrimp fishery were collected over a one-year period and used to carry out a life cycle assessment (LCA). The LCA was performed to assess and compare the environmental impacts associated with fishing operations related to the shrimp fishery. This analysis included the operation of the vessels, together with major inputs related to the production of diesel, trawl nets, and anti-fouling paints. Data regarding vessel operation were obtained from detailed questionnaires filled out by 180 trawlers. The analysis of environmental impacts linked to shrimp extraction on a temporal scale showed that varying landings enhanced the environmental burdens mainly associated with activities related to diesel production, transport, and consumption by the fishing vessels. Discard rates for trawlers were also identified as a major environmental impact in this fishery.
Keywords: shrimp trawling, life cycle assessment (LCA), Arabian Sea, environmental impacts
Procedia PDF Downloads 322
40505 Determination of Hydrocarbon Path Migration from Gravity Data Analysis (Ghadames Basin, Southern Tunisia, North Africa)
Authors: Mohamed Dhaoui, Hakim Gabtni
Abstract:
The migration of hydrocarbons is a fairly complicated process that depends on several parameters, both structural and sedimentological. In this study, we try to determine the secondary migration paths that convey hydrocarbons from their main source rock to the largest reservoir of the Paleozoic petroleum system of the Tunisian part of the Ghadames basin. The Silurian source rock is the main source rock of the Paleozoic petroleum system of the Ghadames basin, while the most solicited reservoir in this area is the Triassic TAGI reservoir (Trias Argilo-Gréseux Inférieur). Several geochemical studies have confirmed that the oil products of the TAGI come mainly from the Tannezuft Silurian source rock, which implies that secondary migration occurs through the fault system affecting the post-Silurian series. Our study is based on the analysis and interpretation of gravity data. Gravity modeling was conducted in the northern part of the Ghadames basin and the Telemzane uplift. We noted that there is a close relationship between the locations of producing oil fields and the gravity gradients that separate the positive and negative gravity anomalies. In fact, the analysis and transformation of the Bouguer anomaly map and the residual gravity map allowed us to understand the architecture of the Precambrian basement in the study area; thereafter, gravimetric models were established that allowed the probable migration paths to be determined.
Keywords: basement, Ghadames, gravity, hydrocarbon, migration path
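A common way to obtain the residual gravity map mentioned above is to subtract a low-order polynomial "regional" surface fitted to the Bouguer anomaly; a numpy sketch with synthetic station data (the real workflow would also involve gridding and filtering):

```python
import numpy as np

def regional_residual(x, y, bouguer, degree=2):
    """Fit a low-order polynomial surface (the regional field) to Bouguer anomaly
    values by least squares; the residual highlights local sources and gradients."""
    terms = [x**i * y**j for i in range(degree + 1)
                         for j in range(degree + 1 - i)]
    A = np.column_stack(terms)
    coeffs, *_ = np.linalg.lstsq(A, bouguer, rcond=None)
    regional = A @ coeffs
    return regional, bouguer - regional

rng = np.random.default_rng(11)
x = rng.uniform(0, 100, 400)                # station easting (km), synthetic
y = rng.uniform(0, 100, 400)                # station northing (km), synthetic
bouguer = 0.05 * x - 0.02 * y - 30 + rng.normal(0, 0.5, 400)  # toy anomaly (mGal)

regional, residual = regional_residual(x, y, bouguer)
print(f"residual range: {residual.min():.2f} to {residual.max():.2f} mGal")
```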
Procedia PDF Downloads 366
40504 Evidence of Climate Change from Statistical Analysis of Temperature and Rainfall Data of Kaduna State, Nigeria
Authors: Iliya Bitrus Abaje
Abstract:
This study examines the evidence of climate change in Kaduna State from the analysis of temperature and rainfall data (1976-2015) from three meteorological stations along a geographic transect from the southern to the northern part of the State. Different statistical methods were used to determine the changes in both the temperature and rainfall series. The linear trend lines revealed a mean increase in average temperature of 0.73 °C over the 40-year study period in the State. The plotted standard deviations of the temperature anomalies generally revealed that, in the last two decades (1996-2005 and 2006-2015), years with temperatures above the mean standard deviation (hotter than normal conditions) outnumbered those below (colder than normal conditions). Cramer's test and Student's t-test generally revealed an increasing temperature trend in recent decades. The increase in temperature is evidence that the earth's atmosphere has been getting warmer in recent years. The linear trend line equation of the annual rainfall for the period of study showed a mean increase of 316.25 mm for the State. Findings also revealed that the plotted standard deviations of the rainfall anomalies, and the 10-year non-overlapping and 30-year overlapping sub-period analyses at all three stations, generally showed an increasing trend from the beginning of the record to recent years. This is evidence that the study area is now experiencing wetter conditions in recent years and hence climate change. The study recommends diversification of the economic base of the populace, with emphasis on moving away from activities that are sensitive to temperature and rainfall extremes. Also, appropriate strategies to ameliorate the scourge of climate change at all levels/sectors should always take into account the recent changes in temperature and rainfall amounts in the area.
Keywords: anomalies, linear trend, rainfall, temperature
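A minimal sketch of the two simplest analyses described (the linear trend line and the standardised anomalies), using scipy on hypothetical station data; Cramer's test and the sub-period analyses are not shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1976, 2016)
# Hypothetical mean annual temperatures (degC) with a warming trend
temps = 25.0 + 0.018 * (years - 1976) + rng.normal(0, 0.3, years.size)

# Linear trend line, as fitted in the study
res = stats.linregress(years, temps)
print(f"trend: {res.slope * 40:.2f} degC over 40 years (p = {res.pvalue:.4f})")

# Standardised anomalies: years above +1 sd are 'hotter than normal'
z = (temps - temps.mean()) / temps.std(ddof=1)
print("hotter-than-normal years:", years[z > 1.0])
```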
Procedia PDF Downloads 318
40503 Estimating Destinations of Bus Passengers Using Smart Card Data
Authors: Hasik Lee, Seung-Young Kho
Abstract:
Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, the smart card data from many cities do not contain alighting information, which is necessary to build origin-destination (OD) matrices. Therefore, in order to utilize smart card data, the destinations of passengers should be estimated. In this paper, kernel density estimation was used to forecast the probabilities of the alighting stations of bus passengers, and the method was applied to smart card data from Seoul, Korea, which contains both boarding and alighting information. The method was then validated against the actual data. In some cases, the stochastic method was more accurate than the deterministic method. It is therefore sufficiently accurate to be used to build OD matrices.
Keywords: destination estimation, kernel density estimation, smart card data, validation
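A minimal sketch of the estimation idea: fit a kernel density to observed alighting positions for a given boarding stop (learned from records that do include alighting), then score candidate downstream stops for trips with missing alighting data. The route positions below are hypothetical:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical alighting positions (km along the route) observed for one
# boarding stop, taken from records that do contain alighting information.
alightings = np.array([2.1, 2.3, 2.2, 4.8, 5.0, 5.1, 5.2, 7.9, 8.1, 5.0])

kde = gaussian_kde(alightings)

# Candidate downstream stops for a card with a missing alighting record:
# evaluate the density at each stop and normalise to discrete probabilities.
stops = np.array([2.2, 5.0, 8.0, 10.5])
weights = kde(stops)
probs = weights / weights.sum()
for s, p in zip(stops, probs):
    print(f"stop at {s:4.1f} km: P(alight) = {p:.2f}")
```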
Procedia PDF Downloads 352
40502 Bioinformatics High Performance Computation and Big Data
Authors: Javed Mohammed
Abstract:
Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the crazy amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even simulating an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data, and their indispensability in meeting the scientific and engineering challenges of the twenty-first century. It discusses how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. The paper also indicates solutions to optimization problems and the benefits for Big Data and computational biology, illustrating the current state of the art and future generations of HPC computing with Big Data.
Keywords: high performance, big data, parallel computation, molecular data, computational biology
Procedia PDF Downloads 363
40501 Data Analysis Tool for Predicting Water Scarcity in Industry
Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse
Abstract:
Water is a fundamental resource for industry. It is taken from the environment either from municipal distribution networks or from various natural water sources such as the sea, the ocean, rivers, aquifers, etc. Once used, water is discharged into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts can concern the quantity of water available or the quality of the water used, or they can be more complex to measure and less direct, such as the health of the population downstream of the watercourse, for example. Based on the analysis of data (meteorological data, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectoral decrees, which can impact plant performance; propose improvement solutions; help industrialists choose the location for a new plant; visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions; and set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires its specific functional constraints to be made explicit. Thus, the system has to be able to store a large amount of data from sensors (the main type of data in plants and their environment). In addition, manufacturers need 'near-real-time' processing of information in order to be able to make the best decisions (to be rapidly notified of an event that would have a significant impact on water resources). Finally, the visualization of the data must be adapted to its temporal and geographical dimensions. In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), a set of loosely coupled but tightly integrated open-source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the case study of learning the level of a river with a 7-day horizon. The management of water, and of the activities within the plants that depend on this resource, should be considerably improved thanks, on the one hand, to the learning that allows the anticipation of periods of water stress, and on the other hand, to the information system, which is able to warn decision-makers with alerts created from the formalization of prefectoral decrees.
Keywords: data mining, industry, machine learning, shortage, water resources
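A minimal sketch of the 7-day river-level learning task mentioned above, using lag features and a random forest on synthetic daily levels; the project's actual features and model are not specified in the abstract, so everything here is an assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# Hypothetical daily river levels (m): seasonal cycle plus noise
days = np.arange(3 * 365)
level = 2.0 + 0.8 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 0.1, days.size)

# Lag features: the last 14 daily levels predict the level 7 days ahead
horizon, lags = 7, 14
X = np.array([level[i - lags:i] for i in range(lags, days.size - horizon)])
y = level[lags + horizon:]

split = int(0.8 * len(X))            # chronological train/test split
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:split], y[:split])

mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
print(f"7-day-ahead MAE: {mae:.3f} m")
```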
Procedia PDF Downloads 121
40500 Offshore Facilities Load Out: Case Study of Jacket Superstructure Loadout by Strand Jacking Skidding Method
Authors: A. Rahim Baharudin, Nor Arinee binti Mat Saaud, Muhammad Afiq Azman, Farah Adiba A. Sani
Abstract:
Objectives: This paper shares a case study on the engineering analysis, data analysis, and real-time data comparison carried out to qualify the strand wires' minimum breaking load (MBL) and safe working load (SWL) for the loadout operation of a new project and, at the same time, to eliminate the risk arising from discrepancies and misalignment between COMPANY Technical Standards and Industry Standards and Practices. The paper demonstrates "lean construction" for the COMPANY's project by sustaining fit-for-purpose technical requirements for the loadout strand wire factor of safety (F.S). The case study utilizes historical engineering data from several loadout operations by skidding methods across different projects. It also demonstrates and qualifies the skidding wires' minimum breaking load and safe working load to be used in future loadout operations for substructures and other facilities. Methods: Engineering analysis and comparison of data were carried out with reference to international standards and internal COMPANY standard requirements. Data were taken from nine (9) previous projects, covering both topsides and jacket facilities executed at several local fabrication yards, where loadout was conducted by three (3) different service providers, with emphasis on four (4) basic elements: i) Industry standards for loadout engineering and operation: the COMPANY internal standard referred to the superseded documents DNV-OS-H201 and DNV/GL 0013/ND; DNV/GL 0013/ND and DNVGL-ST-N001 do not mention any requirement of a strand wire F.S of 4.0 for skidding/pulling operations. ii) Reference to past loadout engineering and execution packages: reference was made to projects delivered by three (3) major offshore facilities operators; the strand wire F.S observed ranges from 2.0 MBL (min) to 2.5 MBL (max), and no loadout operation using the requirement of 4.0 MBL was sighted in the references. iii) Strand jack equipment manufacturer datasheets: referring to the strand jack equipment datasheets of different loadout service providers, the designed F.S for the equipment also ranges between 2.0 and 2.5; eight (8) strand jack datasheet models were referred to, ranging from 15 Mt to 850 Mt capacity, and no designed F.S of 4.0 was observed. iv) Site monitoring of actual loadout data and parameters: the maximum load on a strand wire was captured during the 2nd breakout, under static conditions, at 12.9 Mt per strand wire (67.9% utilization); the maximum load under dynamic conditions, during steps 8 and 12, was 9.4 Mt per strand wire (49.5% utilization). Conclusion: This analysis and study demonstrated that the strand wires supplied by the service provider were technically sufficient in terms of strength, and the engineering analysis showed that the minimum breaking load and safe working load calculated and utilized for the projects were satisfied and operated safely. It is recommended from this study that the COMPANY's technical requirements be revised for utilization in future projects.
Keywords: construction, load out, minimum breaking load, safe working load, strand jacking, skidding
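As a back-of-envelope check, the utilizations quoted in iv) are consistent with a safe working load of about 19 t per strand wire; the decomposition below into an assumed MBL of 38 Mt and an F.S of 2.0 is illustrative only, since the abstract does not state the wire's actual MBL:

```python
# Back-of-envelope check of the utilization figures quoted in the abstract.
mbl = 38.0   # minimum breaking load per strand wire (Mt) -- assumed value
fs = 2.0     # factor of safety in line with the industry references cited
swl = mbl / fs   # safe working load per strand wire (19 Mt)

for label, load in [("2nd breakout (static)", 12.9), ("dynamic steps 8/12", 9.4)]:
    utilization = 100 * load / swl
    print(f"{label}: {load:.1f} Mt/wire -> {utilization:.1f}% of SWL")
# Prints 67.9% and 49.5%, matching the site-monitoring figures reported above.
```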
Procedia PDF Downloads 112