Search results for: big data in higher education
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35169


22569 Measuring Environmental Efficiency of Energy in OPEC Countries

Authors: Bahram Fathi, Seyedhossein Sajadifar, Naser Khiabani

Abstract:

Data envelopment analysis (DEA) has recently gained popularity in energy efficiency analysis. A common feature of the previously proposed DEA models for measuring energy efficiency performance is that they treat energy consumption as an input within a production framework without considering undesirable outputs. However, energy use results in the generation of undesirable outputs as byproducts of producing desirable outputs. Within a joint production framework of both desirable and undesirable outputs, this paper presents several DEA-type linear programming models for measuring energy efficiency performance. In addition to considering undesirable outputs, our models treat different energy sources as different inputs so that changes in energy mix could be accounted for in evaluating energy efficiency. The proposed models are applied to measure the energy efficiency performances of 12 OPEC countries and the results obtained are presented.
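
For illustration, the following is a minimal Python sketch of an input-oriented DEA efficiency score of the kind the abstract describes. The data are simulated, and undesirable outputs are simply treated like inputs, which is only one simplified way of handling the joint production framework; the paper's exact models are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def dea_energy_efficiency(X, Y, B, k):
    """Input-oriented DEA score for DMU k: minimise theta subject to an
    envelopment of inputs X (energy sources), desirable outputs Y, and
    undesirable outputs B (treated here like inputs for simplicity)."""
    n, m = X.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # decision vector: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):                          # sum(lambda * x_i) <= theta * x_ik
        A_ub.append(np.r_[-X[k, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(Y.shape[1]):                 # sum(lambda * y_r) >= y_rk
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[k, r])
    for q in range(B.shape[1]):                 # sum(lambda * b_q) <= theta * b_qk
        A_ub.append(np.r_[-B[k, q], B[:, q]])
        b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]                             # efficiency score in (0, 1]

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, (12, 3))   # e.g. oil, gas, electricity use of 12 countries
Y = rng.uniform(1, 10, (12, 1))   # desirable output (e.g. GDP)
B = rng.uniform(1, 10, (12, 1))   # undesirable output (e.g. CO2 emissions)
print([round(dea_energy_efficiency(X, Y, B, k), 3) for k in range(12)])
```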

Keywords: energy efficiency, undesirable outputs, data envelopment analysis

Procedia PDF Downloads 716
22568 Exploring Barriers and Pathways to Wellbeing and Sources of Resilience of Refugee Mothers in Calgary during the COVID-19 Pandemic: The Role of Home Instruction for Parents of Preschool Youngsters (HIPPY)

Authors: Chloe Zivot, Natasha Vattikonda, Debbie Bell

Abstract:

We conducted interviews with refugee mothers (n=28) participating in the Home Instruction for Parents of Preschool Youngsters (HIPPY) program in Calgary to explore experiences of wellbeing and resilience during the COVID-19 pandemic. Disruptions to education, increased isolation, and increased parental duties contributed to decreased wellbeing. Mothers identified tangible protective factors at the micro, meso, and macro levels. HIPPY played a substantial role in pandemic resilience, speaking to the potential of home-based intervention models in mitigating household adversity.

Keywords: refugee resettlement, family wellbeing, COVID-19, motherhood, resilience, gender, health

Procedia PDF Downloads 189
22567 Big Data and Cardiovascular Healthcare Management: Recent Advances, Future Potential and Pitfalls

Authors: Maariyah Irfan

Abstract:

Intro: Current cardiovascular (CV) care faces challenges such as low budgets and high hospital admission rates. This review aims to evaluate Big Data in CV healthcare management through the use of wearable devices in atrial fibrillation (AF) detection. AF may present intermittently; thus, it is difficult for a healthcare professional to capture and diagnose a symptomatic rhythm. Methods: The iRhythm ZioPatch, AliveCor portable electrocardiogram (ECG), and Apple Watch were chosen for review due to their involvement in controlled clinical trials and their integration with smartphones. The cost-effectiveness and AF detection of these devices were compared against the 12-lead ambulatory ECG (Holter monitor) that the NHS currently employs for the detection of AF. Results: The Zio patch was found to detect more arrhythmic events than the Holter monitor over a 2-week period. When patients presented to the emergency department with palpitations, AliveCor portable ECGs detected 6-fold more symptomatic events compared to the standard care group over 3 months. Based on preliminary results from the Apple Heart Study, only 0.5% of participants received irregular pulse notifications from the Apple Watch. Discussion: The Zio Patch and AliveCor devices have promising potential to be implemented into the standard duty of care offered by the NHS as they compare well to current routine measures. Nonetheless, companies must address the discrepancy between their target population and current consumers, as those who could benefit the most from the innovation may be left out due to cost and access.

Keywords: atrial fibrillation, big data, cardiovascular healthcare management, wearable devices

Procedia PDF Downloads 122
22566 Theoretical Approach of Maritime Transport Sector’s Specialist’s Resilience Enhancement

Authors: Elena Valionienė, Genutė Kalvaitienė

Abstract:

The issue of resilience of an individual, an organisation, or an entire ecosystem of organisations has recently become an integral part of the education system, where the uncertainties that drive societal development in the short term create economic, social, and psycho-emotional instability. The Maritime Transport Sector (MTS) is no exception, and the aim of the article is to model the possibilities of enhancing the professional, sociocultural, and psycho-emotional resilience of MTS specialists to proactively respond to crises caused by uncertainties. The research consists of the creation of a theoretical model that helps to identify general maritime business resilience factors and critical success factors, which can help develop high resilience and achieve business excellence in a highly volatile, uncertain, complex, and ambiguous (VUCA) environment.

Keywords: maritime transport sector, resilience, uncertainties, VUCA

Procedia PDF Downloads 61
22565 CPPI Method with Conditional Floor: The Discrete Time Case

Authors: Hachmi Ben Ameur, Jean Luc Prigent

Abstract:

We propose an extension of the CPPI method, which is based on conditional floors. In this framework, we examine in particular the TIPP and margin-based strategies. These methods allow keeping part of the past gains and protecting the portfolio value against future high drawdowns of the financial market. However, as for the standard CPPI method, the investor can benefit from potential market rises. To control the risk of such strategies, we introduce both Value-at-Risk (VaR) and Expected Shortfall (ES) risk measures. For each of these criteria, we show that the conditional floor must be higher than a lower bound. We illustrate these results for a quite general ARCH-type model, including the EGARCH(1,1) as a special case.
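
A minimal discrete-time sketch of a TIPP-style CPPI strategy with a ratcheted (conditional) floor. The multiplier, floor fraction, and Gaussian returns are illustrative assumptions; the VaR/ES lower bounds derived in the paper are not reproduced here.

```python
import numpy as np

def tipp_cppi(risky_returns, multiplier=4.0, floor_frac=0.9, rf=0.0):
    """Each period the floor is ratcheted up to floor_frac of the current
    portfolio value (TIPP), locking in part of past gains; the risky
    exposure is multiplier * cushion, capped at the portfolio value."""
    value = 1.0
    floor = floor_frac * value
    path = [value]
    for r in risky_returns:
        floor = max(floor * (1 + rf), floor_frac * value)   # conditional floor
        cushion = max(value - floor, 0.0)
        exposure = min(multiplier * cushion, value)
        value = exposure * (1 + r) + (value - exposure) * (1 + rf)
        path.append(value)
    return np.array(path)

rng = np.random.default_rng(0)
path = tipp_cppi(rng.normal(0.0005, 0.01, 250))   # placeholder for (E)GARCH-type returns
print(path[-1], path.min())
```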

Keywords: CPPI, conditional floor, ARCH, VaR, expected shortfall

Procedia PDF Downloads 287
22564 On-Plot Piping Corrosion Analysis for Gas and Oil Separation Plants (GOSPs)

Authors: Sultan A. Al Shaqaq

Abstract:

Corrosion is a serious challenge for the piping systems in our Gas and Oil Separation Plants (GOSPs), causing piping failures. Two GOSPs (Plant-A and Plant-B) have experienced chronic corrosion issues in their on-plot piping systems, leading to increased piping replacement over the past years. Since it is almost impossible to avoid corrosion, it is becoming more obvious that managing the corrosion level may be the most economical resolution. Corrosion engineers are thus increasingly involved in estimating the cost of their corrosion-prevention solutions and assessing the useful life of the equipment. This case study covers the background of internal and external piping corrosion encountered in these two GOSPs. Piping replacement data collected from 2011 to 2014 are covered. These data show the recurrent corrosion levels in the on-plot piping system. The study also includes the total piping replacement for the drain line system and other service lines in the two plants (Plant-A and Plant-B) at the Saudi Aramco facility.

Keywords: gas and oil separation plant, on-plot piping, drain lines, Saudi Aramco

Procedia PDF Downloads 315
22563 Assessment of Groundwater Quality in Kaltungo Local Government Area of Gombe State

Authors: Rasaq Bello, Grace Akintola Sunday, Yemi Sikiru Onifade

Abstract:

Groundwater is required for the continuity of life and the sustainability of the ecosystem. Hence, this research set out to assess groundwater quality for domestic use in Kaltungo Local Government Area, Gombe State. The work was also aimed at determining the thickness and resistivity of the topsoil, areas suitable for borehole construction, and the quality and potential of groundwater in the study area. The study area extends from N10°15′38″, E11°08′01″ to N10°19′29″, E11°13′05″. The data were acquired using the Vertical Electrical Sounding (VES) method and processed using IP12win software. Twenty (20) Vertical Electrical Soundings were carried out with a maximum current electrode separation (AB) of 150 m. The VES curves generated from the data reveal that all the VES points have five to six subsurface layers. The first layer has resistivity values of 7.5 to 364.1 Ωm and thicknesses of 0.8 to 7.4 m; the second layer, 1.8 to 600.3 Ωm and 2.6 to 31.4 m; the third layer, 23.3 to 564.4 Ωm and 10.3 to 77.8 m; the fourth layer, 19.7 to 640.2 Ωm and 8.2 to 120.0 m; the fifth layer, 27 to 234 Ωm and 8.2 to 53.7 m; and the sixth layer extends beyond the probing depth. The VES curves revealed a KQHA curve type for VES 1, an HKQQ curve for VES 4, an HKQ curve for VES 5, a KHA curve for VES 11, a QQHK curve for VES 12, an HAA curve for VES 6 and VES 19, an HAKH curve for VES 7, VES 8, VES 10 and VES 18, and an HKH curve for VES 2, VES 3, VES 9, VES 13, VES 14, VES 15, VES 16, VES 17 and VES 20. Values of the coefficient of anisotropy, reflection coefficient, and resistivity contrast obtained from the Dar-Zarrouk parameters indicated good water prospects for all the VES points in this study, with VES points 4, 9 and 18 having the highest prospects for groundwater exploration.
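
A short sketch of how the secondary (Dar-Zarrouk) parameters mentioned above can be computed from a layered resistivity model; the layer values below are placeholders, not the interpreted VES results.

```python
import numpy as np

def dar_zarrouk(resistivities, thicknesses):
    """Longitudinal conductance S, transverse resistance T, and coefficient
    of anisotropy for the layers above the terminating half-space."""
    rho = np.asarray(resistivities, dtype=float)
    h = np.asarray(thicknesses, dtype=float)
    S = np.sum(h / rho)                      # longitudinal conductance (siemens)
    T = np.sum(h * rho)                      # transverse resistance (ohm-m^2)
    anisotropy = np.sqrt(S * T) / np.sum(h)  # coefficient of anisotropy
    return S, T, anisotropy

def interface_parameters(rho_last, rho_previous):
    """Reflection coefficient and resistivity contrast at the deepest interface."""
    reflection = (rho_last - rho_previous) / (rho_last + rho_previous)
    contrast = rho_last / rho_previous
    return reflection, contrast

# placeholder five-layer model (resistivities in ohm-m, thicknesses in m)
print(dar_zarrouk([120, 45, 300, 80, 150], [2.0, 10.0, 25.0, 40.0, 20.0]))
print(interface_parameters(150, 80))
```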

Keywords: formation parameters, groundwater, resistivity, resistivity contrast, vertical electrical sounding

Procedia PDF Downloads 31
22562 Enabling Citizen Participation in Urban Planning through Geospatial Gamification

Authors: Joanne F. Hayek

Abstract:

This study explores the use of gamification to promote citizen e-participation in urban planning. The research is based on a case study: the ‘Shape Your City’ web app designed and programmed by the author and presented as part of the 2021 Dubai Design Week to engage citizens in the co-creation of the future of their city through a gamified experience. The paper documents the design and development methodology of the web app and concludes with the findings of its pilot release. The case study explores the use of mobile interactive mapping, real-time data visualization, augmented reality, and machine learning as tools to enable co-planning. The paper also details the user interface design strategies employed to integrate complex cross-sector e-planning systems and make them accessible to citizens.

Keywords: gamification, co-planning, citizen e-participation, mobile interactive mapping, real-time data visualization

Procedia PDF Downloads 129
22561 Nonparametric Quantile Regression for Multivariate Spatial Data

Authors: S. H. Arnaud Kanga, O. Hili, S. Dabo-Niang

Abstract:

Spatial prediction is an appealing issue that attracts several fields such as agriculture, environmental sciences, ecology, econometrics, and many others. Although multiple non-parametric prediction methods exist for spatial data, they are based on the conditional expectation. This paper takes a different approach by examining a non-parametric spatial predictor of the conditional quantile. The study considers a stationary multidimensional spatial process observed over a rectangular domain. Indeed, the proposed quantile is obtained by inverting the conditional distribution function. Furthermore, the proposed estimator of the conditional distribution function depends on three kernels, where one of them controls the distance between spatial locations, while the other two control the distance between observations. In addition, the almost complete convergence and the convergence in mean of order q of the kernel predictor are obtained when the sample considered is alpha-mixing. This approach to prediction has the advantage of accuracy, as it overcomes sensitivity to extreme values and outliers.
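
A rough numerical sketch of the idea on simulated data: the conditional distribution function is estimated with a product of three kernels (one on the spatial distance, two on the observations) and then inverted at level tau. The Gaussian kernels, bandwidths, and simulated field are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def kernel_conditional_quantile(s0, x0, sites, X, Y, tau=0.5, h_s=1.0, h_x=1.0, h_y=0.1):
    gauss = lambda u: np.exp(-0.5 * u ** 2)
    # kernel on spatial locations times kernel on the covariate observations
    w = gauss(np.linalg.norm(sites - s0, axis=1) / h_s) * gauss((X - x0) / h_x)
    grid = np.linspace(Y.min(), Y.max(), 512)
    # smoothed conditional CDF: the third kernel acts on the response observations
    F = np.array([np.sum(w * norm.cdf((y - Y) / h_y)) for y in grid]) / np.sum(w)
    idx = min(np.searchsorted(F, tau), grid.size - 1)
    return grid[idx]                          # invert the estimated conditional CDF

rng = np.random.default_rng(0)
sites = rng.uniform(0, 10, (200, 2))          # spatial locations on a rectangle
X = np.sin(sites[:, 0]) + 0.1 * rng.normal(size=200)
Y = 2 * X + 0.5 * rng.normal(size=200)
print(kernel_conditional_quantile(np.array([5.0, 5.0]), 0.3, sites, X, Y, tau=0.9))
```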

Keywords: conditional quantile, kernel, nonparametric, stationary

Procedia PDF Downloads 138
22560 Disparities in Language Competence and Conflict: The Moderating Role of Cultural Intelligence in Intercultural Interactions

Authors: Catherine Peyrols Wu

Abstract:

Intercultural interactions are becoming increasingly common in organizations and life. These interactions are often the stage for miscommunication and conflict. In management research, these problems are commonly attributed to cultural differences in values and interactional norms. As a result, the notion that intercultural competence can minimize these challenges is widely accepted. Cultural differences, however, are not the only source of a challenge during intercultural interactions. The need to rely on a lingua franca – or common language between people who have different mother tongues – is another important one. In theory, a lingua franca can improve communication and ease coordination. In practice, however, disparities in people’s ability and confidence to communicate in the language can exacerbate tensions and generate inefficiencies. In this study, we draw on power theory to develop a model of disparities in language competence and conflict in a multicultural work context. Specifically, we hypothesized that differences in language competence between interaction partners would be positively related to conflict such that people would report greater conflict with partners who have more dissimilar levels of language competence and lesser conflict with partners with more similar levels of language competence. Furthermore, we proposed that cultural intelligence (CQ), an intercultural competence that denotes an individual’s capability to be effective in intercultural situations, would weaken the relationship between disparities in language competence and conflict such that people would report less conflict with partners who have more dissimilar levels of language competence when the interaction partner has high CQ and more conflict when the partner has low CQ. We tested this model with a sample of 135 undergraduate students working in multicultural teams for 13 weeks. We used a round-robin design to examine conflict in 646 dyads nested within 21 teams. Results of analyses using social relations modeling provided support for our hypotheses. Specifically, we found that in intercultural dyads with large disparities in language competence, partners with the lowest level of language competence reported higher levels of interpersonal conflict. However, this relationship disappeared when the partner with higher language competence was also high in CQ. These findings suggest that communication in a lingua franca can be a source of conflict in intercultural collaboration when partners differ in their level of language competence and that CQ can alleviate these effects during collaboration with partners who have relatively lower levels of language competence. Theoretically, this study underscores the benefits of CQ as a complement to language competence for intercultural effectiveness. Practically, these results further attest to the benefits of investing resources to develop language competence and CQ in employees engaged in multicultural work.

Keywords: cultural intelligence, intercultural interactions, language competence, multicultural teamwork

Procedia PDF Downloads 153
22559 Particle Swarm Optimization and Quantum Particle Swarm Optimization to Multidimensional Function Approximation

Authors: Diogo Silva, Fadul Rodor, Carlos Moraes

Abstract:

This work compares the results of multidimensional function approximation using two algorithms: the classical Particle Swarm Optimization (PSO) and the Quantum Particle Swarm Optimization (QPSO). These algorithms were both tested on three functions - The Rosenbrock, the Rastrigin, and the sphere functions - with different characteristics by increasing their number of dimensions. As a result, this study shows that the higher the function space, i.e. the larger the function dimension, the more evident the advantages of using the QPSO method compared to the PSO method in terms of performance and number of necessary iterations to reach the stop criterion.
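
For reference, a minimal sketch of the classical PSO loop on the Rastrigin function (one of the three test functions mentioned); the swarm size, coefficients, and absence of velocity clamping are illustrative simplifications. QPSO differs mainly in that particle positions are sampled around a mean-best attractor instead of being driven by velocity updates.

```python
import numpy as np

def rastrigin(x):
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def pso(dim=10, n_particles=30, iters=500, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Classical PSO: velocity update with inertia, cognitive and social terms."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.12, 5.12, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([rastrigin(p) for p in x])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([rastrigin(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()]
    return gbest, pbest_f.min()

best_x, best_f = pso()
print(best_f)   # smaller is better; 0 is the global optimum
```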

Keywords: PSO, QPSO, function approximation, AI, optimization, multidimensional functions

Procedia PDF Downloads 562
22558 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can only use content words (nouns, verbs, and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles) around content words (nouns, verbs, and adjectives) using a combination of Natural Language Processing and Deep Learning algorithms, which can be used to assist communication. The approach the paper investigates is LSTMs or Seq2Seq: a sequence-to-sequence (seq2seq) or LSTM model takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example containing a pair such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to get just the content words; however, a lot of training data is required to produce coherent output. The assumption of this approach is that the content words received as input are to be preserved, i.e., they will not be altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such word order might not be inherently correct. The approach can therefore be used to assist communication in mild agrammatism in non-fluent aphasia cases. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Thus, our project translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
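
As a small illustration of the data-generation step described above (a hypothetical sketch, with a made-up function-word list rather than a full part-of-speech tagger), complete sentences are stripped of function words to form (content words, complete sentence) training pairs for a seq2seq model:

```python
import re

# a tiny illustrative stop list; a real system would use POS tagging instead
FUNCTION_WORDS = {
    "a", "an", "the", "and", "or", "but", "of", "in", "on", "at", "to",
    "for", "with", "is", "are", "was", "were", "be", "been", "by", "from",
}

def make_training_pair(sentence: str):
    """Return (encoder input = content words, decoder target = full sentence)."""
    tokens = re.findall(r"[A-Za-z']+", sentence.lower())
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence

print(make_training_pair("The dog is running in the park."))
# ('dog running park', 'The dog is running in the park.')
```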

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 149
22557 Determining the Effectiveness of Dialectical Behavior Therapy in Reducing the Psychopathic Deviance of Criminals

Authors: Setareh Gerayeli

Abstract:

The present study tries to determine the effectiveness of dialectical behavior therapy in reducing the psychopathic deviance of employed criminals released from prison. The experimental method was used in this study, and the statistical population included employed criminals released from prison in Mashhad. Thirty offenders were selected randomly as the study sample. The MMPI-2 was used to collect data in the pre-test and post-test stages. The behavioral therapy was conducted with the experimental group during fourteen two-and-a-half-hour sessions, while the control group did not receive any intervention. Data analysis was conducted using analysis of covariance (ANCOVA). The results showed a significant difference between the post-test mean scores of the two groups. The findings suggest that dialectical behavior therapy is effective in reducing psychopathic deviance.

Keywords: criminals, dialectical behavior therapy, psychopathic deviance, prison

Procedia PDF Downloads 219
22556 Banks Profitability Indicators in CEE Countries

Authors: I. Erins, J. Erina

Abstract:

The aim of the present article is to determine the impact of the external and internal factors of bank performance on the profitability indicators of the CEE countries’ banks in the period from 2006 to 2012. On the basis of research conducted abroad on bank and macroeconomic profitability indicators, the authors evaluated the return on average assets (ROAA) and return on average equity (ROAE) indicators of the CEE countries’ banks. The authors analyzed the profitability indicators of banks using descriptive methods and SPSS data analysis methods, as well as data correlation and linear regression analysis. The authors concluded that most internal and external indicators of bank performance have no direct effect on the profitability of the banks in the CEE countries. The only exceptions are credit risk and bank size, which affect one of the measures of bank profitability – return on average equity.
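
For illustration only, a pooled OLS regression of ROAA on bank-level and macroeconomic indicators of the kind the study examines; the data are simulated and the variable names are assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "credit_risk": rng.normal(0.05, 0.02, n),   # e.g. loan-loss provisions / loans
    "bank_size":   rng.normal(9.0, 1.0, n),     # log of total assets
    "gdp_growth":  rng.normal(0.02, 0.03, n),
    "inflation":   rng.normal(0.03, 0.02, n),
})
df["ROAA"] = 0.8 - 5.0 * df["credit_risk"] + 0.05 * df["bank_size"] + rng.normal(0, 0.2, n)

X = sm.add_constant(df[["credit_risk", "bank_size", "gdp_growth", "inflation"]])
print(df.corr())                                # correlation analysis
print(sm.OLS(df["ROAA"], X).fit().summary())    # linear regression analysis
```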

Keywords: banks, CEE countries, profitability, ROAA, ROAE

Procedia PDF Downloads 353
22555 E-Learning Approaches Based on Artificial Intelligence Techniques: A Survey

Authors: Nabila Daly, Hamdi Ellouzi, Hela Ltifi

Abstract:

In recent years, several studies have focused on e-learning approaches with the goal of improving pedagogy and the assessment of students’ academic level. E-learning-related works have become an important research field nowadays due to several problems that make it impossible for students to join classrooms, especially in recent years. Among those problems, we note the current epidemic situation in the world caused by Covid-19. For those reasons, several e-learning-related works based on Artificial Intelligence techniques have been proposed to improve distance education. In the current paper, we present a short survey of the most relevant e-learning approaches based on Artificial Intelligence techniques, which have given birth to newly developed e-learning tools relying on new technologies.

Keywords: artificial intelligence techniques, decision, e-learning, support system, survey

Procedia PDF Downloads 208
22554 The Factors That Influence the Self-Sufficiency and the Self-Efficacy Levels among Oncology Patients

Authors: Esra Danaci, Tugba Kavalali Erdogan, Sevil Masat, Selin Keskin Kiziltepe, Tugba Cinarli, Zeliha Koc

Abstract:

This study was conducted in a descriptive and cross-sectional manner to determine the factors that influence the self-efficacy and self-sufficiency levels among oncology patients. The research was conducted between January 24, 2017 and September 24, 2017 in the oncology and hematology departments of a university hospital in Turkey with 179 voluntary inpatients. The data were collected through the Self-Sufficiency/Self-Efficacy Scale and a 29-question survey, which was prepared in order to determine the sociodemographic and clinical properties of the patients. The Self-Sufficiency/Self-Efficacy Scale is a Likert-type scale with 23 items. The scale scores range between 23 and 115. A high final score indicates a good self-sufficiency/self-efficacy perception for the individual. The data were analyzed using percentage analysis, one-way ANOVA, the Mann-Whitney U-test, the Kruskal-Wallis test, and the Tukey test. The demographic data of the subjects were as follows: 57.5% were male and 42.5% were female, 82.7% were married, 46.4% were primary school graduates, 36.3% were housewives, 19% were employed, 93.3% had social security, 52.5% had matching expenses and incomes, and 49.2% lived in the center of the city. The mean age was 57.1±14.6. It was determined that 22.3% of the patients had lung cancer, 19.6% had leukemia, and 43.6% had a good overall condition. The mean self-sufficiency/self-efficacy score was 83.00 (range 41-115). It was determined that the patients' self-sufficiency/self-efficacy scores were influenced by some of their socio-demographic and clinical properties. This study found that the patients had high self-sufficiency/self-efficacy scores. It is recommended that nursing care plans be developed to improve self-sufficiency/self-efficacy levels in the light of the patients' sociodemographic and clinical properties.

Keywords: oncology, patient, self-efficacy, self-sufficiency

Procedia PDF Downloads 154
22553 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data

Authors: Georgiana Onicescu, Yuqian Shen

Abstract:

Due to the complex nature of geo-referenced data, multicollinearity of the risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we proposed a two-stage variable selection method by extending the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors on health outcomes. Specifically, in stage I, we performed the variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we performed the model selection with only the selected variables from stage I and compared the methods again. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered the cases when all candidate risk factors are independently normally distributed, or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, the Binary indicator, and the combination of Binary indicator and Lasso, were considered and compared as alternative methods. The simulation results indicated that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both independent and dependent cases considered. When compared with the one-stage approach and the other two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.
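
A minimal, non-spatial stand-in for the two-stage idea on simulated data: stage I screens variables with an L1 (Lasso) penalty, and stage II refits a count model on the surviving variables only. The Bayesian spatial Lasso and the spatial structure used in the paper are not reproduced here; the data and names are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 300, 12
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)             # induce multicollinearity
y = rng.poisson(np.exp(0.3 * X[:, 0] - 0.5 * X[:, 2]))   # simulated count outcome

# Stage I: variable screening with Lasso on a working (log-count) response
lasso = LassoCV(cv=5).fit(X, np.log1p(y))
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)

# Stage II: refit a Poisson model using only the selected risk factors
model = sm.GLM(y, sm.add_constant(X[:, selected]), family=sm.families.Poisson()).fit()
print(selected, model.params)
```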

Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection

Procedia PDF Downloads 123
22552 Applications of Greenhouse Data in Guatemala in the Analysis of Sustainability Indicators

Authors: Maria A. Castillo H., Andres R. Leandro, Jose F. Bienvenido B.

Abstract:

In 2015, Guatemala officially adopted the Sustainable Development Goals (SDG) according to the 2030 Agenda agreed by the United Nations Organization. In 2016, these objectives and goals were reviewed, and the National Priorities were established within the K'atún 2032 National Development Plan. In 2019 and 2021, progress was evaluated with 120 defined indicators, and the need to improve quality and availability of statistical data necessary for the analysis of sustainability indicators was detected, so the values to be reached in 2024 and 2032 were adjusted. The need for greater agricultural technology is one of the priorities established within SDG 2 "Zero Hunger". Within this area, protected agricultural production provides greater productivity throughout the year, reduces the use of chemical products to control pests and diseases, reduces the negative impact of climate and improves product quality. During the crisis caused by Covid-19, there was an increase in exports of fruits and vegetables produced in greenhouses from Guatemala. However, this information has not been considered in the 2021 revision of the Plan. The objective of this study is to evaluate the information available on Greenhouse Agricultural Production and its integration into the Sustainability Indicators for Guatemala. This study was carried out in four phases: 1. Analysis of the Goals established for SDG 2 and the indicators included in the K'atún Plan. 2. Analysis of Environmental, Social and Economic Indicator Models. 3. Definition of territorial levels in 2 geographic scales: Departments and Municipalities. 4. Diagnosis of the available data on technological agricultural production with emphasis on Greenhouses at the 2 geographical scales. A summary of the results is presented for each phase and finally some recommendations for future research are added. The main contribution of this work is to improve the available data that allow the incorporation of some agricultural technology indicators in the established goals, to evaluate their impact on Food Security and Nutrition, Employment and Investment, Poverty, the use of Water and Natural Resources, and to provide a methodology applicable to other production models and other geographical areas.

Keywords: greenhouses, protected agriculture, sustainable indicators, Guatemala, sustainability, SDG

Procedia PDF Downloads 70
22551 A Design of Elliptic Curve Cryptography Processor based on SM2 over GF(p)

Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang

Abstract:

Data encryption is the foundation of today’s communication. On this basis, how to improve the speed of data encryption and decryption has always been a problem that scholars work on. In this paper, we propose an elliptic curve cryptographic processor architecture based on the SM2 prime field. In terms of hardware implementation, we optimized the algorithms in different stages of the structure. For the finite-field modular operations, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipelined structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure of point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-computation to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation took 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
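
To make the scalar multiplication step concrete, the following toy Python sketch shows affine point addition/doubling and left-to-right double-and-add over GF(p). The tiny curve below is illustrative only: it is not the SM2 curve, and none of the hardware optimisations from the paper (Karatsuba-Ofman multiplication, Jacobian coordinates, parallel point scheduling) are modelled.

```python
P = 0xFFFFFFFB            # a small prime (2**32 - 5), not SM2's 256-bit prime
A, B = (-3) % P, 5        # toy curve y^2 = x^3 + A*x + B over GF(P)

def point_add(p1, p2):
    """Affine addition/doubling; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:                                        # doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                                               # addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return x3, (lam * (x1 - x3) - y1) % P

def scalar_mult(k, point):
    """Left-to-right double-and-add: the operation the ECC processor accelerates."""
    result = None
    for bit in bin(k)[2:]:
        result = point_add(result, result)              # double
        if bit == "1":
            result = point_add(result, point)           # add
    return result

# given a base point G known to lie on the curve, scalar_mult(k, G) returns k*G
```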

Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication

Procedia PDF Downloads 78
22550 Transformation of Positron Emission Tomography Raw Data into Images for Classification Using Convolutional Neural Network

Authors: Paweł Konieczka, Lech Raczyński, Wojciech Wiślicki, Oleksandr Fedoruk, Konrad Klimaszewski, Przemysław Kopka, Wojciech Krzemień, Roman Shopa, Jakub Baran, Aurélien Coussat, Neha Chug, Catalina Curceanu, Eryk Czerwiński, Meysam Dadgar, Kamil Dulski, Aleksander Gajos, Beatrix C. Hiesmayr, Krzysztof Kacprzak, łukasz Kapłon, Grzegorz Korcyl, Tomasz Kozik, Deepak Kumar, Szymon Niedźwiecki, Dominik Panek, Szymon Parzych, Elena Pérez Del Río, Sushil Sharma, Shivani Shivani, Magdalena Skurzok, Ewa łucja Stępień, Faranak Tayefi, Paweł Moskal

Abstract:

This paper develops the transformation of non-image data into 2-dimensional matrices as a preparation stage for classification based on convolutional neural networks (CNNs). In positron emission tomography (PET) studies, a CNN may be applied directly to the reconstructed distribution of radioactive tracers injected into the patient's body, as a pattern recognition tool. Nonetheless, much PET data still exists in non-image format, and this fact opens the question of whether such data can be used for training CNNs. The main focus of this contribution is the problem of processing vectors with a small number of features in comparison to the number of pixels in the output images. The proposed methodology was applied to the classification of PET coincidence events.
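
Since the paper's exact transformation is not described here, the snippet below only illustrates the general idea under simple assumptions: a short coincidence-event feature vector is normalised, zero-padded, and reshaped into a 2-D matrix that a CNN can ingest.

```python
import numpy as np

def vector_to_image(features, size=16):
    """Turn a short feature vector into a size x size matrix for CNN input."""
    v = np.asarray(features, dtype=float)
    v = (v - v.min()) / (v.max() - v.min() + 1e-12)   # scale features to [0, 1]
    padded = np.zeros(size * size)
    padded[:v.size] = v                               # remaining pixels stay zero
    return padded.reshape(size, size)

event = [12.3, 8.1, 0.45, 3.2, 250.0, 251.7]   # made-up times/positions/energies
img = vector_to_image(event)                   # shape (16, 16), ready for a CNN
print(img.shape, img.max())
```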

Keywords: convolutional neural network, kernel principal component analysis, medical imaging, positron emission tomography

Procedia PDF Downloads 119
22549 Evaluation of the Actual Nutrition of Patients with Osteoporosis

Authors: Aigul Abduldayeva, Gulnar Tuleshova

Abstract:

Osteoporosis (OP) is a major socio-economic problem and a major cause of disability, reduced quality of life, and premature death in elderly people. In Astana, the study involved 93 respondents, of whom 17 were men (18.3%) and 76 were women (81.7%). The age distribution of the respondents was as follows: 40-59 (66.7%), 60-75 (29.0%), 75-90 (4.3%). In the city of Astana, a general breach of bone mass (CCM) was determined in 83.8% of the patients (nationwide figure, RRP: 79.0%), and normal ultrasound densitometry levels were detected in 16.1% (RRP 21.0%). OP was diagnosed in 20.4% of people over 40 (RRP for citizens: 19.0%), 25.4% in the group older than 50 (RRP 23.4%), 22.6% in the group older than 60 (RRP 32.6%), and 25.0% in the group older than 70 (RRP 47.6%). OPN was detected in 63.4% (RRP 59.6%) of the surveyed population. These data indicate that there is no sharp difference between Astana and other cities in the country regarding the incidence of OP; that is, the situation with OP is not aggravated by any regional characteristics. In the distribution of respondents by clusters, it was found that 80.0% of the respondents with CCM were in the "best urban cluster", 93.8% in the "average urban cluster", and 77.4% in the "poor urban cluster". There is a high rate of construction of new buildings in Astana; presumably, new settlers inhabit the outskirts of the city, and it is very difficult to trace the socio-economic differences there. Based on these data, the following conclusions can be made: 1. According to ultrasound densitometry of the calcaneus, the prevalence rate of NCM among the residents of Astana is 83.3% and that of OP is 20.4%, which generally coincides with data elsewhere in the country. 2. The urban population of Astana is at a high degree of risk for low-energy fracture: 46.2% of the population had medium or high risks of fracture, while the nationwide index is 26.7%. 3. Gender, age, and ethnic factors play a significant role in the development of CCM among residents of the Akmola region. According to ultrasound densitometry, women in Astana are more prone to OP (22.4% of respondents) than men (11.8% of respondents).

Keywords: nutrition, osteoporosis, elderly, urban population

Procedia PDF Downloads 458
22548 Bovine Sperm Capacitation Promoters: A Comparison between Serum and Non-Serum Albumin Derived from Fish

Authors: Haris Setiawan, Phongsakorn Chuammitri, Korawan Sringarm, Montira Intanon, Anucha Sathanawongs

Abstract:

Capacitation, a prerequisite for sperm to become competent to penetrate the oocyte, occurs naturally in vivo throughout the female reproductive tract, involving secretory fluid and epithelial cells. One of the crucial compounds in the oviductal fluid that promotes capacitation is albumin, secreted in major concentrations. However, the difficulty of collection and the inconsistency of oviductal fluid composition throughout the estrous cycle have led to its function being replaced by serum-based albumins such as bovine serum albumin (BSA). BSA has been primarily used, and evidenced, for its stabilizing effect in maintaining the acrosome intact during the capacitation process, modulating hyperactivation, and elevating the number of sperm bound to the zona pellucida. Contrary to its benefits, the use of blood-derived products in the culture system is not sustainable and increases the risk of disease transmission, such as Creutzfeldt-Jakob disease (CJD) and bovine spongiform encephalopathy (BSE). Moreover, it has been asserted that this substance is an aeroallergen that produces allergies and respiratory problems. In an effort to identify an alternative sustainable and non-toxic albumin source, the present work evaluated sperm reactions to a capacitation medium containing albumin derived from the flesh of the snakehead fish (Channa striata). Before examining the ability of this non-serum albumin to promote capacitation in bovine sperm, albumin was detected at a level of 25% in the snakehead fish extract using bromocresol purple (BCP). Following SDS-PAGE and densitometric analysis, two major bands at 40 kDa and 47 kDa, comprising 57% and 16% of the total protein loaded, were detected as potential albumin-related bands. Significant differences were observed in all kinematic parameters upon incubation in the capacitation medium. Moreover, consistently higher values were shown for the kinematic parameters related to hyperactivation, such as amplitude of lateral head displacement (ALH), curvilinear velocity (VCL), and linearity (LIN), when sperm were treated with 3 mg/mL of snakehead fish albumin compared with the other treatments. Likewise, substantially higher proportions of intact acrosomes were observed in sperm upon incubation with various concentrations of snakehead fish albumin for 90 minutes, indicating that this level of snakehead fish albumin can be used to replace bovine serum albumin. However, further study is required to purify the albumin from snakehead fish extract for more reliable findings.

Keywords: capacitation promoter, snakehead fish, non-serum albumin, bovine sperm

Procedia PDF Downloads 99
22547 Investigation of the Relationship between Government Expenditure and Country’s Economic Development in the Context of Sustainable Development

Authors: Lina Sinevičienė

Abstract:

Emerging problems in countries’ public finances and social and demographic changes motivate scientific and policy debates on public spending size, structure, and efficiency in order to meet the changing needs of society and business. The concept of sustainable development poses new challenges for scientists and policy-makers in the field of public finance. This paper focuses on the investigation of the relationship between government expenditure and a country’s economic development in the context of sustainable development. The empirical analysis focuses on data from the European Union countries (except Croatia and Luxembourg). The study covers the years 2003–2012, using annual cross-sectional data. Summarizing the research results, it can be stated that governments should pay more attention to the needs that ensure sustainable development in the long run when formulating public expenditure policy, particularly in the field of environmental protection.

Keywords: economic development, economic growth, government expenditure, sustainable development

Procedia PDF Downloads 278
22546 Application of Neutron Stimulated Gamma Spectroscopy for Soil Elemental Analysis and Mapping

Authors: Aleksandr Kavetskiy, Galina Yakubova, Nikolay Sargsyan, Stephen A. Prior, H. Allen Torbert

Abstract:

Determining soil elemental content and distribution (mapping) within a field are key features of modern agricultural practice. While traditional chemical analysis is a time-consuming and labor-intensive multi-step process (e.g., sample collection, transport to the laboratory, physical preparation, and chemical analysis), neutron-gamma soil analysis can be performed in-situ. This analysis is based on the registration of gamma rays emitted from nuclei upon interaction with neutrons. Soil elements such as Si, C, Fe, O, Al, K, and H (moisture) can be assessed with this method. Data received from the analysis can be directly used for creating soil elemental distribution maps (based on ArcGIS software) suitable for agricultural purposes. The neutron-gamma analysis system developed for field application consisted of an MP320 Neutron Generator (Thermo Fisher Scientific, Inc.), 3 sodium iodide gamma detectors (SCIONIX, Inc.) with a total volume of 7 liters, 'split electronics' (XIA, LLC), a power system, and an operational computer. Paired with GPS, this system can be used in the scanning mode to acquire gamma spectra while traversing a field. Using the acquired spectra, soil elemental content can be calculated. These data can be combined with geographical coordinates in a geographical information system (i.e., ArcGIS) to produce elemental distribution maps suitable for agricultural purposes. Special software has been developed that will acquire gamma spectra, process and sort data, calculate soil elemental content, and combine these data with measured geographic coordinates to create soil elemental distribution maps. For example, 5.5 hours was needed to acquire the necessary data for creating a carbon distribution map of an 8.5 ha field. This paper will briefly describe the physics behind the neutron-gamma analysis method, the physical construction of the measurement system, and its main characteristics and modes of operation when conducting field surveys. Soil elemental distribution maps resulting from field surveys will be presented and discussed. Comparison of these maps with maps created on the basis of chemical analysis and soil moisture measurements determined by soil electrical conductivity showed similar results. The maps created by neutron-gamma analysis were reproducible as well. Based on these facts, it can be asserted that neutron-stimulated soil gamma spectroscopy paired with a GPS system is fully applicable for soil elemental agricultural field mapping.
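
A minimal sketch (with assumed field data) of the final mapping step described above: per-point elemental estimates are combined with their GPS coordinates and gridded into a distribution map that can then be exported to ArcGIS.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(7)
easting = rng.uniform(0, 300, 120)                          # GPS coordinates of scan points (m)
northing = rng.uniform(0, 280, 120)
carbon = 1.2 + 0.002 * easting + rng.normal(0, 0.05, 120)   # simulated soil C estimates (%)

# regular grid for the elemental distribution map
gx, gy = np.meshgrid(np.linspace(0, 300, 100), np.linspace(0, 280, 100))
carbon_map = griddata((easting, northing), carbon, (gx, gy), method="cubic")
# carbon_map can now be written to a raster (e.g. with rasterio) and styled in ArcGIS
print(np.nanmin(carbon_map), np.nanmax(carbon_map))
```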

Keywords: ArcGIS mapping, neutron gamma analysis, soil elemental content, soil gamma spectroscopy

Procedia PDF Downloads 123
22545 The Impact of Feuerstein Enhancement of Learning Potential to the Integration of Children from Socially Disadvantaged Backgrounds into Society

Authors: Michal Kozubík, Svetlana Síthová

Abstract:

Aim: The aim of this study is to introduce the method of instrumental enrichment to people who work in the helping professions and to show further possibilities for its use with children from socially disadvantaged backgrounds and their integration into society. Methods: We focused on Feuerstein’s Instrumental Enrichment method, its theoretical grounds, and its practical implementation. We carried out questionnaires and directly observed children from disadvantaged backgrounds in the Partizánske district. Results: We outlined the issues facing children from a disadvantaged social environment and their opportunities for social integration using the method. The findings showed the utility of the Feuerstein method. Conclusions: We conclude that Feuerstein methods are very suitable for children from socially disadvantaged backgrounds and that co-operation between social workers and special educators is important.

Keywords: Feuerstein, inclusion, education, socially disadvantaged background

Procedia PDF Downloads 301
22544 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). Since the likelihood function of this process can be obtained, by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
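
A minimal sketch of simulating bucket-tip times from a Markov-modulated Poisson process, the simplest doubly stochastic Poisson process of the kind described; the two hidden states, their rates, and the generator matrix are illustrative assumptions.

```python
import numpy as np

def simulate_mmpp(rates, Q, t_end, seed=0):
    """Doubly stochastic Poisson process whose rate switches with a hidden
    continuous-time Markov chain: rates[i] is the tip rate in state i,
    Q is the generator matrix of the hidden chain."""
    rng = np.random.default_rng(seed)
    state, t, tips = 0, 0.0, []
    while t < t_end:
        hold = rng.exponential(1.0 / -Q[state, state])   # sojourn in the current state
        t_next = min(t + hold, t_end)
        n = rng.poisson(rates[state] * (t_next - t))     # tips within the sojourn
        tips.extend(np.sort(rng.uniform(t, t_next, n)))
        probs = np.maximum(Q[state], 0.0)                # jump probabilities
        probs[state] = 0.0
        state = rng.choice(len(rates), p=probs / probs.sum())
        t = t_next
    return np.array(tips)

# two hidden states (dry-ish vs. showery), rates in tips per hour, horizon 48 h
tips = simulate_mmpp(rates=[0.2, 6.0], Q=np.array([[-0.1, 0.1], [0.5, -0.5]]), t_end=48.0)
print(len(tips), tips[:5])
```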

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 261
22543 Review of Life-Cycle Analysis Applications on Sustainable Building and Construction Sector as Decision Support Tools

Authors: Liying Li, Han Guo

Abstract:

Considering the environmental issues generated by the building sector through its energy consumption, solid waste generation, water use, land use, and global greenhouse gas (GHG) emissions, this review points to LCA as a decision-support tool that can substantially improve sustainability in the building and construction industry. The comprehensiveness and simplicity of LCA make it one of the most promising decision-support tools for the sustainable design and construction of future buildings. This paper contains a comprehensive review of existing studies related to LCAs, with a focus on their advantages and limitations when applied in the building sector. The aim of this paper is to enhance the understanding of building life-cycle analysis, thus promoting its application for effective, sustainable building design and construction in the future. Comparisons and discussions are carried out between four categories of LCA methods: building material and component combinations (BMCC) vs. the whole process of construction (WPC) LCA, attributional vs. consequential LCA, process-based LCA vs. input-output (I-O) LCA, and traditional vs. hybrid LCA. Classical case studies are presented, which illustrate the effectiveness of LCA as a tool to support the decisions of practitioners in the design and construction of sustainable buildings. (i) BMCC and WPC categories of LCA research tend to overlap with each other, as the majority of WPC LCAs are actually developed based on the bottom-up approach that BMCC LCAs use. (ii) When considering the influence of social and economic factors outside the system proposed by the research, a consequential LCA could provide a more reliable result than an attributional LCA. (iii) I-O LCA is complementary to process-based LCA in order to address the social and economic problems generated by building projects. (iv) Hybrid LCA provides a superior dynamic perspective compared with a traditional LCA, which is criticized for its static view of the changing processes within the building’s life cycle. LCAs are still being developed to overcome their limitations and data shortages (especially data on the developing world), and the unification of LCA methods and data can make the results of building LCA more comparable and consistent across different studies or even countries.

Keywords: decision support tool, life-cycle analysis, LCA tools and data, sustainable building design

Procedia PDF Downloads 103
22542 Educational Debriefing in Prehospital Medicine: A Qualitative Study Exploring Educational Debrief Facilitation and the Effects of Debriefing

Authors: Maria Ahmad, Michael Page, Danë Goodsman

Abstract:

‘Educational’ debriefing – a construct distinct from clinical debriefing – is used following simulated scenarios and is central to learning and development in fields ranging from aviation to emergency medicine. However, little research into educational debriefing in prehospital medicine exists. This qualitative study explored the facilitation and effects of prehospital educational debriefing and identified obstacles to debriefing, using the London’s Air Ambulance Pre-Hospital Care Course (PHCC) as a model. Method: Ethnographic observations of moulages and debriefs were conducted over two consecutive days of the PHCC in October 2019. Detailed contemporaneous field notes were made and analysed thematically. Subsequently, seven one-to-one, semi-structured interviews were conducted with four PHCC debrief facilitators and three course participants to explore their experiences of prehospital educational debriefing. Interview data were manually transcribed and analysed thematically. Results: Four overarching themes were identified: the approach to the facilitation of debriefs, effects of debriefing, facilitator development, and obstacles to debriefing. The unpredictable debriefing environment was seen as both hindering and paradoxically benefitting educational debriefing. Despite using varied debriefing structures, facilitators emphasised similar key debriefing components, including exploring participants’ reasoning and sharing experiences to improve learning and prevent future errors. Debriefing was associated with three principal effects: releasing emotion; learning and improving, particularly participant compound learning as they progressed through scenarios; and the application of learning to clinical practice. Facilitator training and feedback were central to facilitator learning and development. Several obstacles to debriefing were identified, including mismatch of participant and facilitator agendas, performance pressure, and time. Interestingly, when used appropriately in the educational environment, these obstacles may paradoxically enhance learning. Conclusions: Educational debriefing in prehospital medicine is complex. It requires the establishment of a safe learning environment, an understanding of participant agendas, and facilitator experience to maximise participant learning. Aspects unique to prehospital educational debriefing were identified, notably the unpredictable debriefing environment, interdisciplinary working, and the paradoxical benefit of educational obstacles for learning. This research also highlights aspects of educational debriefing not extensively detailed in the literature, such as compound participant learning, display of ‘professional honesty’ by facilitators, and facilitator learning, which require further exploration. Future research should also explore educational debriefing in other prehospital services.

Keywords: debriefing, prehospital medicine, prehospital medical education, pre-hospital care course

Procedia PDF Downloads 198
22541 Underrepresentation of Right Middle Cerebral Infarct: A Statistical Parametric Mapping

Authors: Wi-Sun Ryu, Eun-Kee Bae

Abstract:

Prior studies have shown that patients with right hemispheric stroke are less likely to seek medical attention than those with left hemispheric stroke. However, the underlying mechanism for this phenomenon is unknown. In the present study, we generated lesion probability maps from patients with right and left middle cerebral artery infarcts and statistically compared them. We found that involvement of the precentral gyrus (Brodmann area 44), a language-related area in the left hemisphere, was significantly higher in patients with left hemispheric stroke. This finding suggests that language dysfunction is more noticeable, thereby bringing more patients to hospitals.
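
For illustration, a small sketch (with hypothetical, already-registered binary lesion masks) of how lesion probability maps can be built and compared voxel-wise; the actual statistical parametric mapping pipeline of the study is not reproduced here.

```python
import numpy as np
from scipy.stats import fisher_exact

def lesion_probability_maps(masks_right, masks_left):
    """masks_*: binary lesion masks, shape (n_subjects, x, y, z), in template space.
    Returns both lesion probability maps and a voxel-wise Fisher p-value map."""
    n_r, n_l = len(masks_right), len(masks_left)
    count_r = masks_right.sum(axis=0)
    count_l = masks_left.sum(axis=0)
    pvals = np.ones(count_r.shape)
    for idx in zip(*np.nonzero(count_r + count_l)):      # test only lesioned voxels
        table = [[int(count_r[idx]), n_r - int(count_r[idx])],
                 [int(count_l[idx]), n_l - int(count_l[idx])]]
        pvals[idx] = fisher_exact(table)[1]
    return count_r / n_r, count_l / n_l, pvals

rng = np.random.default_rng(0)
right = rng.random((20, 8, 8, 8)) < 0.10     # toy cohorts of binary lesion masks
left = rng.random((25, 8, 8, 8)) < 0.15
prob_r, prob_l, p = lesion_probability_maps(right, left)
print(prob_l.max(), (p < 0.05).sum())
```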

Keywords: cerebral infarct, brain MRI, statistical parametric mapping, middle cerebral infarct

Procedia PDF Downloads 322
22540 Identification of Soft Faults in Branched Wire Networks by Distributed Reflectometry and Multi-Objective Genetic Algorithm

Authors: Soumaya Sallem, Marc Olivas

Abstract:

This contribution presents a method for detecting, locating, and characterizing soft faults in a complex wired network. The proposed method is based on multi-carrier reflectometry MCTDR (Multi-Carrier Time Domain Reflectometry) combined with a multi-objective genetic algorithm. In order to ensure complete network coverage and eliminate diagnosis ambiguities, the MCTDR test signal is injected at several points on the network, and the data are merged between the different reflectometers (sensors) distributed on the network. An adapted multi-objective genetic algorithm is used to merge the data in order to obtain more accurate fault location and characterization. The performance of the proposed method is evaluated from numerical and experimental results.
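
To illustrate the fusion idea in a toy setting, the sketch below uses a deliberately simplified multi-objective genetic algorithm (non-dominated selection with Gaussian mutation only): each sensor contributes one residual objective, and candidate fault hypotheses (distance, reflection coefficient) are evolved against both objectives at once. The forward model, sensor positions, and parameter values are assumptions for illustration, not the MCTDR processing chain of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(sensor_pos, d, r, v=2e8):
    """Toy echo model seen by a sensor at sensor_pos for a soft fault at
    distance d with reflection coefficient r: (round-trip delay, amplitude)."""
    return np.array([2 * abs(d - sensor_pos) / v,
                     r * np.exp(-abs(d - sensor_pos) / 50.0)])

SENSORS = [0.0, 100.0]                                   # two distributed reflectometers
MEAS = [forward(s, 37.0, 0.05) for s in SENSORS]         # synthetic "measurements"

def objectives(ind):
    # one relative-error residual per sensor; fusion = keeping them as separate objectives
    return np.array([np.linalg.norm((forward(s, *ind) - m) / m)
                     for s, m in zip(SENSORS, MEAS)])

def dominates(f1, f2):
    return np.all(f1 <= f2) and np.any(f1 < f2)

def moga(pop_size=60, gens=100):
    pop = np.c_[rng.uniform(0, 100, pop_size), rng.uniform(0, 0.2, pop_size)]
    def elite_of(pop):
        f = np.array([objectives(p) for p in pop])
        return [i for i in range(len(pop))
                if not any(dominates(f[j], f[i]) for j in range(len(pop)))]
    for _ in range(gens):
        parents = pop[rng.choice(elite_of(pop), pop_size)]
        children = parents + rng.normal(0, [1.0, 0.005], parents.shape)  # mutation
        pop = np.clip(children, [0, 0], [100, 0.2])
    return pop[elite_of(pop)]

print(moga().mean(axis=0))   # should land near the true fault (37 m, 0.05)
```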

Keywords: wired network, reflectometry, network distributed diagnosis, multi-objective genetic algorithm

Procedia PDF Downloads 178