Search results for: field data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30238

28018 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data

Authors: Salam Khalifa, Naveed Ahmed

Abstract:

We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to extract temporal coherence and faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
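The resolution-independent matching step can be illustrated with a minimal sketch (not the authors' implementation): each feature point, described by its 3D position and its color, is matched to its nearest neighbour in the next frame, which also yields per-point motion vectors.

```python
import numpy as np

def match_features(frame_a, frame_b):
    """Match each feature point in frame_a to its nearest neighbour in
    frame_b, using combined 3D-position + RGB-colour distance. Works
    regardless of the two frames' point counts (resolutions)."""
    # Pairwise squared distances via broadcasting: (Na, 1, 6) - (1, Nb, 6)
    diff = frame_a[:, None, :] - frame_b[None, :, :]
    d2 = np.einsum('ijk,ijk->ij', diff, diff)
    matches = d2.argmin(axis=1)                     # index into frame_b
    motion = frame_b[matches, :3] - frame_a[:, :3]  # motion vectors (xyz only)
    return matches, motion

# Two toy frames: columns are x, y, z, r, g, b
a = np.array([[0.0, 0.0, 0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0, 1.0, 0.0]])
b = np.array([[1.1, 0.0, 0.0, 0.0, 1.0, 0.0],   # moved copy of a[1]
              [0.1, 0.0, 0.0, 1.0, 0.0, 0.0]])  # moved copy of a[0]
m, v = match_features(a, b)
```

Colour keeps the match correct even though the nearest point in pure 3D distance would be wrong for some samples.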

Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation

Procedia PDF Downloads 359
28017 Determining Abnormal Behaviors in UAV Robots for Trajectory Control in Teleoperation

Authors: Kiwon Yeom

Abstract:

Change points are abrupt variations in a data sequence. Detection of change points is useful in modeling, analyzing, and predicting time series in application areas such as robotics and teleoperation. In this paper, a change point is defined as a discontinuity in one of the trajectory's derivatives. This paper presents a reliable method for detecting discontinuities within three-dimensional trajectory data. The problem of determining one or more discontinuities is considered for regular and irregular trajectory data from teleoperation. We examine the geometric detection algorithm and illustrate the use of the method on real data examples.
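The paper's definition, a change point as a discontinuity in one of the derivatives, can be sketched with finite differences (an illustrative toy, not the geometric algorithm used in the study):

```python
import numpy as np

def derivative_change_points(traj, threshold):
    """Flag samples where the first derivative of a trajectory jumps
    abruptly, i.e. a discontinuity in velocity (a C1 change point).
    traj: (N, 3) array of positions sampled at regular intervals."""
    vel = np.diff(traj, axis=0)                          # first derivative
    jump = np.linalg.norm(np.diff(vel, axis=0), axis=1)  # change in velocity
    return np.where(jump > threshold)[0] + 1             # index of the corner sample

# Straight line whose y-slope changes abruptly at sample index 4
t = np.arange(10.0)
traj = np.stack([t, np.maximum(0.0, t - 4.0) * 2.0, np.zeros(10)], axis=1)
corners = derivative_change_points(traj, threshold=0.5)
print(corners)
```

Higher-order discontinuities (e.g. in acceleration) follow the same pattern with one more `np.diff`.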

Keywords: change point, discontinuity, teleoperation, abrupt variation

Procedia PDF Downloads 155
28016 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs

Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro

Abstract:

This work presents a statistical methodology for measuring and identifying constructs in latent semantic analysis. The approach combines the strengths of factor analysis for binary data with the interpretability of Item Response Theory. More precisely, we propose first reducing dimensionality by applying Principal Component Analysis to the linguistic data and then producing axes for the groups obtained from a clustering analysis of the semantic data. This approach allows the user to give meaning to the resulting clusters and to uncover the real latent structure present in the data. The methodology is applied to a set of real semantic data, yielding strong results in terms of coherence, speed, and precision.
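The dimensionality-reduction step can be sketched with a minimal PCA via SVD (an illustration of the technique, not the study's exact pipeline):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project data onto its leading principal components.
    Minimal PCA via SVD of the centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S ** 2) / (S ** 2).sum()   # variance ratio per component
    return Xc @ Vt[:n_components].T, explained[:n_components]

rng = np.random.default_rng(0)
# 100 samples in 5 dimensions, with variance concentrated in one direction
X = rng.normal(size=(100, 1)) @ rng.normal(size=(1, 5)) + 0.01 * rng.normal(size=(100, 5))
Z, ratio = pca_reduce(X, 2)
```

The explained-variance ratio is what guides how many axes to keep before clustering.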

Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression

Procedia PDF Downloads 428
28015 Analysis of Production Forecasting in Unconventional Gas Resources Development Using Machine Learning and Data-Driven Approach

Authors: Dongkwon Han, Sangho Kim, Sunil Kwon

Abstract:

Unconventional gas resources have dramatically changed the future energy landscape. Unlike conventional gas resources, the key challenge in unconventional gas is the need for advanced approaches to production forecasting, owing to the uncertainty and complexity of fluid flow. In this study, an artificial neural network (ANN) model that integrates machine learning and a data-driven approach was developed to predict productivity in shale gas. A database of 129 wells from the Eagle Ford shale basin was used for training and testing the ANN model. Input data related to hydraulic fracturing, well completion, and shale gas productivity were selected, and the output is cumulative production. The performance of the ANN using all data sets, clustering, and variables importance (VI) models was compared using the mean absolute percentage error (MAPE). The MAPE values were 44.22% for the ANN using all data sets; 10.08% (cluster 1), 5.26% (cluster 2), and 6.35% (cluster 3) for the clustered models; and 32.23% (ANN VI) and 23.19% (SVM VI) for the variables importance models. The results showed that the pre-trained ANN model provides more accurate results than the ANN model using all data sets.
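The MAPE metric used to compare the models is simply the mean of the absolute relative errors, in percent (toy values below, not the paper's data):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, the comparison metric in the study."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Toy cumulative-production values (arbitrary units)
actual = [100.0, 200.0, 400.0]
predicted = [110.0, 180.0, 400.0]
print(round(mape(actual, predicted), 2))  # → 6.67
```

Note that MAPE is undefined when an actual value is zero, which is why it suits strictly positive quantities such as cumulative production.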

Keywords: unconventional gas, artificial neural network, machine learning, clustering, variables importance

Procedia PDF Downloads 183
28014 Biophysical Analysis of the Interaction of Polymeric Nanoparticles with Biomimetic Models of the Lung Surfactant

Authors: Weiam Daear, Patrick Lai, Elmar Prenner

Abstract:

The human body offers many avenues that could be used for drug delivery. The pulmonary route, in which drugs are delivered through the lungs, presents many advantages that have sparked interest in the field, including (1) direct access to the lungs and the large surface area they provide, and (2) close proximity to the blood circulation. The air-blood barrier of the alveoli is about 500 nm thick. It consists of cells and a monolayer of lipids and a few proteins, called the lung surfactant. This monolayer consists of ~90% lipids and ~10% proteins, produced by the alveolar epithelial cells. The two major lipid classes, phosphatidylcholine (PC) and phosphatidylglycerol (PG) of various saturations and chain lengths, represent 80% of the total lipid component. The major role of the lung surfactant monolayer is to reduce the surface tension experienced during breathing cycles in order to prevent lung collapse. In terms of the pulmonary drug delivery route, drugs pass through various parts of the respiratory system before reaching the alveoli. It is at this location that the lung surfactant functions as the air-blood barrier for drugs. As the field of nanomedicine advances, the use of nanoparticles (NPs) as drug delivery vehicles is becoming very important, owing to the large surface area NPs provide and their potential for specific targeting. Therefore, studying the interaction of NPs with the lung surfactant, and whether they affect its stability, becomes essential. The aim of this research is to develop a biomimetic model of the human lung surfactant, followed by a biophysical analysis of its interaction with polymeric NPs. This biomimetic model will function as a fast initial mode of testing for whether NPs affect the stability of the human lung surfactant. The model developed thus far is an 8-component lipid system that contains the major PC and PG lipids.
Recently, custom-made 16:0/16:1 PC and PG lipids were added to the model system. In the human lung surfactant, these lipids constitute 16% of the total lipid component. To the authors' knowledge, little monolayer data exist on the biophysical analysis of the 16:0/16:1 lipids; therefore, further analysis is discussed here. Biophysical techniques such as the Langmuir trough are used for stability measurements, monitoring changes in the monolayer's surface pressure upon NP interaction. Furthermore, Brewster angle microscopy (BAM) is employed to visualize changes in the lateral domain organization. Results show preferential interactions of NPs with different lipid groups, which also depend on monolayer fluidity. Furthermore, results show that film stability upon compression is unaffected, but there are significant changes in the lateral domain organization of the lung surfactant upon NP addition. This research is significant in the field of pulmonary drug delivery. It has been shown that NPs within a certain size range are safe for the pulmonary route, but little is known about the mode of interaction of these polymeric NPs. Moreover, this work will provide additional information about the nanotoxicology of the NPs tested.

Keywords: Brewster angle microscopy, lipids, lung surfactant, nanoparticles

Procedia PDF Downloads 170
28013 Introducing Data-Driven Learning into Chinese Higher Education English for Academic Purposes Writing Instructional Settings

Authors: Jingwen Ou

Abstract:

Writing for academic purposes in a second or foreign language is one of the most important and most demanding skills to be mastered by non-native speakers. Traditionally, EAP writing instruction at the tertiary level encompasses the teaching of academic genre knowledge, more specifically, disciplinary writing conventions, rhetorical functions, and specific linguistic features. However, one of the main sources of challenge in English academic writing for L2 students at the tertiary level remains proficiency in academic discourse, especially vocabulary, academic register, and organization. Data-driven learning (DDL) is defined as “a pedagogical approach featuring direct learner engagement with corpus data”. Over the past two decades, the DDL approach has become increasingly popular in EAP writing teaching. This combination has not only transformed traditional pedagogy, aided by published DDL guidebooks for classroom use, but also triggered global research on corpus use in EAP classrooms. This study delineates a systematic review of research at the intersection of DDL and EAP writing instruction by conducting a systematic literature review of both indirect and direct DDL practice in EAP writing instructional settings in China. Furthermore, the review synthesizes significant findings from prior investigations of Chinese university students’ perception of DDL and its subsequent impact on their academic writing performance following corpus-based training. Research papers were selected from Scopus-indexed journals and core journals from two main Chinese academic databases (CNKI and Wanfang), published in both English and Chinese over the last ten years, based on keyword searches.
Results indicated an insufficiency of empirical DDL research despite a noticeable upward trend in corpus research on discourse analysis and indirect corpus applications for material design by language teachers. Research on the direct use of corpora and corpus tools in DDL, particularly in combination with genre-based EAP teaching, remains a relatively small fraction of the whole body of research in Chinese higher education settings. Such scarcity is highly related to the prevailing absence of systematic training in English academic writing registers within most Chinese universities' EAP syllabi due to the Chinese English Medium Instruction policy, where only English major students are mandated to submit English dissertations. Findings also revealed that Chinese learners still held mixed attitudes towards corpus tools influenced by learner differences, limited access to language corpora, and insufficient pre-training on corpus theoretical concepts, despite their improvements in final academic writing performance.
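Direct corpus use in DDL typically means learners querying concordances. A minimal keyword-in-context (KWIC) sketch, not any specific tool from the reviewed studies, looks like:

```python
def concordance(corpus, keyword, width=3):
    """Return keyword-in-context (KWIC) lines: the keyword with up to
    `width` words of context on each side, as a DDL concordancer would."""
    tokens = corpus.lower().split()
    hits = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            hits.append(f"{left} [{tok}] {right}")
    return hits

text = ("the results suggest that the method is robust "
        "and the results confirm earlier findings")
for line in concordance(text, "results"):
    print(line)
```

Scanning such aligned lines is how learners induce collocation and register patterns from authentic corpus data.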

Keywords: corpus linguistics, data-driven learning, EAP, tertiary education in China

Procedia PDF Downloads 39
28012 The Motivation of Israeli Arab Students to Study Education and Society at Multicultural College

Authors: Yael Cohen Azaria, Sara Zamir

Abstract:

This study examined what motivated Israeli Arab students to choose to study for a degree in education and society, and how this academic choice influenced them during their studies. The study follows the qualitative paradigm of data collection and analysis in a case study of a homogeneous group of Arab students at a Jewish multicultural academic institution. Thirty-three students underwent semi-structured in-depth interviews. Findings show that the choice stemmed from a desire to lead social change within their own society, to imitate an educational role model, and to realize a dream of higher education. Among the female students, the field also suits the role of women in Arab society. The interviewees reported that their studies made them feel more openness towards others and those who are different; they felt pride and self-confidence in their abilities, and the women mentioned that they felt empowered.

Keywords: education, higher education, Israeli Arabs, minorities

Procedia PDF Downloads 363
28011 Evaluation of Dynamic Log Files for Different Dose Rates in IMRT Plans

Authors: Saad Bin Saeed, Fayzan Ahmed, Shahbaz Ahmed, Amjad Hussain

Abstract:

The aim of this study is to evaluate dynamic log files (dynalogs) at different dose rates using dose-volume histograms (DVH) and to assess their use as a quality assurance (QA) procedure for IMRT. Seven patients with stage one head and neck cancer and similar organs at risk (OARs) were selected randomly. For each patient, reference plans with dose rates of 300 and 600 MU/min and a prescribed dose of 50 Gy in 25 fractions were made. Dynalogs produced by the delivery of the reference plans were processed by an in-house MATLAB program, which produces new field files containing the actual positions of the multi-leaf collimators (MLCs) instead of the planned positions in the reference plans. Copies of the reference plans were used to import the new field files generated by the MATLAB program and were renamed Dyn.plans. After dose calculation of the Dyn.plans for the different dose rates, DVH and multiple linear regression tools were used to evaluate the reference and Dyn.plans. The results indicate good correlation between the different dose rate plans. The maximum dose differences for the PTV and OARs were found to be less than 5% and 9%, respectively. The study indicates the potential of dynalogs to be used for patient-specific QA of IMRT at different dose rates.
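The core of the dynalog processing, comparing planned with delivered MLC leaf positions, can be sketched as follows (hypothetical numbers; the study's program is written in MATLAB and additionally regenerates field files):

```python
import numpy as np

def leaf_deviation_stats(planned, actual):
    """Summarise planned-vs-delivered MLC leaf positions from a dynamic
    log: mean and maximum absolute deviation across all samples.
    planned/actual: (control_points, leaves) arrays of positions in mm."""
    dev = np.abs(actual - planned)
    return dev.mean(), dev.max()

# Hypothetical log excerpt: 4 control points, 3 leaves (positions in mm)
planned = np.array([[10.0, 12.0, 14.0],
                    [11.0, 13.0, 15.0],
                    [12.0, 14.0, 16.0],
                    [13.0, 15.0, 17.0]])
actual = planned + np.array([[0.1, -0.2, 0.0],
                             [0.0,  0.1, -0.1],
                             [0.2,  0.0,  0.1],
                             [-0.1, 0.1,  0.0]])
mean_dev, max_dev = leaf_deviation_stats(planned, actual)
```

In the study these delivered positions are written back into field files so the treatment planning system can recompute dose and DVHs from what was actually delivered.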

Keywords: IMRT, dynalogs, dose rate, DVH

Procedia PDF Downloads 523
28010 The Use of Support Vector Machines and Back Propagation Neural Networks for the Prediction of Daily Tidal Levels along the Jeddah Coast, Saudi Arabia

Authors: E. A. Mlybari, M. S. Elbisy, A. H. Alshahri, O. M. Albarakati

Abstract:

Sea level rise threatens to increase the impact of future storms and hurricanes on coastal communities. Accurate prediction of sea level change is an important task for planning construction and human activities in coastal and oceanic areas. In this study, a support vector machine (SVM) approach is proposed to predict daily tidal levels along the Jeddah Coast, Saudi Arabia. The optimal parameter values of the kernel function are determined using a genetic algorithm. The SVM results are compared with the field data and with a back-propagation neural network (BPNN). Among the models, the SVM is superior to the BPNN and has better generalization performance.
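The idea of tuning kernel parameters with a genetic algorithm can be sketched on synthetic tidal data. This illustration substitutes kernel ridge regression for the SVM (both share the RBF kernel width being tuned), and the GA below is deliberately minimal:

```python
import numpy as np

rng = np.random.default_rng(42)

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(x_tr, y_tr, x_te, gamma, lam=1e-3):
    """Kernel ridge regression: a stand-in for SVM regression here,
    sharing the RBF kernel width gamma that the GA tunes."""
    K = rbf_kernel(x_tr, x_tr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_tr)), y_tr)
    return rbf_kernel(x_te, x_tr, gamma) @ alpha

# Synthetic daily tide: two harmonic constituents
t = np.arange(120.0)[:, None]
y = np.sin(2 * np.pi * t[:, 0] / 14.0) + 0.5 * np.sin(2 * np.pi * t[:, 0] / 29.0)
held_out = np.arange(120) % 4 == 0           # validation: every 4th day
x_tr, y_tr = t[~held_out], y[~held_out]
x_va, y_va = t[held_out], y[held_out]

def fitness(g):
    return -np.mean((fit_predict(x_tr, y_tr, x_va, g) - y_va) ** 2)

# Minimal genetic algorithm over the kernel width gamma (log scale)
pop = 10 ** rng.uniform(-4, 0, size=16)
for _ in range(20):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-8:]]                 # selection: best half
    children = parents * 10 ** rng.normal(0, 0.1, size=8)  # log-scale mutation
    pop = np.concatenate([parents, children])
best_gamma = max(pop, key=fitness)
```

A production GA would add crossover and restarts; the point is only that kernel width is searched by evolving a population against held-out error rather than fixed by hand.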

Keywords: tides, prediction, support vector machines, genetic algorithm, back-propagation neural network, risk, hazards

Procedia PDF Downloads 453
28009 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change is transforming all aspects of society. While the expansion of renewable energies proceeds, general studies on the potential of demand-side management have not convinced industry to incorporate smart grid considerations into its operational business. In this article, a procedure model for case-specific, data-driven decision support in industrial energy management, based on a holistic data analytics approach, is presented. The model is executed on the example of a strategic decision problem: integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of a data model, analysis, simulation, and optimization step. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.
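One building block of such decision support, shifting flexible load into cheap spot-market hours, can be sketched as follows (hypothetical prices, not from the study):

```python
def schedule_flexible_load(prices, hours_needed):
    """Pick the cheapest hours of the day for a shiftable production task,
    the core idea behind moving from a flat tariff to spot prices."""
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    chosen = sorted(ranked[:hours_needed])
    cost = sum(prices[h] for h in chosen)
    return chosen, cost

# Hypothetical day-ahead spot prices (EUR/MWh) for 8 hours
prices = [52.0, 48.0, 30.0, 25.0, 41.0, 60.0, 35.0, 55.0]
hours, cost = schedule_flexible_load(prices, hours_needed=3)
# a naive schedule in the first 3 hours would cost 130.0 instead
```

Real industrial scheduling adds machine constraints and uncertainty, which is where the simulation and optimization steps of the procedure model come in.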

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 426
28008 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators representing the demand on the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions in the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because the database contains complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study provides information on 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and the interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration, to identify groups of similar countries, and to construct separate regression models for them.
Therefore, the original time series were used as the objects of clustering. The k-medoids partitioning algorithm was used, with the sampled objects themselves serving as the centers of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The obtained predicted values of the developed models have a relatively low level of error and can be used to make decisions on staffing the hospital with medical personnel. The research displays strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
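The clustering step can be illustrated with a bare-bones k-medoids, whose defining property, that cluster centers are sampled objects rather than computed centroids, is exactly what makes it suitable for time series (toy data below; a real implementation adds random restarts):

```python
import numpy as np

def k_medoids(D, k, n_iter=50):
    """Plain k-medoids (PAM-style alternation) on a precomputed distance
    matrix D. Deterministic first-k initialisation for reproducibility."""
    medoids = np.arange(k)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)   # assign to nearest medoid
        new = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):                         # medoid = member minimising
                new[c] = members[np.argmin(          # total intra-cluster distance
                    D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, labels

# Six toy "time series": two well-separated groups
series = np.array([[0.0, 0.0, 0.0, 0.0], [0.1, 0.0, 0.1, 0.0], [0.0, 0.1, 0.0, 0.1],
                   [5.0, 5.0, 5.0, 5.0], [5.1, 5.0, 5.1, 5.0], [5.0, 5.1, 5.0, 5.1]])
D = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=2)
medoids, labels = k_medoids(D, k=2)
```

Because the algorithm only consumes a distance matrix, any time-series dissimilarity (Euclidean, DTW, correlation-based) can be plugged in without changing the code.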

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 133
28007 Dissimilarity-Based Coloring for Symbolic and Multivariate Data Visualization

Authors: K. Umbleja, M. Ichino, H. Yaguchi

Abstract:

In this paper, we propose a coloring method for multivariate data visualization using parallel coordinates, based on dissimilarity and tree structure information gathered during hierarchical clustering. The proposed method is an extension of proximity-based coloring, which suffers from a few undesired side effects when the hierarchical tree is unbalanced. We describe the algorithm for assigning colors based on dissimilarity information, show the application of the proposed method on three commonly used datasets, and compare the results with proximity-based coloring. We found the proposed method to be especially beneficial for symbolic data visualization, where many individual objects have already been aggregated into a single symbolic object.
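The core idea, spacing colors along the dendrogram leaf order in proportion to dissimilarity rather than uniformly, can be sketched as follows (an illustration of the principle, not the authors' exact algorithm):

```python
import colorsys

def dissimilarity_colors(leaf_order, dissim):
    """Assign a hue to each object in dendrogram leaf order, spacing hues
    by the dissimilarity between neighbours instead of uniformly, so
    similar objects get similar colours and big gaps get big hue jumps.
    dissim[i]: dissimilarity between leaf_order[i] and leaf_order[i+1]."""
    total = sum(dissim)
    hue, colors = 0.0, {}
    for i, obj in enumerate(leaf_order):
        # scale hue into [0, 0.8] so the colour wheel does not wrap around
        colors[obj] = colorsys.hsv_to_rgb(0.8 * hue, 1.0, 1.0)
        if i < len(dissim):
            hue += dissim[i] / total
    return colors

# Four objects: A and B nearly identical, D far from the rest
colors = dissimilarity_colors(["A", "B", "C", "D"], dissim=[0.1, 0.4, 2.0])
```

Under uniform (proximity-based) spacing A and B would be a quarter wheel apart; here they receive almost the same hue while D stands out.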

Keywords: data visualization, dissimilarity-based coloring, proximity-based coloring, symbolic data

Procedia PDF Downloads 159
28006 Commercial Law Between Custom and Islamic Law

Authors: Mohamed Zakareia Ghazy Aly Belal

Abstract:

Commercial law is the set of legal rules that apply to business and regulate trade. It governs only certain relations, those that arise from carrying out commercial business, and regulates the activity of a specific class, the class of merchants. Like other branches of law, commercial law has characteristics that distinguish it from other laws, and it is derived from various sources: the objective or material source, the historical source, the official source, and the interpretative source. Here we confine ourselves to the official and interpretative sources: what are they, and what weight do they carry in commercial disputes? The first topic addresses the characteristics of commercial law. Commercial law has become indispensable to the world of trade and economics, given the reasons for which its rules were established. Whereas the civil environment is marked by stability, the commercial environment is marked by movement and speed, in addition to trust and credit. It is the characteristics of speed, trust, and credit that justify the existence of commercial law. Commercial transactions are fast, while civil transactions are slow and stable. A person concludes civil transactions only a few times in his life, and before any civil act he takes time for reflection and scrutiny: a person who wishes to marry must reflect and investigate, and a person who wants to acquire a house to live in with his family must search, investigate, and discuss the price before concluding a purchase contract.
In the commercial field, transactions take place very quickly because the time factor plays an important role in concluding deals and achieving profits: delay in concluding a specific deal could cause the merchant a loss, since commercial law is tied to fluctuations of the economy and the market. The merchant may also conclude more than one deal in a single short period, owing to commercial law's freedom from the formalities and procedures that hinder commercial transactions.

Keywords: law, commercial law, business, commercial field

Procedia PDF Downloads 58
28005 The Impact of Data Science on Geography: A Review

Authors: Roberto Machado

Abstract:

We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.

Keywords: data science, geography, systematic review, optimization algorithms, supervised learning

Procedia PDF Downloads 0
28004 Developing Structured Sizing Systems for Manufacturing Ready-Made Garments of Indian Females Using Decision Tree-Based Data Mining

Authors: Hina Kausher, Sangita Srivastava

Abstract:

In India, there is a lack of a standard, systematic sizing approach for producing ready-made garments. Garment manufacturing companies use their own size tables, created by modifying international sizing charts for ready-made garments. The purpose of this study is to tabulate anthropometric data covering the variety of figure proportions in both height and girth. Anthropometric data on 3,000 females aged 16 to 80 years were collected in a survey across several states of India to produce a sizing system suitable for clothing manufacture and retailing. These data are used for the statistical analysis of body measurements, the formulation of sizing systems, and body measurement tables. The factor analysis technique is used to filter the control body dimensions from a large number of variables. Decision tree-based data mining is used to cluster the data. A standard, structured sizing system can facilitate pattern grading and garment production. Moreover, it can improve buying ratios and size allocations to retail segments.
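The tabulation step, grouping subjects into height and girth classes to form the skeleton of a structured size chart, can be sketched as follows (illustrative bin edges and measurements, not the survey data; the study itself derives the splits via factor analysis and decision trees):

```python
def build_size_chart(measurements, height_bins, girth_bins):
    """Cross-tabulate subjects into (height class, girth class) cells,
    the skeleton of a structured sizing system."""
    def bin_index(value, edges):
        for i, edge in enumerate(edges):
            if value < edge:
                return i
        return len(edges)

    chart = {}
    for height, girth in measurements:
        cell = (bin_index(height, height_bins), bin_index(girth, girth_bins))
        chart[cell] = chart.get(cell, 0) + 1
    return chart

# Hypothetical (height cm, bust girth cm) pairs
sample = [(150, 82), (155, 86), (162, 90), (168, 96), (158, 84), (171, 101)]
chart = build_size_chart(sample, height_bins=[155, 165], girth_bins=[85, 95])
```

The cell counts show which size combinations actually occur in the population, which is what drives size allocation to retail segments.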

Keywords: anthropometric data, data mining, decision tree, garments manufacturing, sizing systems, ready-made garments

Procedia PDF Downloads 124
28003 Room Temperature Sensitive Broadband Terahertz Photo Response Using Platinum Telluride Based Devices

Authors: Alka Jakhar, Harmanpreet Kaur Sandhu, Samaresh Das

Abstract:

Terahertz (THz) technology-based devices are advancing at a remarkable rate on account of their wide range of applications in imaging, security, communication, and spectroscopy. The available room-temperature THz detectors, including Golay cells, pyroelectric detectors, field-effect transistors, and photoconductive antennas, have limitations such as narrow-band response, slow response speed, transit-time limits, and complex fabrication processes. There is an urgent demand to explore new materials and device structures to accomplish efficient THz detection systems. Recently, TMDs, including topological semimetals and topological insulators such as PtSe₂, MoTe₂, WSe₂, and PtTe₂, have provided novel opportunities for photonic and optical devices. The peculiar properties of these materials, such as the Dirac cone, the presence of fermions, nonlinear optical response, high conductivity, and ambient stability, make them promising for the development of THz devices. Here, platinum telluride (PtTe₂) based devices are demonstrated for THz detection in the frequency range of 0.1-1 THz. The PtTe₂ is synthesized by direct tellurization of a sputtered platinum film on a high-resistivity silicon substrate using the chemical vapor deposition (CVD) method. Raman, XRD, and XPS spectra confirm the formation of the thin PtTe₂ film. The PtTe₂ channel length is 5 µm, and it is connected to a bow-tie antenna for strong THz electric field confinement in the channel. The characterization of the devices has been carried out over a wide frequency range from 0.1 to 1 THz. The induced THz photocurrent is measured using a lock-in amplifier after a preamplifier. A maximum responsivity of up to 1 A/W is achieved in self-biased mode, and it increases further under an applied bias voltage. This photoresponse to low-energy THz photons is mainly due to the photogalvanic effect in PtTe₂.
The DC current is induced along the PtTe₂ channel, which is directly proportional to the amplitude of the incident THz electric field. Thus, these new topological semimetal materials provide new pathways for sensitive detection and sensing applications in the THz domain.

Keywords: terahertz, detector, responsivity, topological-semimetals

Procedia PDF Downloads 150
28002 Sustainability Assessment Tool for the Selection of Optimal Site Remediation Technologies for Contaminated Gasoline Sites

Authors: Connor Dunlop, Bassim Abbassi, Richard G. Zytner

Abstract:

Life cycle assessment (LCA) is a powerful tool established by the International Organization for Standardization (ISO) that can be used to assess the environmental impacts of a product or process from cradle to grave. Many studies utilize the LCA methodology within the site remediation field to compare various decontamination methods, including bioremediation, soil vapor extraction or excavation, and off-site disposal. However, to the authors' best knowledge, limited information is available in the literature on a sustainability tool that could be used to help with the selection of the optimal remediation technology. Such a tool, based on the LCA methodology, would consider site-specific environmental, economic, and social impacts. Accordingly, this project was undertaken to develop a tool to assist with the selection of the optimal sustainable technology. Developing a proper tool requires a large amount of data, so data was collected from previous LCA studies of site remediation technologies. This step identified knowledge gaps and limitations within project data. Next, utilizing the data obtained from the literature review and other organizations, an extensive LCA study is being completed following the ISO 14040 requirements. The initial technologies being compared are bioremediation, excavation with off-site disposal, and a no-remediation option for a generic gasoline-contaminated site. The LCA study is carried out using the modelling software SimaPro. A sensitivity analysis of the LCA results will also be incorporated to evaluate their impact on the overall results. Finally, the economic and social impacts associated with each option will be reviewed to understand how they fluctuate at different sites. All the results will then be summarized, and an interactive Excel-based tool will be developed to help select the best sustainable site remediation technology.
Preliminary LCA results show improved sustainability for each decontamination technology compared to the no-remediation option for a gasoline-contaminated site. Sensitivity analyses are now being completed on site parameters, such as soil type and transportation distance, to determine how the environmental impacts vary at other contaminated gasoline locations. Additionally, the social improvements and overall economic costs associated with each technology are being reviewed. Utilizing these results, the sustainability tool created to assist in the selection of the overall best option will be refined.

Keywords: life cycle assessment, site remediation, sustainability tool, contaminated sites

Procedia PDF Downloads 51
28001 Muscle: The Tactile Texture Designed for the Blind

Authors: Chantana Insra

Abstract:

The research objective focuses on creating a prototype tactile-texture medium of the muscles for educational institutes, to help visually impaired students learn massage through extra learning materials beyond the ordinary curriculum. This medium is designed as an extra learning material. The population in this study was 30 blind students between the 4th and 6th grades who were able to read Braille. The research was conducted during the second semester of 2012 at The Bangkok School for the Blind. The population was chosen by purposive sampling. The methodology includes collecting data related to visually impaired people, the production of tactile texture media, human anatomy, and Thai traditional massage from literature reviews and field studies. This information was used for analyzing and designing 14 tactile texture pictures, which were presented to experts to evaluate and test the media.

Keywords: blind, tactile texture, muscle, visual arts and design

Procedia PDF Downloads 264
28000 Climate Change and Tourism: A Scientometric Analysis Using Citespace

Authors: Yan Fang, Jie Yin, Bihu Wu

Abstract:

The interaction between climate change and tourism is one of the most promising research areas of recent decades. In this paper, a scientometric analysis of 976 academic publications between 1990 and 2015 related to climate change and tourism is presented in order to characterize the intellectual landscape by identifying and visualizing the evolution of the collaboration network, the co-citation network, and emerging trends of citation burst and keyword co-occurrence. The results show that the number of publications in this field has increased rapidly and it has become an interdisciplinary and multidisciplinary topic. The research areas are dominated by Australia, USA, Canada, New Zealand, and European countries, which have the most productive authors and institutions. The hot topics of climate change and tourism research in recent years are further identified, including the consequences of climate change for tourism, necessary adaptations, the vulnerability of the tourism industry, tourist behaviour and demand in response to climate change, and emission reductions in the tourism sector. The work includes an in-depth analysis of a major forum of climate change and tourism to help readers to better understand global trends in this field in the past 25 years.

Keywords: climate change, tourism, scientometrics, CiteSpace

Procedia PDF Downloads 400
27999 Numerical Simulation of Convective Flow of Nanofluids with an Oriented Magnetic Field in a Half Circular-Annulus

Authors: M. J. Uddin, M. M. Rahman

Abstract:

The unsteady convective heat transfer flow of nanofluids in a half circular-annulus enclosure is investigated numerically using a nonhomogeneous dynamic model. The round upper wall of the enclosure is maintained at a constant low temperature, whereas the bottom wall is heated by three different thermal conditions. The enclosure is permeated by a uniform magnetic field of variable orientation. The Brownian motion and thermophoresis of the nanoparticles are taken into account in the model construction. The governing nonlinear momentum, energy, and concentration equations are solved numerically using the Galerkin weighted residual finite element method. To identify the best performer, the average Nusselt number is computed for different types of nanofluids. The heat transfer rate is also reported for different flow parameters, positions of the annulus, thicknesses of the half circular-annulus, and thermal conditions.
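The average Nusselt number used to compare nanofluids is typically obtained by integrating the local Nusselt number along the heated wall. The sketch below shows that averaging step with a trapezoidal rule; the sample profile is synthetic, not data from the study.

```python
# Hedged sketch: averaging a local Nusselt number profile along a heated
# wall. The positions and local Nu values below are synthetic examples.

def average_nusselt(s, nu_local):
    """Trapezoidal average of the local Nusselt number over wall positions s."""
    length = s[-1] - s[0]
    integral = 0.0
    for i in range(len(s) - 1):
        integral += 0.5 * (nu_local[i] + nu_local[i + 1]) * (s[i + 1] - s[i])
    return integral / length

s = [0.0, 0.25, 0.5, 0.75, 1.0]   # normalized position along the bottom wall
nu = [4.0, 5.5, 6.0, 5.5, 4.0]    # synthetic local Nusselt values

print(average_nusselt(s, nu))
```

In a finite element solution, the local Nusselt values would be evaluated from the temperature gradient at the wall nodes rather than supplied by hand.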

Keywords: nanofluid, convection, semicircular-annulus, nonhomogeneous dynamic model, finite element method

Procedia PDF Downloads 210
27998 A Bibliometric Analysis of Research on E-learning in Physics Education: Trends, Patterns, and Future Directions

Authors: Siti Nurjanah, Supahar

Abstract:

E-learning has become an increasingly popular mode of instruction, particularly in physics education, where it offers opportunities for interactive and engaging learning experiences. This research analyzes the trends of research investigating e-learning in physics education. Data were extracted from the Scopus database using the keywords "physics" and "e-learning". For the 380 articles matching the search criteria, a trend analysis was carried out in RStudio using the biblioshiny package and the VOSviewer software. The analysis showed that publications on this topic increased significantly from 2014 to 2021, with output dominated by researchers from the United States. The main venue publishing articles on this topic is the Proceedings of the Frontiers in Education Conference (FIE). The most widely cited articles generally focus on the effectiveness of Moodle for physics learning. Overall, this research provides an in-depth understanding of the trends and key findings of research related to e-learning in physics.
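At the core of tools like biblioshiny and VOSviewer is keyword co-occurrence counting: every pair of keywords appearing on the same record increments an edge weight. A minimal sketch of that computation, on a toy record list rather than the Scopus export, could look like this:

```python
# Toy example of keyword co-occurrence counting; the records are invented.
from collections import Counter
from itertools import combinations

records = [
    {"physics", "e-learning", "moodle"},
    {"physics", "e-learning", "simulation"},
    {"e-learning", "moodle"},
]

cooccurrence = Counter()
for keywords in records:
    # Sort so each unordered pair is counted under one canonical key.
    for pair in combinations(sorted(keywords), 2):
        cooccurrence[pair] += 1

print(cooccurrence.most_common(3))
```

The resulting weighted pairs are exactly what a co-occurrence network visualizer lays out as nodes and edges.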

Keywords: bibliometric analysis, physics education, biblioshiny, e-learning

Procedia PDF Downloads 30
27997 Communication Tools Used in Teaching and Their Effects: An Empirical Study on the T. C. Selcuk University Samples

Authors: Sedat Simsek, Tugay Arat

Abstract:

Today's concept of communication, which underwent a revolution with Gutenberg's printing press, knows no boundaries thanks to advanced communication devices and the internet. Computers, first produced for military purposes, can now be put to use in many areas, from medicine to the social sciences and from mathematics to education. The use of these developing technologies in education has created a great change of vision in both teaching and learning. Materials that can be considered basic communication resources in traditional education have begun to lose their significance, and technologies such as the internet, computers, smart boards, projection devices, and mobile phones have begun to replace them. Meanwhile, the programs and applications running on these technologies have also developed. University students use virtual books instead of traditional printed books, cell phones instead of notebooks, and the internet and virtual databases instead of the library for research. They even submit their homework through interactive methods rather than on printed materials. These productivity-increasing technologies have brought a new dimension to the traditional education system. The aim of this study is to determine the influence of these technologies on students' learning processes and to find out whether there are similarities and differences arising from the faculty in which students are educated. In addition, it aims to determine the level of ICT usage among students studying at the university level. In this context, the advantages and conveniences of the technologies used by students are also scrutinized. Data were collected through surveys and analyzed using the SPSS 16 statistical program with the appropriate tests.

Keywords: education, communication technologies, role of technology, teaching

Procedia PDF Downloads 292
27996 Reliability and Availability Analysis of Satellite Data Reception System using Reliability Modeling

Authors: Ch. Sridevi, S. P. Shailender Kumar, B. Gurudayal, A. Chalapathi Rao, K. Koteswara Rao, P. Srinivasulu

Abstract:

System reliability and availability evaluation play a crucial role in ensuring the seamless operation of complex satellite data reception systems with consistent performance over long periods. This paper presents a novel approach using a case study on one of the antenna systems at a satellite data reception ground station in India. The methodology involves analyzing the system's components, their failure rates, and the system architecture; generating a logical reliability block diagram model; and estimating the system's reliability from component-level mean time between failures (MTBF), assuming an exponential distribution, to derive a baseline estimate. The model is then validated against system-level field failure data collected from the operational satellite data reception systems, including the failures that occurred, failure times, failure criticality, and repair times, using statistical techniques such as median rank, regression, and Weibull analysis to extract meaningful insights on failure patterns and the practical reliability of the system, and to assess the accuracy of the developed reliability model. The study focused mainly on identifying critical units within the system that are prone to failures and have a significant impact on overall performance, and produced a reliability model of the identified critical unit. This model takes into account the interdependencies among system components and their impact on overall system reliability, provides valuable insights into the improvement or degradation of the system over time, and will be a vital input for optimized designs in future development. It also provides a plug-and-play framework for understanding the effect on system performance of any upgrades or new unit designs.
It helps in effective planning and in formulating contingency plans to address potential system failures, ensuring continuity of operations. Furthermore, to instill confidence in system users, the duration for which the system can operate continuously with the desired 3-sigma reliability level was estimated, which turned out to be a vital input to the maintenance plan. System and station availability were also assessed under clash and non-clash scenarios to determine overall system performance and potential bottlenecks. Overall, this paper establishes a comprehensive methodology for the reliability and availability analysis of complex satellite data reception systems. The results facilitate effective contingency planning, give users confidence in system performance, and enable decision-makers to make informed choices about system maintenance, upgrades, and replacements. The approach also aids in identifying critical units, assessing system availability in various scenarios, minimizing downtime, and optimizing resource allocation.
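The baseline estimate described above (component MTBFs, exponential distribution, series reliability block diagram) can be sketched in a few lines. The component names and MTBF values below are hypothetical, not figures from the ground station study.

```python
# Hedged sketch of a baseline series-system reliability estimate under an
# exponential failure model, R_i(t) = exp(-t / MTBF_i). Component names
# and MTBF values are hypothetical examples.
import math

def series_reliability(mtbf_hours, t_hours):
    """Reliability of a series system at time t: the product of the
    component reliabilities exp(-t / MTBF_i)."""
    r = 1.0
    for mtbf in mtbf_hours.values():
        r *= math.exp(-t_hours / mtbf)
    return r

components = {           # hypothetical MTBFs, in hours
    "antenna_servo": 50_000,
    "lna": 80_000,
    "demodulator": 30_000,
}

# One year of continuous operation (8760 h).
print(series_reliability(components, 8760))

# In a series configuration, the unit with the lowest MTBF dominates the
# product and is the natural candidate "critical unit".
weakest = min(components, key=components.get)
print(weakest)
```

The validation step in the paper would then compare such model predictions against the Weibull fits of the collected field failure data.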

Keywords: exponential distribution, reliability modeling, reliability block diagram, satellite data reception system, system availability, weibull analysis

Procedia PDF Downloads 73
27995 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems, particularly through increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. This paper presents one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept. The complete ontology development process is described, starting from input data acquisition, through the definition of ontology concepts, to the population of those concepts. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
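The core / extension / population pipeline can be pictured schematically: a generic class taxonomy is extended with domain classes and then populated with instances. The sketch below uses plain Python dictionaries in place of a real ontology language; all class and instance names are invented for illustration.

```python
# Schematic sketch of the core-ontology / extension / population steps.
# Names are invented; a real system would use an ontology language (e.g. OWL).

core = {"Facility": None, "Zone": "Facility", "Device": "Facility"}  # class -> superclass

# Extension step: airport-specific classes hang off the core taxonomy.
airport_ext = dict(core, **{"Terminal": "Zone", "HVACUnit": "Device"})

# Population step: concrete instances typed against the extended taxonomy.
instances = [("T1", "Terminal"), ("HVAC-07", "HVACUnit")]

def ancestors(cls, taxonomy):
    """Walk the superclass chain of a class, mirroring ontology subsumption."""
    chain = []
    while taxonomy.get(cls):
        cls = taxonomy[cls]
        chain.append(cls)
    return chain

print(ancestors("HVACUnit", airport_ext))
```

The subsumption chain is what lets an energy management system query for, say, all "Device" instances and transparently pick up airport-specific subclasses.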

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 435
27994 Modern Well Logs Technology to Improve Geological Model for Libyan Deep Sand Stone Reservoir

Authors: Tarek S. Duzan, Fisal Ben Ammer, Mohamed Sula

Abstract:

In some places within the Sirt Basin, Libya, it has been noticed that seismic data below the pre-Upper Cretaceous unconformity (PUK) cannot resolve the large-scale structural features and is unable to fully determine reservoir delineation. Seismic artifacts (multiples) are observed in the reservoir zone (Nubian Formation) below the PUK, which complicates seismic interpretation. The nature of the unconformity and the structures below it are still ambiguous and not fully understood, which leaves a significant gap in characterizing the geometry of the reservoir; this uncertainty, together with the lack of reliable seismic data, makes it difficult to build a robust geological model. High-resolution dipmeter data are highly useful in steeply dipping zones. This paper uses FMI and OBMI borehole images (dipmeter) to analyze the structures below the PUK unconformity from two wells recently drilled in the North Gialo field (a mature reservoir). In addition, the borehole images introduce new evidence that the PUK unconformity is angular and that the bedding planes within the Nubian Formation (below the PUK) are significantly tilted. Structural dips extracted from the high-resolution borehole images are used to construct a new geological model with the latest software technology. It is therefore important to use advanced well log technology such as FMI-HD for any future drilling and to update the existing model in order to minimize the structural uncertainty.

Keywords: FMI (formation micro imager), OBMI (oil-base mud imager), UBI (ultrasonic borehole imager), Nubian sandstone reservoir in North Gialo

Procedia PDF Downloads 308
27993 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of its most important steps. In this paper, a semi-automated method is presented for liver and liver lesion segmentation using mathematical morphology. Our algorithm runs in two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver. The second step detects the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions, based on anatomical information and the mathematical morphology tools used in the image processing field. At first, we improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. The second step calculates the internal and external markers of the liver and hepatic lesions. Thereafter, we segment the liver and hepatic lesions with the marker-controlled watershed transform. The developed algorithm was validated on several images, and the obtained results show its good performance.
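As a toy illustration of one morphological building block named above, the sketch below implements binary dilation with a 3x3 structuring element in pure Python on a tiny binary mask. A real pipeline would apply such operators (together with erosion, opening, and the marker-controlled watershed) to full CT volumes via an image-processing library.

```python
# Toy sketch of binary dilation (3x3 structuring element) on a small mask.
# Real segmentation pipelines operate on CT volumes with library routines.

def dilate(mask):
    """Set each output pixel to 1 if any pixel in its 3x3 neighborhood is 1."""
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and mask[rr][cc]:
                        out[r][c] = 1
    return out

mask = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
print(dilate(mask))
```

Erosion is the dual operation (a pixel survives only if its whole neighborhood is set), and combinations of the two yield the morphological filters used to clean the liver mask before the watershed step.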

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm

Procedia PDF Downloads 439
27992 Examining the Skills of Establishing Number and Space Relations of Science Students with the 'Integrative Perception Test'

Authors: Nisa Yenikalayci, Türkan Aybike Akarca

Abstract:

The ability to relate number and space, one of the basic scientific process skills, is used in transforming a two-dimensional object into a three-dimensional image and in expressing the symmetry axes of an object. This research aims to determine the ability of science students to establish number and space relations. It was carried out with a total of 90 students enrolled in the first semester of the Science Education program of a state university in Turkey's Black Sea Region in the fall semester of the 2017-2018 academic year. An 'Integrative Perception Test (IPT)' was designed by the researchers to collect the data. To build the IPT, courses and workbooks specific to the field of science were scanned, and visual items with a symmetrical structure belonging to the 'Physics - Chemistry - Biology' sub-fields were selected and listed. During the application, students were expected to imagine and draw the missing half of visual items that were presented incomplete. The data obtained from the test, which contains 30 images or pictures in total (f Physics = 10, f Chemistry = 10, f Biology = 10), were analyzed descriptively based on the drawings created by the students as 'complete (2 points), incomplete/wrong (1 point), empty (0 points)'. For teaching new concepts to younger age groups, images or pictures showing symmetrical structures and similar applications can also be used.
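The descriptive scoring scheme (complete = 2, incomplete/wrong = 1, empty = 0) lends itself to a simple per-sub-field tally. The sketch below shows that tally; the sample responses are invented for illustration, not data from the study.

```python
# Minimal sketch of the IPT scoring rule; the responses are invented examples.
SCORES = {"complete": 2, "incomplete": 1, "empty": 0}

def subfield_totals(responses):
    """Sum the per-drawing scores for each sub-field (physics/chemistry/biology)."""
    totals = {}
    for subfield, outcome in responses:
        totals[subfield] = totals.get(subfield, 0) + SCORES[outcome]
    return totals

responses = [
    ("physics", "complete"), ("physics", "incomplete"),
    ("chemistry", "complete"), ("biology", "empty"),
]
print(subfield_totals(responses))
```

With 10 items per sub-field, each student's sub-field total would range from 0 to 20 under this rule.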

Keywords: integrative perception, number and space relations, science education, scientific process skills

Procedia PDF Downloads 144
27991 Implementation of Smart Card Automatic Fare Collection Technology in Small Transit Agencies for Standards Development

Authors: Walter E. Allen, Robert D. Murray

Abstract:

Many large transit agencies have adopted RFID technology and electronic automatic fare collection (AFC) or smart card systems, but small and rural agencies remain tied to obsolete manual, cash-based fare collection. Small countries and transit agencies can benefit from implementing smart card AFC technology, with the promise of increased passenger convenience, added passenger satisfaction, and improved agency efficiency. For transit agencies, it reduces revenue loss and improves passenger flow and bus stop data. For countries, further extension into security, distribution of social services, or currency transactions can provide greater benefits. However, small countries and transit agencies cannot afford the expensive proprietary smart card solutions typically offered by the major system suppliers. Deployment of the Contactless Fare Media System (CFMS) Standard eliminates the proprietary solution, ultimately lowering the cost of implementation. Acumen Building Enterprise, Inc. chose the existing proprietary YCAT smart card system of the Yuma County Intergovernmental Public Transportation Authority (YCIPTA) to implement CFMS. The revised system enables the purchase of fare products online with prepaid debit or credit cards using the Payment Gateway Processor. Open and interoperable smart card standards for transit have been developed. During the 90-day pilot operation, the transit agency gathered the data from the bus AcuFare 200 Card Reader, loaded (copied) the data to a USB thumb drive, and uploaded the data to the Acumen Host Processing Center for consolidation into the transit agency's master data file. The transition from the existing proprietary smart card data format to the new CFMS smart card data format was transparent to the transit agency's cardholders.
It was proven that open standards and an interoperable design can work and reduce both implementation and operational costs for small transit agencies or countries looking to expand smart card technology. Acumen was able to avoid implementing the Payment Card Industry (PCI) Data Security Standards (DSS), which are expensive to develop and costly to operate on a continuing basis. Due to the substantial additional complexities of implementation and the variety of options presented to the transit agency cardholder, Acumen chose to implement only the Directed Autoload. To improve the implementation efficiency and the results of a similar undertaking, it should be considered that some passengers lack credit cards and are averse to technology. There are more than 1,300 small and rural agencies in the United States, and this number grows tenfold when considering small countries and rural locations throughout Latin America and the world. Acumen is evaluating additional countries, sites, and transit agencies that can benefit from smart card systems. Frequently, payment card systems require extensive security procedures for implementation. The project demonstrated the ability to purchase fare value, rides, and passes with credit cards on the internet at a reasonable cost, without highly complex security requirements.

Keywords: automatic fare collection, near field communication, small transit agencies, smart cards

Procedia PDF Downloads 270
27990 Evidence-Based in Telemonitoring of Users with Pacemakers at Five Years after Implant: The Poniente Study

Authors: Antonio Lopez-Villegas, Daniel Catalan-Matamoros, Remedios Lopez-Liria

Abstract:

Objectives: The purpose of this study was to analyze clinical data, health-related quality of life (HRQoL), and functional capacity of patients using a telemonitoring follow-up system (TM) compared to patients followed up through standard outpatient visits (HM) 5 years after pacemaker implantation. Methods: This is a controlled, non-randomized, non-blinded clinical trial, with data collection carried out 5 years after pacemaker implantation. The study was developed at the Hospital de Poniente (Almeria, Spain) between October 2012 and November 2013. The same clinical outcomes were analyzed in both follow-up groups. Health-related quality of life and functional capacity were assessed through the EuroQol-5D (EQ-5D) questionnaire and the Duke Activity Status Index (DASI), respectively. Sociodemographic characteristics and clinical data were also analyzed. Results: 5 years after pacemaker implantation, 55 of the 82 initial patients finished the study. Users with pacemakers had been assigned to either a conventional hospital follow-up group (HM = 34, 50 initially) or a telemonitoring system group (TM = 21, 32 initially). No significant differences were found between the groups in sociodemographic characteristics, clinical data, health-related quality of life, or functional capacity according to the medical records and the EQ-5D and DASI questionnaires. In addition, conventional follow-up visits to the hospital were reduced by 44.84% (p < 0.001) in the telemonitoring group relative to the hospital monitoring group. Conclusion: The results obtained in this study suggest that the telemonitoring of users with pacemakers is an option equivalent to conventional hospital follow-up in terms of health-related quality of life and functional capacity. Furthermore, it allows for the early detection of cardiovascular and pacemaker-related events and significantly reduces the number of in-hospital visits. Trial registration: ClinicalTrials.gov NCT02234245.
The PONIENTE study has been funded by the General Secretariat for Research, Development and Innovation, Regional Government of Andalusia (Spain), project reference number PI/0256/2017, under the research call 'Development and Innovation Projects in the Field of Biomedicine and Health Sciences', 2017.

Keywords: cardiovascular diseases, health-related quality of life, pacemakers follow-up, remote monitoring, telemedicine

Procedia PDF Downloads 117
27989 Introducing Global Navigation Satellite System Capabilities into IoT Field-Sensing Infrastructures for Advanced Precision Agriculture Services

Authors: Savvas Rogotis, Nikolaos Kalatzis, Stergios Dimou-Sakellariou, Nikolaos Marianos

Abstract:

As precision holds the key to introducing distinct benefits in agriculture (e.g., energy savings, reduced labor costs, optimal application of inputs, improved products and yields), it steadily becomes evident that new initiatives should focus on rendering Precision Agriculture (PA) more accessible to the average farmer. PA leverages technologies such as the Internet of Things (IoT), earth observation, robotics, and positioning systems (the Global Navigation Satellite System, GNSS, including individual constellations like GPS, GLONASS, and Galileo) that enable everything from simple data georeferencing to optimal navigation of agricultural machinery to even more complex tasks like variable rate applications. An identified customer pain point is that, on the one hand, typical triangulation-based positioning systems are not accurate enough (with errors up to several meters), while on the other hand, high-precision positioning systems reaching centimeter-level accuracy are very costly (up to thousands of euros). Within this paper, a Ground-Based Augmentation System (GBAS) is introduced that can be adapted to any existing IoT field-sensing station infrastructure. The latter should cover a minimum set of requirements; in particular, each station should operate as a fixed, energy-supplying unit with an unobstructed view of the sky. Station augmentation will allow the stations to function in pairs with GNSS rovers, following the differential GNSS base-rover paradigm. This constitutes the key innovation of the proposed solution: encompassing differential GNSS capabilities within an IoT field-sensing infrastructure. Integrating this kind of information supports the provision of several additional beneficial PA services, such as spatial mapping, route planning, and automatic field navigation of unmanned vehicles (UVs). Right at the heart of the designed system is a high-end GNSS toolkit with base-rover variants and Real-Time Kinematic (RTK) capabilities.
The GNSS toolkit had to tackle all the availability, performance, interfacing, and energy-related challenges faced in real-time, low-power, and reliable in-the-field operation. In terms of performance, preliminary findings exhibit a rover positioning precision that can even reach less than 10 centimeters. As this precision propagates to the full dataset collection, it enables tractors, UVs, Android-powered devices, and measuring units to deal with challenging real-world scenarios. The system is validated with the help of Gaiatrons, a mature network of agro-climatic telemetry stations with a presence all over Greece and beyond (>60,000 ha of agricultural land covered) that constitutes part of the 'gaiasense' (www.gaiasense.gr) smart farming (SF) solution. Gaiatrons constantly monitor atmospheric and soil parameters, thus providing an exact fit to the operational requirements of modern SF infrastructures. Gaiatrons are ultra-low-cost, compact, and energy-autonomous stations with a modular design that enables the integration of advanced GNSS base station capabilities on top of them. A set of demanding pilot demonstrations has been initiated in Stimagka, Greece, an area with a diverse geomorphological landscape where grape cultivation is particularly popular. The pilot demonstrations are validating the preliminary system findings in the intended environment, tackling all technical challenges, and effectively highlighting the added value offered by the system in action.
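The differential base-rover principle mentioned above can be sketched in its simplest form: the base station sits at a precisely surveyed position, so the difference between its measured and true coordinates estimates the error common to nearby receivers, which is then subtracted from the rover's measurement. The sketch below uses toy local east/north coordinates in meters; real RTK works on carrier-phase observables and is far more involved.

```python
# Highly simplified sketch of the differential GNSS base-rover idea.
# Coordinates are toy local east/north values in meters, not real observables.

def dgnss_correct(base_true, base_measured, rover_measured):
    """Estimate the common error at the base and remove it from the rover fix."""
    correction = tuple(m - t for t, m in zip(base_true, base_measured))
    return tuple(r - c for r, c in zip(rover_measured, correction))

base_true = (100.00, 200.00)       # surveyed base position
base_measured = (101.20, 198.70)   # base receiver output, with common error
rover_measured = (351.15, 420.75)  # rover output affected by the same error

print(dgnss_correct(base_true, base_measured, rover_measured))
```

The assumption doing the work is spatial correlation of the error sources (ionosphere, troposphere, orbits), which holds well over the baseline lengths typical of a field-sensing station paired with rovers on the same farm.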

Keywords: GNSS, GBAS, precision agriculture, RTK, smart farming

Procedia PDF Downloads 107