Search results for: humanitarian data ecosystem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25296

21966 Evaluate the Changes in Stress Level Using Facial Thermal Imaging

Authors: Amin Derakhshan, Mohammad Mikaili, Mohammad Ali Khalilzadeh, Amin Mohammadian

Abstract:

This paper proposes a stress recognition system based on multi-modal bio-potential signals. For stress recognition, Support Vector Machines (SVM) and Linear Discriminant Analysis (LDA) are applied to design the stress classifiers, and their characteristics are investigated. The classifiers are trained and tested using data gathered during psychological polygraph experiments. The pattern recognition method separates stressful from non-stressful subjects based on labels derived from the polygraph data. The classification accuracy is 96% for 12 subjects, which suggests that facial thermal imaging, thanks to its non-contact nature, could be a remarkable alternative to conventional psycho-physiological methods.
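
A minimal sketch of the classification step described above, assuming features have already been extracted from the thermal images; the feature matrix, labels and dimensions below are hypothetical stand-ins, not the authors' data or code:

```python
# Minimal sketch (not the authors' code): training an SVM stress classifier on
# features extracted from facial thermal images, with polygraph-derived labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))          # e.g. mean/variance of temperature in facial regions (hypothetical)
y = rng.integers(0, 2, size=120)       # 1 = stressful, 0 = non-stressful (polygraph-derived labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)   # subject-wise splits would be preferable in practice
print("mean CV accuracy:", scores.mean())
```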

Keywords: stress, thermal imaging, face, SVM, polygraph

Procedia PDF Downloads 472
21965 Entrepreneurs’ Perceptions of the Economic, Social and Physical Impacts of Tourism

Authors: Oktay Emir

Abstract:

The objective of this study is to determine how entrepreneurs perceive the economic, social and physical impacts of tourism. The study was conducted in the city of Afyonkarahisar, Turkey, which is rich in thermal tourism resources and investments. A survey was used as the data collection method, and the questionnaire was administered to 472 entrepreneurs selected by simple random sampling. Independent samples t-tests and ANOVA were used to analyse the data. Statistically significant differences (p < 0.05) were found in the participants' opinions about the social, economic and physical impacts of tourism activities according to their demographic characteristics.

Keywords: tourism, perception, entrepreneurship, entrepreneurs, structural equation modelling

Procedia PDF Downloads 439
21964 Unsupervised Learning and Similarity Comparison of Water Mass Characteristics with Gaussian Mixture Model for Visualizing Ocean Data

Authors: Jian-Heng Wu, Bor-Shen Lin

Abstract:

The temperature-salinity relationship is one of the most important characteristics used for identifying water masses in marine research. Temperature-salinity characteristics, however, may change dynamically with geographic location and are quite sensitive to depth at the same location. When depth is taken into consideration, it is not easy to compare the characteristics of different water masses efficiently over a wide area of the ocean. In this paper, the Gaussian mixture model is proposed to analyze the temperature-salinity-depth characteristics of water masses, based on which comparisons between water masses may be conducted. A Gaussian mixture model describes the distribution of a random vector as a weighted sum of multivariate normal distributions. The temperature-salinity-depth data for different locations are first used to train a set of Gaussian mixture models individually. The distance between two Gaussian mixture models can then be defined as the weighted sum of pairwise Bhattacharyya distances between their Gaussian components. Consequently, the distance between two water masses may be computed quickly, which allows the automatic and efficient comparison of water masses over a wide area. The proposed approach not only approximates the distribution of temperature, salinity, and depth directly, without assuming a regression family a priori, but can also restrict model complexity by controlling the number of mixture components when the samples are unevenly distributed. In addition, it is critical for knowledge discovery in marine research to represent, manage and share the temperature-salinity-depth characteristics flexibly and responsively. The proposed approach has been applied to a real-time visualization system of ocean data, which facilitates the comparison of water masses by aggregating the data without degrading the discriminating capabilities. This system provides an interface for interactively querying geographic locations with similar temperature-salinity-depth characteristics and for tracking specific patterns of water masses, such as the Kuroshio near Taiwan or those in the South China Sea.
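
The distance computation described above can be sketched as follows; this is an illustrative implementation under the stated definitions (weighted sum of pairwise Bhattacharyya distances between mixture components), with synthetic temperature-salinity-depth samples rather than real ocean data:

```python
# Minimal sketch (not the authors' implementation): fit one Gaussian mixture model per
# location on (temperature, salinity, depth) samples, then measure the distance between
# two locations as the weighted sum of pairwise Bhattacharyya distances between components.
import numpy as np
from sklearn.mixture import GaussianMixture

def bhattacharyya_gaussian(m1, S1, m2, S2):
    """Bhattacharyya distance between two multivariate Gaussians."""
    S = 0.5 * (S1 + S2)
    dm = (m1 - m2).reshape(-1, 1)
    term1 = 0.125 * float(dm.T @ np.linalg.inv(S) @ dm)
    term2 = 0.5 * np.log(np.linalg.det(S) / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

def gmm_distance(gm_a, gm_b):
    """Weighted sum of pairwise Bhattacharyya distances between mixture components."""
    d = 0.0
    for wa, ma, Sa in zip(gm_a.weights_, gm_a.means_, gm_a.covariances_):
        for wb, mb, Sb in zip(gm_b.weights_, gm_b.means_, gm_b.covariances_):
            d += wa * wb * bhattacharyya_gaussian(ma, Sa, mb, Sb)
    return d

rng = np.random.default_rng(1)
tsd_site1 = rng.normal([20.0, 34.5, 200.0], [2.0, 0.3, 80.0], size=(500, 3))   # synthetic T-S-depth samples
tsd_site2 = rng.normal([15.0, 34.2, 400.0], [2.0, 0.3, 120.0], size=(500, 3))

gm1 = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(tsd_site1)
gm2 = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(tsd_site2)
print("distance between water masses:", gmm_distance(gm1, gm2))
```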

Keywords: water mass, Gaussian mixture model, data visualization, system framework

Procedia PDF Downloads 131
21963 Optimal Data Selection in Non-Ergodic Systems: A Tradeoff between Estimator Convergence and Representativeness Errors

Authors: Jakob Krause

Abstract:

The past financial crisis has shown that contemporary risk management models provide an unjustified sense of security and fail miserably in the situations in which they are needed the most. In this paper, we start from the assumption that risk is a notion that changes over time and that past data points therefore have only limited explanatory power for the current situation. Our objective is to derive the optimal amount of representative information by optimizing between two adverse forces: estimator convergence, which incentivizes us to use as much data as possible, and the aforementioned non-representativeness, which does the opposite. In this endeavor, the cornerstone assumption of having access to identically distributed random variables is weakened and replaced by the assumption that the law of the data generating process changes over time. Hence, this paper gives a quantitative theory of how to perform statistical analysis in non-ergodic systems. As an application, we discuss the impact of a paragraph in the latest iteration of proposals by the Basel Committee on Banking Regulation. We start from the premise that the severity of assumptions should correspond to the robustness of the system they describe. In the formal description of physical systems, the level of assumptions can therefore be much higher. It follows that every concept carried over from the natural sciences to economics must be checked for its plausibility in the new surroundings. Most of probability theory has been developed for the analysis of physical systems and is based on the independent and identically distributed (i.i.d.) assumption. In economics, both parts of the i.i.d. assumption are inappropriate; however, only dependence has so far been weakened to a sufficient degree. In this paper, an appropriate class of non-stationary processes is used, and their law is tied to a formal object measuring representativeness. Subsequently, the data set is identified that, on average, minimizes the estimation error stemming from both insufficient and non-representative data. Applications are far-reaching in a variety of fields. In the paper itself, we apply the results to analyze a paragraph in the Basel 3 framework on banking regulation with severe implications for financial stability. Beyond the realm of finance, other potential applications include the reproducibility crisis in the social sciences (but not in the natural sciences) and the modeling of limited understanding and learning behavior in economics.
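
As a rough illustration of the convergence-versus-representativeness tradeoff (not the paper's formal framework), one can assume the mean of the data drifts linearly at a known rate; averaging the most recent n points then has variance sigma^2/n and a representativeness bias of about delta*(n-1)/2, and the sketch below picks the window length that minimizes the resulting expected squared error:

```python
# A stylized illustration only: for a linearly drifting mean with drift rate delta and
# observation noise variance sigma^2, the expected squared error of the sample mean over
# the most recent n points is  MSE(n) = sigma^2 / n + (delta * (n - 1) / 2)^2.
import numpy as np

def optimal_window(sigma, delta, n_max=5000):
    n = np.arange(1, n_max + 1)
    mse = sigma**2 / n + (delta * (n - 1) / 2.0) ** 2
    return int(n[np.argmin(mse)]), float(mse.min())

# hypothetical parameter values
n_star, mse_star = optimal_window(sigma=1.0, delta=0.01)
print(f"use the most recent {n_star} observations (expected squared error ~ {mse_star:.4f})")
```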

Keywords: banking regulation, non-ergodicity, risk management, semimartingale modeling

Procedia PDF Downloads 135
21962 Sustainable Happiness of Thai People: Monitoring the Thai Happiness Index

Authors: Kalayanee Senasu

Abstract:

This research investigates the influence of different factors, both general and sustainability-related, on the happiness of Thai people. Additionally, this study monitors Thai people's happiness via the Thai Happiness Index developed in 2017. Besides reflecting the happiness level of Thai people, this index also identifies related important issues. The data comprise both related secondary data and primary survey data collected through interviewer-administered questionnaires. The survey used stratified multi-stage sampling by region, province, district, and enumeration area, with simple random sampling within each enumeration area. The data cover 20 provinces, including Bangkok and 4-5 provinces in each of the Northern, Northeastern, Central, and Southern regions. There were 4,960 usable respondents who were at least 15 years old. Statistical analyses included both descriptive and inferential statistics, including hierarchical regression and one-way ANOVA. The Alkire and Foster method was adopted to develop and calculate the Thai Happiness Index. The results reveal that the quality of the household economy plays the most important role in predicting happiness. The results also indicate that quality of family, quality of health, and effectiveness of public administration at the provincial level have positive effects on happiness at about similar levels. For the socio-economic factors, the results reveal that age, education level, and household revenue have significant effects on happiness. For the Thai Happiness Index (THaI), the 2018 value is 0.556. When people are divided into four groups depending upon their degree of happiness, a total of 21.1% of the population are happy, with 6.0% classified as deeply happy and 15.1% as extensively happy. A total of 78.9% of the population are not-yet-happy, with 31.8% classified as narrowly happy and 47.1% as unhappy. The happy group has a THaI value of 0.789, much higher than the value of 0.494 for the not-yet-happy group. Overall, Thai people's happiness is higher than in 2017, when the index was 0.506.
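
For readers unfamiliar with the Alkire and Foster method, the sketch below shows its adjusted headcount computation in the standard deprivation form; the Thai Happiness Index adapts the same machinery to happiness/sufficiency, and the domains, cutoffs and weights used here are purely hypothetical:

```python
# Minimal sketch of the Alkire-Foster adjusted headcount M0 = H * A in its standard form.
import numpy as np

def alkire_foster_M0(X, cutoffs, weights, k):
    """X: n x d achievement matrix; cutoffs: d thresholds; weights sum to 1; k: poverty cutoff."""
    g0 = (X < cutoffs).astype(float)            # deprivation matrix
    scores = g0 @ weights                       # weighted deprivation score per person
    poor = scores >= k                          # identification step
    censored = np.where(poor, scores, 0.0)      # censor scores of the non-poor
    H = poor.mean()                             # headcount ratio
    A = censored[poor].mean() if poor.any() else 0.0   # average intensity among the poor
    return H * A, H, A                          # adjusted headcount M0

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, size=(4960, 4))          # 4,960 respondents, 4 hypothetical domains
cutoffs = np.array([5.0, 5.0, 5.0, 5.0])
weights = np.array([0.25, 0.25, 0.25, 0.25])
M0, H, A = alkire_foster_M0(X, cutoffs, weights, k=1/3)
print(f"M0={M0:.3f}, H={H:.3f}, A={A:.3f}")
```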

Keywords: happiness, quality of life, sustainability, Thai Happiness Index

Procedia PDF Downloads 157
21961 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework

Authors: Abdul Rahman Hamdan

Abstract:

The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. To keep up with the rapid technological change and disruption brought about by the 4IR, technology road mapping has emerged as one of the critical tools for organizations to leverage. Technology road mapping can guide companies to become more adaptable, to anticipate future transformation and innovation, and to avoid becoming redundant or irrelevant as technology advances rapidly. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. The objective of the paper is to provide companies with practical insights and a strategic framework of technology road mapping for navigating the fast-changing nature of the 4IR. This study also contributes to the understanding and practice of technology road mapping in the 4IR and, at the same time, provides organizations with the necessary tools and critical insight to navigate the 4IR transformation. Based on a literature review and case studies, the study analyses key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The paper gives the background of the Fourth Industrial Revolution, explores the disruptive potential of 4IR technologies, and argues for the critical need for technology road mapping, consisting of strategic planning and foresight, to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as a proactive approach for aligning an organisation's objectives and resources with its technology and product development in a fast-evolving technological landscape. The paper covers the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies the stakeholders in the process, such as external experts, collaborative platforms, and cross-functional teams, needed to ensure an integrated and robust technology roadmap for the organisation. Moreover, this study presents a comprehensive framework for technology road mapping in the 4IR that incorporates key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation. It provides a framework for implementing technology road mapping from strategic planning, goal setting, and technology scanning to roadmap visualisation, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations of technology road mapping in the 4IR, including a gap analysis. The study concludes by proposing a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool in the 4IR for driving innovation and remaining competitive in the current and future ecosystem.

Keywords: technology management, technology road mapping, technology transfer, technology planning

Procedia PDF Downloads 56
21960 Using Electrical Impedance Tomography to Control a Robot

Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi

Abstract:

Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes hardware and software sections to perform the imaging and control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to the exact location of the target object. The data set for the impedance imaging is obtained through repeated current injections and voltage measurements across all electrode pairs. After the calculations necessary to obtain the impedance, the information is transmitted to the computer. These data are then processed in MATLAB, interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software), to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated using the MATLAB Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm, and the arm moves to the desired tissue on the user's command.
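
The final mapping from object coordinates to joint angles can be illustrated with a standard planar three-link inverse-kinematics calculation; the arm geometry and link lengths below are assumptions for illustration, not the paper's actual hardware parameters:

```python
# Minimal sketch (geometry assumed, not taken from the paper): inverse kinematics of a
# planar 3-link arm, mapping the object centre (x, y) found by image processing and a
# desired end-effector orientation phi to the three joint angles.
import math

def ik_planar_3dof(x, y, phi, L1, L2, L3):
    # wrist position: step back from the target along the tool orientation
    xw, yw = x - L3 * math.cos(phi), y - L3 * math.sin(phi)
    c2 = (xw**2 + yw**2 - L1**2 - L2**2) / (2 * L1 * L2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.atan2(math.sqrt(1 - c2**2), c2)        # elbow-down solution
    theta1 = math.atan2(yw, xw) - math.atan2(L2 * math.sin(theta2), L1 + L2 * math.cos(theta2))
    theta3 = phi - theta1 - theta2
    return theta1, theta2, theta3

# hypothetical link lengths (cm) and a target centre from the reconstructed EIT image
print(ik_planar_3dof(x=18.0, y=7.0, phi=0.0, L1=12.0, L2=10.0, L3=4.0))
```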

Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography

Procedia PDF Downloads 264
21959 Impact of UV on Toxicity of Zn²⁺ and ZnO Nanoparticles to Lemna minor

Authors: Gabriela Kalcikova, Gregor Marolt, Anita Jemec Kokalj, Andreja Zgajnar Gotvajn

Abstract:

Since the 1990s, nanotechnology has been one of the fastest growing fields of science. Nanomaterials are increasingly becoming part of many products and technologies, and metal oxide nanoparticles are among the most widely used nanomaterials. Zinc oxide nanoparticles (nZnO) are widely used due to their versatile properties; they appear in products including plastics, paints, food, batteries, solar cells and cosmetics, and ZnO is also a very effective photocatalyst used for water treatment. This expanding application of nZnO increases their possible occurrence in the environment. In the aquatic ecosystem, nZnO interact with natural environmental factors such as UV radiation, and thus it is essential to evaluate possible interactions between them. In this context, the aim of our study was to evaluate the combined ecotoxicity of nZnO and Zn²⁺ on the duckweed Lemna minor in the presence or absence of UV. Inhibition of the vegetative growth of Lemna minor was monitored over a period of 7 days in multi-well plates, and the specific growth rate was determined after the experiment. The ZnO nanoparticles used had a primary size of 13.6 ± 1.7 nm. The test was conducted at nominal nZnO and Zn²⁺ (in the form of ZnCl₂) concentrations of 1, 10 and 100 mg/L. The experiment was repeated in the presence of UV of natural intensity (8 h UV; 10 W/m² UVA, 0.5 W/m² UVB). The concentration of Zn during the test was determined by ICP-MS. In the regular experiment (absence of UV), the specific growth rate was slightly increased by low concentrations of nZnO and Zn²⁺ in comparison to the control. However, 10 and 100 mg/L of Zn²⁺ resulted in 45% and 68% inhibition of the specific growth rate, respectively. In the case of nZnO, both concentrations (10 and 100 mg/L) resulted in a similar ~30% inhibition, and the response was not dose-dependent. Such a lack of a dose-response relationship is often observed for nanoparticles; a possible explanation is that physical impacts prevail over chemical ones. In the presence of UV, the toxicity of Zn²⁺ increased, and 100 mg/L of Zn²⁺ caused total inhibition of the specific growth rate (100%). On the other hand, 100 mg/L of nZnO resulted in lower inhibition (19%) than in the experiment without UV (30%). It is thus expected that the tested nZnO has low photoactivity but good UV-absorbing and/or reflective properties, and thus protects duckweed against UV impacts. The measured concentration of Zn in the test suspension decreased by only about 4% after 168 h in the case of ZnCl₂, whereas the concentration of Zn in the nZnO test decreased by 80%. It is expected that nZnO were partially dissolved in the medium and, at the same time, agglomeration and sedimentation of particles took place, so the concentration of Zn at the water surface decreased. The results of our study indicate that UV of natural intensity does not increase the toxicity of nZnO; rather, the particles slightly protect the plant against negative UV effects. When the Zn²⁺ and nZnO results are compared, dissolved Zn appears to play a central role in nZnO toxicity.
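
The specific growth rate and its inhibition are computed with the standard formulas used in Lemna growth inhibition tests, sketched below with hypothetical frond counts rather than the study's measurements:

```python
# Minimal sketch of the standard specific-growth-rate calculation:
#   mu = (ln N_t - ln N_0) / t,  inhibition % = (mu_control - mu_treatment) / mu_control * 100
import math

def specific_growth_rate(n0, nt, days):
    return (math.log(nt) - math.log(n0)) / days

def inhibition_percent(mu_control, mu_treatment):
    return (mu_control - mu_treatment) / mu_control * 100.0

mu_c = specific_growth_rate(n0=12, nt=60, days=7)     # control (hypothetical frond counts)
mu_t = specific_growth_rate(n0=12, nt=35, days=7)     # e.g. a treated group
print(f"inhibition of specific growth rate: {inhibition_percent(mu_c, mu_t):.1f}%")
```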

Keywords: duckweed, environmental factors, nanoparticles, toxicity

Procedia PDF Downloads 320
21958 Introduction of Robust Multivariate Process Capability Indices

Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian

Abstract:

Process capability indices (PCIs) are important concepts in statistical quality control and measure how well a process meets its specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, namely the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classical estimation methods such as the method of moments (MME) and maximum likelihood estimation (MLE) provide good estimates of the population parameters when the data are not contaminated. When outliers exist in the data, however, MME and MLE yield weak estimates, so estimators that perform well in the presence of outliers are needed. In this work, robust M-estimators are used to estimate these parameters, and robust process capability indices are introduced based on them. The performance of these robust estimators in the presence of outliers, and their effect on the process capability indices, is evaluated using real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.
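
A minimal sketch of the idea of plugging robust estimates into a capability calculation; the Minimum Covariance Determinant (MCD) estimator is used here as a stand-in for the paper's M-estimators, and a simple per-characteristic capability ratio is derived from the robust scatter matrix, with hypothetical specification limits:

```python
# Minimal sketch, not the paper's M-estimator: robust location/scatter via MCD, then a
# simple capability ratio per quality characteristic computed from the robust scatter.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(3)
X = rng.multivariate_normal([10.0, 5.0], [[0.04, 0.01], [0.01, 0.02]], size=200)
X[:10] += 3.0                                   # contaminate the data with outliers

mcd = MinCovDet(random_state=0).fit(X)
mu_robust, cov_robust = mcd.location_, mcd.covariance_

LSL = np.array([9.4, 4.6])                      # hypothetical specification limits
USL = np.array([10.6, 5.4])
cp_robust = (USL - LSL) / (6.0 * np.sqrt(np.diag(cov_robust)))   # per-variable capability
print("robust location:", mu_robust)
print("robust Cp per characteristic:", cp_robust)
```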

Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control

Procedia PDF Downloads 274
21957 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers

Authors: Neliswa Dyosi

Abstract:

Purpose - Using an integrated Technology-Organization-Environment (TOE) framework and the Model of Technology Appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach - The study adopts the interpretivist philosophical paradigm, with multiple case studies as the research strategy. For data collection, the study follows a qualitative approach: qualitative data will be collected from six retailers in various industries, using semi-structured interviews and documents as the data collection techniques. Purposive and snowball sampling techniques will be used to identify participants within the organizations, and the data will be analyzed using thematic analysis. Originality/value - Using a deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. The study contributes to theory development by integrating the MTA and TOE frameworks as a means of understanding the technology adoption behaviors of organizations, in this case retailers. It is also the first study that takes an integrated Technology-Organization-Environment (TOE) and MTA approach to understanding the adoption and use of a payment method. South Africa is ranked among the top ten countries in the world for cryptocurrency adoption, yet there is still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will therefore contribute to the existing literature, as bitcoin is gaining popularity as an alternative payment method across the globe.

Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation

Procedia PDF Downloads 121
21956 Simulation of Turbulent Flow in Channel Using Generalized Hydrodynamic Equations

Authors: Alex Fedoseyev

Abstract:

This study explores the Generalized Hydrodynamic Equations (GHE) for the simulation of turbulent flows. The GHE were derived from the Generalized Boltzmann Equation (GBE) by Alexeev (1994). The GBE was obtained from first principles from the chain of Bogolubov kinetic equations and considers particles of finite dimensions (Alexeev, 1994). Compared to the Navier-Stokes equations (NSE), the GHE contain new terms, temporal and spatial fluctuations. These new terms have a timescale multiplier τ, and the GHE reduce to the NSE when τ is zero. The nondimensional τ is the product of the Reynolds number and the squared length scale ratio, τ = Re·(l/L)², where l is the apparent Kolmogorov length scale and L is a hydrodynamic length scale. The turbulence phenomenon is not well understood and is not described by the NSE, for which one or two additional equations are required as a turbulence model that may have to be tuned for specific problems. We show that, in the case of the GHE, no additional turbulence model is needed, and the turbulent velocity profile is obtained directly from the GHE. Two-dimensional turbulent channel and circular pipe flows were investigated using a numerical solution of the GHE for several cases. The solutions are compared with the experimental data for circular pipes and 2D channels by Nikuradse (1932, Prandtl Lab), Hussain and Reynolds (1975), Wei and Willmarth (1989), and Van Doorne (2007), with the theory by Wosnik, Castillo and George (2000), and with the relevant experiments on the Superpipe setup at Princeton, data by Zagarola (1996) and Zagarola and Smits (1998), covering Reynolds numbers from Re=7200 to Re=960000. The numerical solutions compare well with the experimental data, as well as with the approximate analytical solution for turbulent channel flow of Fedoseyev (2023). The obtained results confirm that the Alexeev generalized hydrodynamic theory (GHE) is in good agreement with the experiments for turbulent flows. The proposed approach is limited to 2D and 3D axisymmetric channel geometries. Further work will extend this approach to channels with square and rectangular cross-sections.

Keywords: comparison with experimental data, generalized hydrodynamic equations, numerical solution, turbulent boundary layer, turbulent flow in channel

Procedia PDF Downloads 55
21955 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, S. Pradhan

Abstract:

Current transformers are an integral part of the power system because they provide a proportionally scaled, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to very large current flows, this large current is proportionally injected into the protection and metering circuits. Since protection and metering equipment is designed to withstand only a certain amount of current for a certain time, these high currents pose a risk to people and equipment. During such instances, therefore, the CT saturation characteristics have a strong influence on the safety of both people and equipment, and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor/instrument security factor of current transformers and the resulting change in the saturation characteristics of the CTs. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the LabVIEW data acquisition software, and the analysis is performed on the real-time data gathered with LabVIEW. The variation of the current transformer saturation characteristics with changes in burden is discussed.

Keywords: accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics

Procedia PDF Downloads 407
21954 An Ecological Grandeur: Environmental Ethics in Buddhist Perspective

Authors: Merina Islam

Abstract:

There are many environmental problems, and various countermeasures have been taken to address them. Philosophy is an important contributor to environmental studies, as it takes a deep interest in the analysis of the meaning of the concept of environment and other related concepts. The Buddhist framework, which is virtue-ethical, remains a better alternative to the traditional environmental outlook. While granting the unique role of man in moral deliberations, the Buddhist approach maintains a holistic concept of ecological harmony. Buddhist environmental ethics is more concerned with the complete moral community, the total ecosystem, than with any particular species within the community. The moral reorientation proposed here resembles the concept of 'deep ecology'. Given the present-day prominence of virtue ethics, we need to explore the Buddhist virtue theory further, so that a better framework for treating the natural world can be ensured. The environment has become one of the most widely discussed issues in recent times. Buddhist concepts such as Pratityasamutpadavada, Samvrit Satya, Paramartha Satya, Shunyata, Sanghatvada, Bodhisattva, Santanvada and others deal with interdependence in terms of both internal and external ecology. Internal ecology aims at mental well-being, whereas external ecology deals with physical well-being. The fundamental Buddhist concepts for dealing with environmental problems are that the environment has the same value as humans, following from the two Buddhist doctrines of the Non-duality of Life and its Environment and of Origination in Dependence, and that environmental problems can only be overcome through the practice of the way of the Bodhisattva, because environmental problems are an evil for people and nature. Buddhism establishes that there is a relationship among all the constituents of the world; nothing in the world is independent of anything else, and everything is dependent on others. The realization that everything in the universe is mutually interdependent also shows that humans cannot keep themselves unaffected by ecology. This paper focuses on how the Buddhist identification of nature with the Dhamma can contribute toward transforming our understanding, attitudes, and actions regarding the care of the earth. Environmental ethics in Buddhism presents a logical and thorough examination of the metaphysical and ethical dimensions of early Buddhist literature. From the Buddhist viewpoint, humans are not in a category that is distinct and separate from other sentient beings, nor are they intrinsically superior; all sentient beings are considered to have the Buddha-nature, that is, the potential to become fully enlightened, and Buddhists do not believe in treating non-human sentient beings as objects for human consumption. The significance of the Buddhist theory of interdependence can be understood from the fact that it shows that one's happiness or suffering originates from one's realization or non-realization, respectively, of the dependent nature of everything. It is obvious, even without emphasis, that in the context of today's deep ecological crisis there is a need to instill a consciousness of interdependence.

Keywords: Buddhism, deep ecology, environmental problems, Pratityasamutpadavada

Procedia PDF Downloads 299
21953 Crowdsensing Project in the Brazilian Municipality of Florianópolis for the Number of Visitors Measurement

Authors: Carlos Roberto De Rolt, Julio da Silva Dias, Rafael Tezza, Luca Foschini, Matteo Mura

Abstract:

Seasonal population fluctuation presents a challenge to touristic cities, since the number of inhabitants can double according to the season. The aim of this work is to develop a model that correlates the waste collected with the population of the city and that also allows cooperation between the inhabitants and the local government. The model allows public managers to evaluate the impact of the seasonal population fluctuation on waste generation and to improve the planning of resource utilization throughout the year. The study uses data from the company that collects the garbage in Florianópolis, a Brazilian city that attracts tourists thanks to its numerous beaches and warm weather. The fluctuations are caused by the number of people who come to the city throughout the year for holidays, summer vacations or business events. Crowdsensing is accomplished through smartphones with access to a data collection app, with voluntary participation of the population, and participants can access the information collected in waves through a portal. Crowdsensing represents an innovative and participatory approach that involves the population in gathering information to improve the quality of life. The management of crowdsensing solutions plays an essential role, given the complexity of fostering collaboration, establishing the available sensors, and collecting and processing the data. The practical implications of the tool described in this paper refer, for example, to the management of seasonal tourism in a large municipality whose public services are impacted by the floating population. Crowdsensing and big data support managers in predicting the arrival, permanence, and movement of people in a given urban area. Also, by linking crowdsourced data to databases from other public service providers - e.g., water, garbage collection, electricity, public transport, telecommunications - it is possible to estimate the floating population of an urban area affected by seasonal tourism. This approach supports the municipality in increasing the effectiveness of resource allocation while, at the same time, increasing the quality of the service as perceived by citizens and tourists.

Keywords: big data, dashboards, floating population, smart city, urban management solutions

Procedia PDF Downloads 275
21952 Debris Flow Mapping Using Geographical Information System Based Model and Geospatial Data in Middle Himalayas

Authors: Anand Malik

Abstract:

The Himalayas, with their high tectonic activity, pose a great threat to human life and property. Climate change is another factor triggering extreme events, with a multi-fold effect on the high-mountain glacial environment: rock falls, landslides, debris flows, flash floods and snow avalanches. One such extreme event, a cloudburst together with the breach of the moraine-dammed Chorabari Lake, occurred from June 14 to June 17, 2013, and triggered flooding of the Saraswati and Mandakini rivers in the Kedarnath Valley of Rudraprayag district, Uttarakhand state, India. As a result, a huge volume of water moving at high velocity created a catastrophe of the century, which resulted in the loss of a large number of human and animal lives and severe damage to pilgrimage, tourism, agriculture and property. A comprehensive assessment of debris flow hazards therefore requires GIS-based modeling using numerical methods. The aim of the present study is the analysis and mapping of debris flow movements using geospatial data with Flow-R (developed by the team at IGAR, University of Lausanne). The model is based on combined probabilistic and energetic algorithms for assessing the spreading of the flow and the maximum runout distances. An ASTER Digital Elevation Model (DEM) with a 30 m x 30 m cell size (resolution) is used as the main geospatial data for the runout assessment, while Landsat data are used to analyze land use and land cover change in the study area. The results show that the model can be applied with great accuracy, as it is very useful in determining debris flow areas, and the results are compared with the existing available landslide/debris flow maps. ArcGIS software is used to prepare the runout susceptibility maps, which can be used in debris flow mitigation and future land use planning.

Keywords: debris flow, geospatial data, GIS based modeling, flow-R

Procedia PDF Downloads 257
21951 Causes of Terrorism: Perceptions of University Students of Teacher Training Institutions

Authors: Saghir Ahmad, Abid Hussain Ch, Misbah Malik, Ayesha Batool

Abstract:

Terrorism is a phenomenon in which a dreadful situation is created by a group of individuals who view themselves as oppressed by society. Terrorism is the unlawful use of force or violence by a person or an organized group against people or property, with the intention of intimidating or coercing societies or governments, frequently for ideological or political reasons. Terrorism is as old as humanity. The main aim of the study was to find out the causes of terrorism as perceived by university students of teacher training institutions. This study was quantitative in nature, and the survey method was used to collect data. A sample of two hundred and sixty-seven students was selected from public universities. A five-point Likert scale was used to collect data. Means, standard deviations, independent samples t-tests, and one-way ANOVA were applied to analyze the data. The major findings of the study indicated that students perceive the main causes of terrorism to be poverty, foreign interference, a misguided concept of Islamization, and social injustice. Most students also think that drone attacks promote terrorist activities. Education is the key to eliminating terrorism: there is a need to educate people, and especially the young, in order to bring peace to the world.

Keywords: dreadful circumstance, governments, power, students, terrorism

Procedia PDF Downloads 531
21950 Livestock Production in Vietnam: Technical Efficiency and Productivity Performance Based on Regional Differences

Authors: Diep Thanh Tung

Abstract:

This study aims to measure the technical efficiency and examine the productivity performance of livestock production in the regions of Vietnam, based on panel data for 2008–2012. Although some regions improved their efficiency over the four years, low technical efficiency and poor productivity performance and its components remain dominant features in almost all regions. Households that depend heavily on livestock income within their agricultural income, or on agricultural income within their total income, are more vulnerable than others in terms of livestock production.
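
A minimal sketch of the kind of efficiency computation underlying data envelopment analysis: an input-oriented, constant-returns-to-scale (CCR) score for one decision-making unit solved as a linear program. The inputs, outputs and model details are illustrative assumptions, not the study's meta-frontier specification:

```python
# Minimal sketch (not the paper's exact model): input-oriented CCR efficiency via a
# linear program, with hypothetical input/output data (columns = decision-making units).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """X: m x n inputs, Y: s x n outputs; returns the efficiency theta for DMU j0."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta; variables = [theta, lambdas]
    A_in = np.hstack([-X[:, [j0]], X])               # sum_j lambda_j x_ij - theta * x_ij0 <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # -sum_j lambda_j y_rj <= -y_rj0
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

X = np.array([[2.0, 3.0, 6.0, 9.0],                  # e.g. feed, labour (hypothetical)
              [3.0, 2.0, 7.0, 8.0]])
Y = np.array([[3.0, 5.0, 7.0, 8.0]])                 # e.g. livestock output value (hypothetical)
for j in range(X.shape[1]):
    print(f"DMU {j}: efficiency = {dea_ccr_efficiency(X, Y, j):.3f}")
```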

Keywords: data envelopment analysis, meta-frontier, Malmquist, technical efficiency, livestock production

Procedia PDF Downloads 691
21949 Determining the Information Technologies Usage and Learning Preferences of Construction Students

Authors: Naci Büyükkaracığan, Yıldırım Akyol

Abstract:

Information technology is the technology that enables the transmission of information anywhere, regardless of time, location and distance. Today, information technology is bringing about ground-breaking changes in all areas of our daily lives, and information can quickly reach millions of people with its help. In this study, the effects of information technology on students' education and their learning preferences were examined using data obtained from questionnaires administered to students of the Construction Department of Selcuk University Kadınhanı Faik İçil Vocational School in the 2015-2016 academic year. The data were obtained through a questionnaire consisting of 30 questions prepared by the researchers. The SPSS 21.0 package programme was used for the statistical analysis of the data; chi-square tests, the Mann-Whitney U test, Kruskal-Wallis and Kolmogorov-Smirnov tests were used together with descriptive statistics. In the study, conducted with the participation of 61 students, 93.4% of the students reported owning their own information and communication device (computer, smartphone, etc.) and having access to the internet at the same rate, while 45.90% of the students own a computer of their own. The main reasons for the students' use of the internet were social networking sites (85.24%) and following news sites (13.11%). All students stated that they use information technology in the preparation of their assignments and projects. When the students' preferred sources for acquiring scientific knowledge related to the profession were evaluated, the internet was clearly their preferred source. The daily use of information technology by male students was statistically significantly lower than that of female students. Furthermore, 72.13% of the students stated that they are eager to learn construction software packages, and 91.80% agreed that information technology is an indispensable element of professional advancement.

Keywords: information technologies, computer, construction, internet, learning systems

Procedia PDF Downloads 287
21948 Students' Perceptions of Assessment and Feedback in Higher Education

Authors: Jonathan Glazzard

Abstract:

National student satisfaction data in England demonstrate that undergraduate students are less satisfied overall with assessment and feedback than with other aspects of their higher education courses. Given that research findings suggest that high-quality feedback is a critical factor associated with academic achievement, it is important that feedback enables students to demonstrate improved academic achievement in their subsequent assessments. Given the growing importance of staff-student partnerships in higher education, this research examined students' perceptions of assessment and feedback in one UK university. Students' perceptions were elicited through a university-wide survey completed by undergraduate students. In addition, three focus groups were used to provide qualitative student perception data across the three university faculties. The data indicate that whilst students valued detailed feedback on their work, less detailed feedback could be compensated for by the development of pre-assessment literacy skills which are front-loaded into courses. Assessment literacy skills valued by students included the use of clear assessment criteria and assignment briefings which enabled students to fully understand the assessment task. Additionally, students valued pre-assessment literacy tasks which enabled them to understand the standards which they were expected to achieve. Students valued opportunities for self- and peer-assessment prior to the final assessment, and formative assessment feedback which matched the summative assessment feedback. Students also valued dialogic face-to-face feedback after receiving written feedback. Above all, students valued feedback which was particular to their work and which gave recognition for the effort they had put into completing specific assessments. The data indicate that there is a need for higher education lecturers to receive systematic training in assessment and feedback which provides a comprehensive grounding in pre-assessment literacy skills.

Keywords: formative assessment, summative assessment, feedback, marking

Procedia PDF Downloads 305
21947 Dendroremediation of a Defunct Lead Acid Battery Recycling Site

Authors: Alejandro Ruiz-Olivares, M. del Carmen González-Chávez, Rogelio Carrillo-González, Martha Reyes-Ramos, Javier Suárez Espinosa

Abstract:

The use of automobiles has increased and, proportionally, so has the demand for the batteries that power them. When a battery is aged, all of its materials are reused through lead-acid battery recycling (LABR). The importation of used lead-acid batteries into Mexico has increased in recent years, since many recycling factories have been established in the country. Inadequate disposal of lead-acid battery recycling (LABR) wastes left soil severely polluted with Pb, Cu, and salts (Na⁺, SO₄²⁻, PO₄³⁻). Soil organic amendments may contribute essential nutrients and sequester metals (acting as scavenger compounds) to allow plant establishment. The objective of this research was to revegetate a former lead-acid battery recycling site with the aid of organic amendments. Seven tree species (Acacia farnesiana, Casuarina equisetifolia, Cupressus lusitanica, Eucalyptus obliqua, Fraxinus excelsior, Prosopis laevigata and Pinus greggii) and two organic amendments (vermicompost and a vermicompost + sawdust mixture) were tested for phytoremediation of the defunct LABR site. Plants were irrigated during the dry season. The soils were monitored during the experiment: available metals, salt concentrations and their spatial pattern in the soil were analyzed. Plant species and amendments were compared through analysis of covariance and longitudinal analysis. High concentrations of extractable (DTPA-TEA-CaCl₂) metals (up to 15,685 mg kg⁻¹ for Pb and 478 mg kg⁻¹ for Cu) and soluble salts (292 mg kg⁻¹ for PO₄³⁻ and 23,578 mg kg⁻¹ for SO₄²⁻) were found in the soil three and six months after setting up the experiment. Lead and Cu concentrations were depleted in the rhizosphere after the addition of the amendments. The spatial patterns of PO₄³⁻, SO₄²⁻ and DTPA-extractable Pb and Cu changed slightly through time. In spite of the extreme soil conditions, the planted species A. farnesiana, E. obliqua, C. equisetifolia and F. excelsior had 100% survival. Available metals and salts affected each species differently. In addition, a negative effect on growth due to Pb accumulated in shoots was observed only in C. lusitanica. Many specimens accumulated high concentrations of Pb (>1,000 mg kg⁻¹) in their shoots. C. equisetifolia and C. lusitanica had the best growth rates. Based on the results, all the evaluated species may be useful for the revegetation of Pb-polluted soils. Besides their use in phytoremediation, some ecosystem services can be obtained from the woodland, such as encouraging wildlife, wood production, and carbon sequestration. Further research should be conducted to analyze these services.

Keywords: heavy metals, inadequate disposal, organic amendments, phytoremediation with trees

Procedia PDF Downloads 270
21946 Cybersecurity Assessment of Decentralized Autonomous Organizations in Smart Cities

Authors: Claire Biasco, Thaier Hayajneh

Abstract:

A smart city is the integration of digital technologies in urban environments to enhance the quality of life. Smart cities capture real-time information from devices, sensors, and network data to analyze and improve city functions such as traffic analysis, public safety, and environmental impacts. Current smart cities face controversy due to their reliance on real-time data tracking and surveillance. Internet of Things (IoT) devices and blockchain technology are converging to reshape smart city infrastructure away from its centralized model. Connecting IoT data to blockchain applications would create a peer-to-peer, decentralized model. Furthermore, blockchain technology powers the ability for IoT device data to shift from the ownership and control of centralized entities to individuals or communities with Decentralized Autonomous Organizations (DAOs). In the context of smart cities, DAOs can govern cyber-physical systems to have a greater influence over how urban services are being provided. This paper will explore how the core components of a smart city now apply to DAOs. We will also analyze different definitions of DAOs to determine their most important aspects in relation to smart cities. Both categorizations will provide a solid foundation to conduct a cybersecurity assessment of DAOs in smart cities. It will identify the benefits and risks of adopting DAOs as they currently operate. The paper will then provide several mitigation methods to combat cybersecurity risks of DAO integrations. Finally, we will give several insights into what challenges will be faced by DAO and blockchain spaces in the coming years before achieving a higher level of maturity.

Keywords: blockchain, IoT, smart city, DAO

Procedia PDF Downloads 99
21945 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) is a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access memory (RAM), the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to a single unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels that are present only in certain units rather than in the whole observation period. The chain supports real-time processing of the continuous data, and the delay in creating displacement maps can be shortened because there is no need to wait for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on the stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally averaged images are then processed by a dedicated interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and the selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the sub-millimetre level are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
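
One of the building blocks mentioned above, coherence estimation, can be sketched as follows; this is a generic windowed coherence estimator on synthetic complex images, not the package's robust implementation:

```python
# Minimal sketch: windowed estimation of interferometric coherence between two
# co-registered complex images, the basic quantity behind coherence-based pixel selection.
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    """|E[s1 s2*]| / sqrt(E|s1|^2 E|s2|^2) estimated over a win x win window."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(np.real(cross), win) + 1j * uniform_filter(np.imag(cross), win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) * uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

rng = np.random.default_rng(4)
shape = (128, 128)
signal = rng.normal(size=shape) + 1j * rng.normal(size=shape)
s1 = signal + 0.2 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
s2 = signal + 0.2 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
gamma = coherence(s1, s2)
print("mean coherence:", gamma.mean())   # pixels above a chosen threshold would be kept
```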

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 146
21944 Perception of Hygiene Knowledge among Staff Working in Top Five Famous Restaurants of Male’

Authors: Zulaikha Reesha Rashaad

Abstract:

One of the major factors that can contribute greatly to the success of catering businesses is employing food and beverage staff with sound hygiene knowledge. Individuals with sound hygiene knowledge have a higher chance of following safe food practices in food production, and a lack of hygiene knowledge among food and beverage staff working in catering establishments and restaurants has been identified as one of the leading causes of food poisoning and food-borne illness. This research analyzes the hygiene knowledge of food and beverage staff working in the top five restaurants of Male', in relation to their age, educational background, occupation and training. The research uses quantitative and descriptive methods for data collection and data analysis. Data were obtained through a random sampling technique with self-administered survey questionnaires, which were completed by 60 respondents working in 5 different restaurants operating at the top level in Male'. The respondents were service staff and chefs working in these restaurants. The responses to the questionnaires were analyzed using SPSS. The results indicate that age, education level, occupation and training correlated with hygiene knowledge perception scores.

Keywords: food and beverage staff, food poisoning, food production, hygiene knowledge

Procedia PDF Downloads 271
21943 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor

Authors: Jinseon Song, Yongwan Park

Abstract:

In this paper, we propose a method that can estimate a user's position based on an image database built from a single camera. Previous positioning methods calculate distance from the arrival time of signals, as in GPS (Global Positioning System) or RF (Radio Frequency) systems. However, these methods have the weakness that their error range becomes large under signal interference. One solution is to estimate position with a camera sensor, but a single camera makes it difficult to obtain relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data. In this research, we first build an image database of the space in which the positioning service is to be provided, using a single camera. Next, we judge similarity through image matching between the database images and the image transmitted by the user. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we performed experiments in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user's position but also the direction the user is facing.
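
A minimal sketch of the database-matching step, assuming a database of reference images with known capture positions; ORB is used here as a freely available stand-in for SURF (which the paper uses but which requires opencv-contrib), and the file names and positions are hypothetical:

```python
# Minimal sketch (not the authors' code): match a query image against a database of
# reference images with known positions and return the position of the best match.
import cv2

def estimate_position(query_path, database):
    """database: list of (image_path, (x, y)) with known capture positions."""
    orb = cv2.ORB_create(nfeatures=1000)
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, q_des = orb.detectAndCompute(query, None)

    best_pos, best_score = None, -1
    for path, pos in database:
        ref = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if ref is None or q_des is None:
            continue
        _, r_des = orb.detectAndCompute(ref, None)
        if r_des is None:
            continue
        matches = bf.match(q_des, r_des)
        score = sum(1 for m in matches if m.distance < 40)   # count of strong matches
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos, best_score

# hypothetical database of reference images and their capture positions
db = [("ref_hall_01.jpg", (1.0, 2.5)), ("ref_hall_02.jpg", (4.0, 2.5))]
print(estimate_position("query.jpg", db))
```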

Keywords: positioning, distance, camera, features, SURF(Speed-Up Robust Features), database, estimation

Procedia PDF Downloads 335
21942 AI Applications in Accounting: Transforming Finance with Technology

Authors: Alireza Karimi

Abstract:

Artificial Intelligence (AI) is reshaping various industries, and accounting is no exception. With the ability to process vast amounts of data quickly and accurately, AI is revolutionizing how financial professionals manage, analyze, and report financial information. In this article, we will explore the diverse applications of AI in accounting and its profound impact on the field. Automation of Repetitive Tasks: One of the most significant contributions of AI in accounting is automating repetitive tasks. AI-powered software can handle data entry, invoice processing, and reconciliation with minimal human intervention. This not only saves time but also reduces the risk of errors, leading to more accurate financial records. Pattern Recognition and Anomaly Detection: AI algorithms excel at pattern recognition. In accounting, this capability is leveraged to identify unusual patterns in financial data that might indicate fraud or errors. AI can swiftly detect discrepancies, enabling auditors and accountants to focus on resolving issues rather than hunting for them. Real-Time Financial Insights: AI-driven tools, using natural language processing and computer vision, can process documents faster than ever. This enables organizations to have real-time insights into their financial status, empowering decision-makers with up-to-date information for strategic planning. Fraud Detection and Prevention: AI is a powerful tool in the fight against financial fraud. It can analyze vast transaction datasets, flagging suspicious activities and reducing the likelihood of financial misconduct going unnoticed. This proactive approach safeguards a company's financial integrity. Enhanced Data Analysis and Forecasting: Machine learning, a subset of AI, is used for data analysis and forecasting. By examining historical financial data, AI models can provide forecasts and insights, aiding businesses in making informed financial decisions and optimizing their financial strategies. Artificial Intelligence is fundamentally transforming the accounting profession. From automating mundane tasks to enhancing data analysis and fraud detection, AI is making financial processes more efficient, accurate, and insightful. As AI continues to evolve, its role in accounting will only become more significant, offering accountants and finance professionals powerful tools to navigate the complexities of modern finance. Embracing AI in accounting is not just a trend; it's a necessity for staying competitive in the evolving financial landscape.
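
As a hedged illustration of the anomaly-detection idea mentioned above (not a production fraud system), an Isolation Forest can flag unusual transactions in a simple numeric feature table; the data and the choice of features below are synthetic:

```python
# Minimal sketch: Isolation Forest flags unusual transactions in a small feature table.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(5)
normal = np.column_stack([rng.lognormal(3, 0.4, 1000),    # transaction amount
                          rng.integers(1, 29, 1000)])     # day of month
suspicious = np.array([[5000.0, 31], [4200.0, 31], [3900.0, 30]])
transactions = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
flags = model.predict(transactions)              # -1 = anomaly, 1 = normal
print("flagged transactions:", transactions[flags == -1][:5])
```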

Keywords: artificial intelligence, accounting automation, financial analysis, fraud detection, machine learning in finance

Procedia PDF Downloads 48
21941 The Influences of Accountants’ Potential Performance on Their Working Process: Government Savings Bank, Northeast, Thailand

Authors: Prateep Wajeetongratana

Abstract:

The purpose of this research was to study the influence of accountants' potential performance on their working process, in a case study of the Government Savings Bank in the northeast of Thailand. The independent variables included accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for the Government Savings Bank were selected by random sampling, and a questionnaire was used as the data collection tool. The statistical analyses included percentage, mean, and multiple regression analysis. The findings revealed that the majority of the accountants were female, aged between 35 and 40 years old; most of the respondents had an undergraduate degree and ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics and accounting attitude were rated at a high level. The findings from the regression analysis revealed a causal relationship in which the observed variables could explain at least 51 percent of the success of the accountants' working process.

Keywords: influence, potential performance, success, working process

Procedia PDF Downloads 213
21940 Investigation of Public Perception of Air Pollution and Life Quality in Tehran

Authors: Roghayeh Karami, Ahmad Gharaei

Abstract:

Background and objectives: This study was undertaken at four different sites (north polluted, south polluted, south healthy and north healthy) in Tehran, in order to examine whether there was a relationship between publicly available air quality data and the public's perception of air quality, and to suggest some guidelines for reducing air pollution. Materials and methods: A total of 200 people, selected by accidental sampling, filled out the research questionnaires at the mentioned sites, and air quality data were obtained simultaneously from the Air Quality Control Department. Data were analyzed in Excel and SPSS. Results: Clean air and a secure job were of great importance to people compared with other pleasant aspects of life, and air pollution and the fear of dangerous diseases were among their most important concerns. Air quality index boards and newspaper services were little used by the public as a means of obtaining information on air pollution. Using public transportation and avoiding unessential journeys were seen as the most important ways of reducing air pollution. Conclusion: The results reveal that the public's perception of air quality is not a reliable indicator of the actual levels of air pollution. Current top-down actions are not effective enough in reducing air pollution; therefore, participatory management and public participation appear to be a suitable guideline.

Keywords: air pollution, quality of life, opinion poll, public participation

Procedia PDF Downloads 475
21939 Analysis Of Magnetic Anomaly Data For Identification Subsurface Structure Geothermal Manifestations Area Candi Umbul, Grabag, Magelang, Central Java Province, Indonesia

Authors: Ikawati Wulandari

Abstract:

Acquisition of geomagnetic field data was carried out at the geothermal manifestation of Candi Umbul, Grabag, Magelang, Central Java Province, on 10-12 May 2013. The purpose of this research is to study the subsurface structural conditions and the structure that controls the hot spring manifestation. The research area has a size of 1.5 km x 2 km with a measurement spacing of 150 m. The total magnetic field data, the position, and the north direction were acquired with a Proton Precession Magnetometer (PPM), a Global Positioning System (GPS) receiver, and a geological compass, respectively. The raw data were processed and corrected using the IGRF (International Geomagnetic Reference Field) to obtain the total-field magnetic anomaly. Upward continuation to a height of 100 meters was performed using the Magpick software. The analysis locates the horizontal position of the anomaly-causing body at the hot spring manifestation, stretching along a northeast-southwest direction, which is interpreted as a normal fault. The hot spring manifestation is controlled by this fault, which forms a weak zone where hot water leaks from the underground geothermal reservoir.
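
Upward continuation of a gridded anomaly is conventionally done in the wavenumber domain by multiplying the spectrum with exp(-|k|h); the sketch below illustrates this on a synthetic grid and is not the Magpick implementation used in the study:

```python
# Minimal sketch: upward continuation of a gridded total-field magnetic anomaly by
# multiplying its 2-D spectrum with exp(-|k| h). Grid and anomaly are synthetic.
import numpy as np

def upward_continue(anomaly, dx, dy, h):
    ny, nx = anomaly.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    K = np.sqrt(KX**2 + KY**2)
    spectrum = np.fft.fft2(anomaly)
    return np.real(np.fft.ifft2(spectrum * np.exp(-K * h)))

# synthetic grid at a 150 m spacing (as used in the survey) with a Gaussian anomaly in nT
y, x = np.mgrid[0:2000:150, 0:1500:150]
anomaly = 300.0 * np.exp(-((x - 700) ** 2 + (y - 1000) ** 2) / (2 * 300.0**2))
smoothed = upward_continue(anomaly, dx=150.0, dy=150.0, h=100.0)
print("max anomaly before/after continuation:", anomaly.max(), smoothed.max())
```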

Keywords: PPM, Geothermal, Fault, Grabag

Procedia PDF Downloads 441
21938 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality, and a comprehensive feature space including demographic information, comorbidities, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) score is used as a comparator for mortality prediction. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with ensemble learning further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements on its own. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
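
A minimal sketch of the supervised-learning and AUC-evaluation step (not the study's actual pipeline or its FSM features); the feature matrix and labels below are synthetic stand-ins for the EHR-derived data:

```python
# Minimal sketch: train a gradient-boosting ensemble on a tabular feature matrix and
# report the AUC, the metric used above for comparison with the MELD score.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
X = rng.normal(size=(2322, 20))                       # demographics, labs, comorbidities, ... (synthetic)
logits = X[:, 0] * 0.8 + X[:, 1] * 0.5 + rng.normal(scale=1.0, size=2322)
y = (logits > np.quantile(logits, 0.8)).astype(int)   # ~20% one-year mortality, synthetic labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.3f}")
```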

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 126
21937 Self-Efficacy Perceptions of Pre-Service Art and Music Teachers towards the Use of Information and Communication Technologies

Authors: Agah Tugrul Korucu

Abstract:

Information and communication technologies have become an important part of our daily lives, with significant investments in technology in the 21st century. Individuals who are more willing to design and implement computer-related activities tend to have higher computer self-efficacy, and with the growth of information technology they carry out such activities more successfully. The level of self-efficacy is a significant factor that determines how individuals act in events, situations and difficult processes. It is observed that individuals with a higher perception of computer self-efficacy overcome the problems they encounter in computer use more easily. Therefore, this study aimed to examine the self-efficacy perceptions of pre-service art and music teachers towards the use of information and communication technologies in terms of different variables. The research group consists of 60 pre-service teachers studying in the Art and Music department of Ahmet Keleşoğlu Faculty of Education, Necmettin Erbakan University. The data collection tools were a 'personal information form' developed by the researcher and used to collect demographic data, and 'the perception scale of self-efficacy related to information technology'. The scale is a 5-point Likert-type scale consisting of 27 items. The Kaiser-Meyer-Olkin (KMO) sampling adequacy value was found to be 0.959, and the Cronbach's alpha reliability coefficient of the scale was found to be 0.97. A computer-based statistical software package (SPSS 21.0) was used to analyze the data collected with these tools; descriptive statistics, t-tests and analysis of variance were used as statistical techniques.

Keywords: self-efficacy perceptions, teacher candidate, information and communication technologies, art teacher

Procedia PDF Downloads 314