Search results for: heterogeneous massive data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25154

24734 To Handle Data-Driven Software Development Projects Effectively

Authors: Shahnewaz Khan

Abstract:

Machine learning (ML) techniques are often used in projects for creating data-driven applications. These tasks typically demand additional research and analysis. The proper technique and strategy must be chosen to ensure the success of data-driven projects; otherwise, even with considerable effort, the necessary development may not be achievable. This paper examines the workflow of data-driven software development projects and their implementation process in order to describe how to manage such a project successfully, which will assist in minimizing the added workload.

Keywords: data, data-driven projects, data science, NLP, software project

Procedia PDF Downloads 58
24733 Nuclear Terrorism Decision Making: A Comparative Study of South Asian Nuclear Weapons States

Authors: Muhammad Jawad Hashmi

Abstract:

The idea of nuclear terrorism is as old as nuclear weapons, but global concerns about the likelihood of nuclear terrorism remain uncertain. Post-9/11 trends manifest that terrorists are believers in massive casualties. Innovation in terrorists’ tactics, sophisticated weaponry, vulnerability, theft and smuggling of nuclear/radiological material, and connections between terrorists, the black market and rogue regimes signal the seriousness of upcoming challenges as well as global trends of “terror-transnationalism.” Furthermore, the International Atomic Energy Agency’s database has recorded 2734 incidents of misuse, unauthorized possession, trafficking of nuclear material, etc. Since this data also includes incidents from South Asia, there is every reason to expect that such illicit activities may increase in the future, mainly due to the expansion of the nuclear industry in South Asia. Moreover, due to such mishaps, the region is vulnerable to threats of nuclear terrorism. This is also a reason that the region is in the limelight, along with issues such as rapidly growing nuclear arsenals, nuclear safety and security, terrorism and political instability. With this backdrop, this study aims to investigate the prevailing threats and challenges in South Asia vis-à-vis nuclear safety and security. A comparative analysis of the overall capabilities is conducted to identify areas of cooperation to eliminate the probability of nuclear/radiological terrorism in the region.

Keywords: nuclear terrorism, safety, security, South Asia, India, Pakistan

Procedia PDF Downloads 337
24732 AMBICOM: An Ambient Computing Middleware Architecture for Heterogeneous Environments

Authors: Ekrem Aksoy, Nihat Adar, Selçuk Canbek

Abstract:

Ambient Computing, or Ambient Intelligence (AmI), is an emerging area in computer science that aims to create intelligently connected environments and the Internet of Things. In this paper, we propose a communication middleware architecture for AmI. This middleware architecture addresses problems of communication, networking, and abstraction of applications, although there are other aspects (e.g., HCI and security) within the general AmI framework. Within this middleware architecture, any application developer can address HCI and security issues using the extensibility features of the platform.

Keywords: AmI, ambient computing, middleware, distributed-systems, software-defined networking

Procedia PDF Downloads 260
24731 Remote Sensing and Geographic Information Systems for Identifying Water Catchments Areas in the Northwest Coast of Egypt for Sustainable Agricultural Development

Authors: Mohamed Aboelghar, Ayman Abou Hadid, Usama Albehairy, Asmaa Khater

Abstract:

Sustainable agricultural development of the desert areas of Egypt under the pressure of irrigation water scarcity is a significant national challenge. Existing water harvesting techniques on the northwest coast of Egypt do not ensure the optimal use of rainfall for agricultural purposes. Basin-scale hydrology potentialities were studied to investigate how the available annual rainfall could be used to increase agricultural production. All data related to agricultural production were included in the form of geospatial layers. Thematic classification of Sentinel-2 imagery was carried out to produce the land cover and crop maps following the FAO system of land cover classification. Contour lines and spot height points were used to create a digital elevation model (DEM). The DEM was then used to delineate basins, sub-basins, and water outlet points using the Soil and Water Assessment Tool (Arc SWAT). The main soil units of the study area were identified from Land Master Plan maps. Climatic data were collected from existing official sources. The amounts of precipitation, surface water runoff, and potential and actual evapotranspiration for the years 2004 to 2017 were produced as outputs of Arc SWAT. The land cover map showed that the two tree crops (olive and fig) cover 195.8 km2, while herbaceous crops (barley and wheat) cover 154 km2. The maximum elevation was 250 meters above sea level, while the lowest point was 3 meters below sea level. The study area receives a considerable but variable amount of precipitation; however, the existing water harvesting methods are inadequate to store this water for agricultural purposes.
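
As a minimal illustration of the runoff component that SWAT-type models estimate, the sketch below implements the SCS curve-number relation for a single rainfall event. It is not taken from the paper; the curve number and storm depth are hypothetical values chosen only to show the calculation.

```python
# Illustrative sketch (not from the paper): the SCS curve-number method,
# the event-runoff relation used inside SWAT-type models.
# Units are millimetres; the curve number (CN) is a hypothetical example.

def scs_runoff_mm(precip_mm: float, curve_number: float) -> float:
    """Surface runoff Q from a rainfall event using the SCS-CN relation."""
    s = 25400.0 / curve_number - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s                         # initial abstraction (mm)
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm - ia + s)

# Example: a 40 mm storm over semi-arid rangeland with an assumed CN of 85
print(round(scs_runoff_mm(40.0, 85.0), 1))  # runoff depth in mm
```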

Keywords: water catchments, remote sensing, GIS, sustainable agricultural development

Procedia PDF Downloads 94
24730 A CM-Based Model for 802.11 Networks Security Policies Enforcement

Authors: Karl Mabiala Dondia, Jing Ma

Abstract:

In recent years, networks based on the 802.11 standards have seen prolific deployment. The reason for this massive acceptance of the technology by both home users and corporations is assuredly the "plug-and-play" nature of the technology and the mobility it offers. The lack of physical containment, due to the inherent nature of the wireless medium, makes maintenance very challenging from a security standpoint. This study examines, via continuous monitoring, various predictable threats that 802.11 networks can face, how they are executed, where each attack may be executed, and how to effectively defend against them. The key goal is to identify the key components of an effective wireless security policy.

Keywords: wireless LAN, IEEE 802.11 standards, continuous monitoring, security policy

Procedia PDF Downloads 355
24729 Forced Immigration to Turkey: The Socio-Spatial Impacts of Syrian Immigrants on Turkish Cities

Authors: Tolga Levent

Abstract:

Throughout the past few decades, forced immigration has been a significant problem for many developing countries. Turkey is one of those countries, having experienced many waves of forced immigration in the Republican era. However, the ongoing wave of forced immigration of Syrians, which started with the Syrian Civil War in 2011, is strikingly influential due to its intensity. In six years, approximately 3.4 million Syrians have entered Turkey and shown high levels of spatial concentration in certain cities close to the Syrian border. These concentrations make Syrians and their problems relatively visible, especially in those cities. The problems of Syrians in Turkish cities could be associated with all dimensions of daily life. Within the economic dimension, high rates of Syrian unemployment push them to informal jobs offering very low wages. The financial aid they continuously demand from public authorities triggers anti-Syrian behaviors in local communities. Moreover, their relatively limited capacity for social adaptation increases integration problems within the social dimension day by day. There are even problems related to the public health dimension, such as the reappearance of certain childhood illnesses due to insufficient vaccination of Syrian children. These problems are significant but relatively easy to prevent by using different types of management strategies and structural policies. However, there are other types of problems -urban problems- emerging with the socio-spatial impacts of Syrians on Turkish cities in a very short period of time. There are relatively few studies about these impacts since they are difficult to comprehend. The aim of the study, in this respect, is to understand these rapidly emerging impacts and the urban problems resulting from this massive immigration influx and to discuss new qualities of urban planning for facing them. In the first part, there is a brief historical consideration of forced immigration waves in Turkey. These waves are important for making comparisons with the ongoing immigration wave and understanding its significance. The second part presents quantitative and qualitative analyses of the spatial existence of Syrian immigrants in the city of Mersin, as an example of a city where Syrians are highly concentrated. Using official data from public authorities, quantitative statistical analyses are conducted to detect spatial concentrations of Syrians at the neighborhood level. As methods of qualitative research, observations and in-depth interviews are used to define the socio-spatial impacts of Syrians. The main results show that 'cities in cities' emerge through sharp socio-spatial segregations which change density surfaces, produce unforeseen land-use patterns, result in inadequacies of public services, and create degradation and deterioration of the urban environments occupied by Syrians. All these problems are significant; however, the Turkish planning system does not have the capacity to cope with them. In the final part, there is a discussion about new qualities of urban planning for facing these impacts and urban problems. The main point of discussion is the possibility of resilient urban planning under the conditions of uncertainty and unpredictability fostered by the immigration crisis. Such a resilient planning approach might provide an option for countries aiming to cope with the negative socio-spatial impacts of massive immigration influxes.

Keywords: cities, forced immigration, Syrians, urban planning

Procedia PDF Downloads 232
24728 Impacts of Community Forest on Forest Resources Management and Livelihood Improvement of Local People in Nepal

Authors: Samipraj Mishra

Abstract:

Despite the successful implementation of the community forestry program, a number of pros and cons have been raised regarding community forestry in the lowland region of Nepal, locally called the Terai, which climatically belongs to the tropical humid zone and possesses forests of high ecological and economic quality. The study aims to investigate the local pricing strategy for forest products and its impacts on equitable forest benefit sharing, the collection of community funds, and the implementation of livelihood improvement activities. The study, carried out on six community forests, revealed that local people have benefited substantially from the community forests. However, since the region is heterogeneous in its socio-economic conditions and its forest resources have high economic potential, the low pricing strategy decided on by the local people has created inequality problems in sharing forest benefits, contributed poorly to community fund collection, and consequently supported only limited livelihood improvement activities. The paper argues that a low pricing strategy for forest products is counter-productive for promoting equitable benefit sharing in areas with heterogeneous socio-economic conditions and high-value forests. The low pricing strategy has been increasing the access of better-off households at a higher rate than that of the poor, as such households always have greater affording capacity. It is also ineffective for increasing community funds and carrying out livelihood improvement activities effectively. The study concluded that a unilateral decentralized forest policy and decision-making autonomy for local people seem questionable unless their decision-making capacities are sufficiently enriched. Therefore, it is recommended that empowering the decision-making capacity of local people and their respective institutions, together with policy and program formulation, is a prerequisite for efficient and equitable community forest management and its long-term sustainability.

Keywords: community forest, livelihood, socio-economy, pricing system, Nepal

Procedia PDF Downloads 255
24727 Top-Down Approach for Fabricating Hematite Nanowire Arrays

Authors: Seungmin Shin, Jin-Baek Kim

Abstract:

Hematite (α-Fe2O3) has very good semiconducting properties, with a band gap of 2.1 eV, and is antiferromagnetic. Due to its electrochemical stability, low toxicity, wide abundance, and low cost, hematite is a particularly attractive material for photoelectrochemical cells. Additionally, hematite has found applications in gas sensing, field emission, heterogeneous catalysis, and lithium-ion battery electrodes. Here, we report a new universal top-down method for the synthesis of one-dimensional hematite nanowire arrays. Hematite nanowires of various shapes and lengths have been easily fabricated over large areas by sequential processes. The obtained hematite nanowire arrays are promising candidates as photoanodes in photoelectrochemical solar cells.

Keywords: hematite, lithography, nanowire, top-down process

Procedia PDF Downloads 226
24726 The Impact of Animal-Assisted Pedagogy on Social Participation in Heterogeneous Classrooms: A Survey Considering the Pupils' Perspective on Animal-Assisted Teaching

Authors: Mona Maria Mombeck

Abstract:

Social participation in heterogeneous classrooms is one of the main goals in inclusive education. Children with special educational needs (SEN) and children with learning difficulties or behavioural problems not diagnosed as SEN are more likely than others to be excluded by other children. It is proven that the presence of dogs, as well as contact with dogs, increases the likelihood of positive social behaviour between humans. Therefore, animal-assisted pedagogy may be presumed to be a constructive approach to inclusive teaching and to facing the challenges of social inclusion in school classes. This study investigates the presence of a friendly dog in heterogeneous groups of pupils in order to evaluate the influence of dogs on facets of children's social participation in school. 30 German pupils, aged 10 to 14, in four classes were questioned about their social participation before and after they were educated for a year in school with animal-assisted pedagogy, using the problem-concerned interview method. In addition, the post-interview included some general questions about the putative differences or similarities between being educated with and without a dog. The interviews were analysed with qualitative content analysis using QDA software. The results showed that a dog has a positive impact on the atmosphere, student relationships, and well-being in class. Regarding the atmosphere, the pupils mainly argued that the improvement was caused by taking the dog's well-being into account, respecting the dog-related rules, and by emotional self-regulation. It can be supposed that children regard the rules concerning the dog as more relevant to them than rules not concerning the dog, even if they require the same behaviour and have the same goal. Furthermore, a dog has a positive impact on emotional self-regulation and, therefore, on pupils' behaviour in class and the atmosphere. In terms of the statements about relationships, the dog's presence was mainly seen to provide both a unifying aim and a uniting topic to talk about. The improved well-being was described as a feeling of joy and peace of mind. Moreover, the teacher was evaluated as more friendly and trustworthy after animal-assisted teaching. Nevertheless, animal-assisted pedagogy can, in rare cases, cause problems as well, such as jealousy, distraction, or concerns about the well-being of the dog. The study demonstrates the relevance of animal-assisted pedagogy for facing the challenges of social participation in inclusive education.

Keywords: animal-assisted-pedagogy, inclusive education, human-animal-interactions, social participation

Procedia PDF Downloads 97
24725 Evaluating Urban Land Expansion Using Geographic Information System and Remote Sensing in Kabul City, Afghanistan

Authors: Ahmad Sharif Ahmadi, Yoshitaka Kajita

Abstract:

With massive population expansion and rapid economic development in the last decade, urban land has increasingly expanded and formed extensive informal development territory in Kabul city. This paper investigates integrated urbanization trends in Kabul city since the formation of the basic structure of the present city, using GIS and remote sensing. The study explores the spatial and temporal differences in urban land expansion and land use categories between two time intervals, 1964-1978 and 1978-2008. Furthermore, the goal of this paper is to understand the extent of urban land expansion and the factors driving it in Kabul city. Many factors, such as population expansion, the return of refugees from neighboring countries, and significant economic growth of the city, affected urban land expansion. Across the whole study area, the urban land expansion rate, population expansion rate, and economic growth rate were compared to analyze the relationship of these driving forces with urban land expansion. Based on urban land change data detected by interpreting land use maps, it was found that across the entire study area the urban territory expanded by 14 times between 1964 and 2008.
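
A minimal sketch of the kind of change detection described here is given below: comparing two classified land-use rasters to quantify built-up expansion between two dates. The grids, class code, and cell size are made-up examples, not the study's data.

```python
# Minimal sketch (assumed data, not the authors' workflow): comparing two
# classified land-use rasters to quantify urban expansion between two dates.
import numpy as np

URBAN = 1  # hypothetical class code for built-up land

def urban_area_km2(landuse: np.ndarray, cell_size_m: float) -> float:
    """Total built-up area in a classified land-use grid."""
    return np.count_nonzero(landuse == URBAN) * (cell_size_m ** 2) / 1e6

# Toy 1964 and 2008 land-use grids (0 = non-urban, 1 = urban), 30 m cells
lu_1964 = np.array([[0, 0, 1], [0, 1, 1], [0, 0, 0]])
lu_2008 = np.array([[1, 1, 1], [1, 1, 1], [0, 1, 1]])

a0, a1 = urban_area_km2(lu_1964, 30), urban_area_km2(lu_2008, 30)
print(f"expansion factor: {a1 / a0:.1f}x")  # ratio of urban area between dates
```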

Keywords: GIS, Kabul city, land use, urban land expansion, urbanization

Procedia PDF Downloads 313
24724 Physics-Based Earthquake Source Models for Seismic Engineering: Analysis and Validation for Dip-Slip Faults

Authors: Percy Galvez, Anatoly Petukhin, Paul Somerville, Ken Miyakoshi, Kojiro Irikura, Daniel Peter

Abstract:

Physics-based dynamic rupture modelling is necessary for estimating parameters such as rupture velocity and slip rate function that are important for ground motion simulation but poorly resolved by observations, e.g. by seismic source inversion. In order to generate a large number of physically self-consistent rupture models, whose rupture process is consistent with the spatio-temporal heterogeneity of past earthquakes, we use multicycle simulations under the heterogeneous rate-and-state (RS) friction law for a 45° dip-slip fault. We performed a parametrization study by fully dynamic rupture modeling, and then a set of spontaneous source models was generated in a large magnitude range (Mw > 7.0). In order to validate the rupture models, we compare the scaling relations of the modeled rupture area S, the average slip Dave and the slip asperity area Sa versus seismic moment Mo with similar scaling relations from source inversions. Ground motions were also computed from our models. Their peak ground velocities (PGV) agree well with the GMPE values. We obtained good agreement of the permanent surface offset values with empirical relations. From the heterogeneous rupture models, we analyzed parameters which are critical for ground motion simulations, i.e. distributions of slip, slip rate, rupture initiation points, rupture velocities, and source time functions. We studied cross-correlations between them and with the friction weakening distance Dc, the only initial heterogeneity parameter in our modeling. The main findings are: (1) high slip-rate areas coincide with or are located on an outer edge of the large slip areas, (2) ruptures have a tendency to initiate in small Dc areas, and (3) high slip-rate areas correlate with areas of small Dc, large rupture velocity and short rise-time.
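
For readers unfamiliar with the RS friction law mentioned above, the sketch below shows its common Dieterich-Ruina form with the ageing-law state evolution; the parameter values are illustrative placeholders and are not the values used in the paper's simulations.

```python
# Illustrative sketch of rate-and-state (RS) friction in its common
# Dieterich-Ruina form with the ageing law; all parameter values are hypothetical.
import math

def rs_friction(v, theta, mu0=0.6, a=0.010, b=0.014, v0=1e-6, dc=0.02):
    """Friction coefficient mu = mu0 + a ln(V/V0) + b ln(V0 * theta / Dc)."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / dc)

def theta_dot(v, theta, dc=0.02):
    """Ageing law for the state variable: d(theta)/dt = 1 - V * theta / Dc."""
    return 1.0 - v * theta / dc

# At steady state (theta = Dc / V) friction depends only on (a - b):
# velocity weakening, hence unstable slip, when a < b.
v = 1e-3                      # slip rate, m/s
theta_ss = 0.02 / v           # steady-state state variable, s
print(rs_friction(v, theta_ss))
```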

Keywords: earthquake dynamics, strong ground motion prediction, seismic engineering, source characterization

Procedia PDF Downloads 126
24723 The Relationship Between Artificial Intelligence, Data Science, and Privacy

Authors: M. Naidoo

Abstract:

Artificial intelligence often requires large amounts of good quality data. Within important fields, such as healthcare, the training of AI systems predominantly relies on health and personal data; however, the usage of this data is complicated by various layers of law and ethics that seek to protect individuals’ privacy rights. This research seeks to establish the challenges AI and data science pose to (i) informational rights, (ii) privacy rights, and (iii) data protection. To solve some of the issues presented, various methods are suggested, such as embedding values in technological development, proper balancing of rights and interests, and others.

Keywords: artificial intelligence, data science, law, policy

Procedia PDF Downloads 89
24722 Neighbour Cell List Reduction in Multi-Tier Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz

Abstract:

The ongoing call or data session must be maintained to ensure a good quality of service. This can be accomplished by performing the handover procedure while the user is on the move. However, the dense deployment of small cells in 5G networks is a challenging issue due to the extensive number of handovers. In this paper, a neighbour cell list method is proposed to reduce the number of target small cells and hence minimize the number of handovers. The neighbour cell list is built by omitting cells that could cause an unnecessary handover or handover failure because of the short time of stay of the user in these cells. A multi-attribute decision making technique, simple additive weighting, is then applied to the optimized neighbour cell list. A multi-tier small cell network is considered in this work. The performance of the proposed method is analysed and compared with that of existing methods. Results show that our method decreases the candidate small cell list, unnecessary handovers, handover failures, and short-time-of-stay cells compared to the competing method.
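
A minimal sketch of simple additive weighting (SAW) applied to a pruned neighbour cell list is shown below. The attributes, weights, and candidate values are hypothetical and only illustrate the ranking step, not the paper's specific attribute set.

```python
# Minimal sketch (hypothetical values) of simple additive weighting (SAW):
# each candidate small cell is scored from normalised attributes and the
# highest-scoring cell is chosen as the handover target.
import numpy as np

# rows = candidate small cells, columns = [RSRP (dBm), available bandwidth (MHz),
# expected time of stay (s)] -- all treated here as benefit-type attributes
candidates = np.array([
    [-95.0, 20.0, 8.0],
    [-90.0, 10.0, 3.0],
    [-100.0, 40.0, 12.0],
])
weights = np.array([0.4, 0.3, 0.3])   # assumed attribute weights (sum to 1)

# min-max normalisation of each attribute column to [0, 1]
norm = (candidates - candidates.min(axis=0)) / np.ptp(candidates, axis=0)
scores = norm @ weights
print("selected cell index:", int(scores.argmax()))
```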

Keywords: handover, HetNets, multi-attribute decision making, small cells

Procedia PDF Downloads 91
24721 Increasing Number of NGOs and Their Conduct: A Case Study of Far Western Region of Nepal

Authors: Raju Thapa

Abstract:

Non-Governmental Organizations (NGOs) conduct activities in Nepal with the overall objective of strengthening peace, progress, and prosperity in society. Based on the research objectives, this study has tried to trace the reasons behind the massive growth of NGOs and the trends that have shaped the handling and functioning of NGOs in the Kailali district. The outcomes of this research are quite embarrassing for NGO officials. Based on the findings of this research, NGOs are expected to review their guiding principles, integrity, and conduct for the betterment of society.

Keywords: NGO, trends, increasing, conduct, integrity, guiding principle, legal, governance, human resources, public trust, financial, collaboration, networking

Procedia PDF Downloads 383
24720 A Local Tensor Clustering Algorithm to Annotate Uncharacterized Genes with Many Biological Networks

Authors: Paul Shize Li, Frank Alber

Abstract:

A fundamental task of clinical genomics is to unravel the functions of genes and their associations with disorders. Although experimental biology has made efforts to discover and elucidate the molecular mechanisms of individual genes over the past decades, still about 40% of human genes have unknown functions, not to mention the diseases they may be related to. For those biologists who are interested in a particular gene with unknown functions, a powerful computational method tailored for inferring the functions and disease relevance of uncharacterized genes is strongly needed. Studies have shown that genes strongly linked to each other in multiple biological networks are more likely to have similar functions. This indicates that densely connected subgraphs in multiple biological networks are useful for the functional and phenotypic annotation of uncharacterized genes. Therefore, in this work, we have developed an integrative network approach to identify frequent local clusters, defined as those densely connected subgraphs that frequently occur in multiple biological networks and contain the query gene that has few or no disease or function annotations. This is a local clustering algorithm that models multiple biological networks sharing the same gene set as a three-dimensional matrix, the so-called tensor, and employs a tensor-based optimization method to efficiently find the frequent local clusters. Specifically, massive public gene expression data sets that comprehensively cover dynamic, physiological, and environmental conditions are used to generate hundreds of gene co-expression networks. By integrating these gene co-expression networks, for a given uncharacterized gene that is of interest to a biologist, the proposed method can be applied to identify the frequent local clusters that contain this uncharacterized gene. Finally, those frequent local clusters are used for the function and disease annotation of this uncharacterized gene. This local tensor clustering algorithm outperformed the competing tensor-based algorithm in both module discovery and running time. We also demonstrated the use of the proposed method on real data comprising hundreds of gene co-expression networks and showed that it can comprehensively characterize the query gene. Therefore, this study provides a new tool for annotating uncharacterized genes and has great potential to assist clinical genomic diagnostics.
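
The toy sketch below only illustrates the data layout described above: several networks over a shared gene set stacked into a 3-D tensor, with a naive frequency count around a query gene standing in for the paper's tensor-based optimisation. All data, thresholds, and the query index are synthetic.

```python
# Toy sketch (synthetic data): stacking co-expression networks that share a gene
# set into a 3-D tensor and counting how often each gene is a neighbour of a
# query gene across networks; the real method replaces this count with a
# tensor-based optimisation.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_networks, query = 50, 5, 7

# adjacency matrices of the co-expression networks: shape (genes, genes, networks)
tensor = (rng.random((n_genes, n_genes, n_networks)) > 0.9).astype(int)
tensor = np.maximum(tensor, tensor.transpose(1, 0, 2))   # symmetrise each slice

# frequency with which each gene is linked to the query gene across networks
freq = tensor[query].sum(axis=1) / n_networks
frequent_neighbours = np.flatnonzero(freq >= 0.6)
print("candidate cluster around the query gene:", frequent_neighbours)
```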

Keywords: local tensor clustering, query gene, gene co-expression network, gene annotation

Procedia PDF Downloads 112
24719 Leveraging Advanced Technologies and Data to Eliminate Abandoned, Lost, or Otherwise Discarded Fishing Gear and Derelict Fishing Gear

Authors: Grant Bifolchi

Abstract:

As global environmental problems continue to have highly adverse effects, finding long-term, sustainable solutions to combat ecological distress is of growing paramount concern. Ghost Gear—also known as abandoned, lost or otherwise discarded fishing gear (ALDFG) and derelict fishing gear (DFG)—represents one of the greatest threats to the world’s oceans, posing a significant hazard to human health, livelihoods, and global food security. In fact, according to the UN Food and Agriculture Organization (FAO), abandoned, lost and discarded fishing gear represents approximately 10% of marine debris by volume. Around the world, many governments, governmental and non-profit organizations are doing their best to manage the reporting and retrieval of nets, lines, ropes, traps, floats and more from their respective bodies of water. However, these organizations must also manage files and documents about the environmental problem, which further complicates matters. In Ghost Gear monitoring and management, organizations face additional complexities. Whether it’s data ingest, industry regulations and standards, garnering actionable insights into the location, security, and management of data, or the application of enforcement due to disparate data—all of these factors are placing massive strains on organizations struggling to save the planet from the dangers of Ghost Gear. In this 90-minute educational session, globally recognized Ghost Gear technology expert Grant Bifolchi CET, BBA, BCom, will provide real-world insight into how governments currently manage Ghost Gear and the technology that can accelerate success in combatting ALDFG and DFG. In this session, attendees will learn how to: • Identify specific technologies to solve the ingest and management of Ghost Gear data categories, including type, geo-location, size, ownership, regional assignment, collection and disposal. • Provide enhanced access to authorities, fisheries, independent fishing vessels, individuals, etc., while securely controlling confidential and privileged data to globally recognized standards. • Create and maintain processing accuracy to effectively track ALDFG/DFG reporting progress—including acknowledging receipt of the report and sharing it with all pertinent stakeholders to ensure approvals are secured. • Enable and utilize Business Intelligence (BI) and Analytics to store and analyze data to optimize organizational performance, maintain anytime-visibility of report status, user accountability, scheduling and management, and foster governmental transparency. • Maintain Compliance Reporting through highly defined, detailed and automated reports—enabling all stakeholders to share critical insights with internal colleagues, regulatory agencies, and national and international partners.

Keywords: ghost gear, ALDFG, DFG, abandoned, lost or otherwise discarded fishing gear, data, technology

Procedia PDF Downloads 78
24718 The 2017 Summer Campaign for Night Sky Brightness Measurements on the Tuscan Coast

Authors: Andrea Giacomelli, Luciano Massetti, Elena Maggi, Antonio Raschi

Abstract:

The presentation will report the activities managed during the summer of 2017 by a team composed of staff from a University Department, a National Research Council Institute, and an outreach NGO, collecting measurements of night sky brightness and other information on artificial lighting, in order to characterize light pollution issues on portions of the Tuscan coast, in Central Italy. These activities combine measurements collected by the principal scientists, citizen science observations led by students, and outreach events targeting a broad audience. This campaign aggregates the efforts of three actors: the BuioMetria Partecipativa project, which started collecting light pollution data on a national scale in 2008 with an environmental engineering and free/open source GIS core team; the Institute of Biometeorology of the National Research Council, with ongoing studies on light and urban vegetation and a consolidated track record in environmental education and citizen science; and the Department of Biology of the University of Pisa, which started experiments to assess the impact of light pollution in coastal environments in 2015. While the core of the activities concerns in situ data, the campaign will also account for remote sensing data, thus considering heterogeneous data sources. The aim of the campaign is twofold: (1) to test actions of citizen and student engagement in monitoring sky brightness, and (2) to collect night sky brightness data and test a protocol for applications to studies on the ecological impact of light pollution, with a special focus on marine coastal ecosystems. The collaboration of an interdisciplinary team in the study of artificial lighting issues is not a common case in Italy, and the possibility of undertaking the campaign in Tuscany has the added value of operating in one of the territories where it is possible to observe both sites with extremely high lighting levels and areas with extremely low light pollution, especially in the southern part of the region. Combining environmental monitoring and communication actions in the context of the campaign, this effort will contribute to the promotion of good-quality night skies as an important asset for the sustainability of coastal ecosystems, as well as to increased citizen awareness through star gazing, night photography and active participation in field campaign measurements.

Keywords: citizen science, light pollution, marine coastal biodiversity, environmental education

Procedia PDF Downloads 155
24717 Design-Based Elements to Sustain Participant Activity in Massive Open Online Courses: A Case Study

Authors: C. Zimmermann, E. Lackner, M. Ebner

Abstract:

Massive Open Online Courses (MOOCs) are increasingly popular learning hubs that boast considerable participant numbers, innovative technical features, and a multitude of instructional resources. Still, there is strong evidence that almost all MOOCs suffer from declining participant activity and fairly low completion rates. In this paper, we share the lessons learned in implementing several design patterns that have been suggested in order to foster participant activity. Our conclusions are based on experiences with the 'Dr. Internet' MOOC, which was created as an xMOOC to raise awareness of a more critical approach to online health information: participants had to diagnose medical case studies. There is a growing body of recommendations (based on Learning Analytics results from earlier xMOOCs) as to how the decline in participant activity can be alleviated. One promising focus in this regard is instructional design patterns, since they have a tremendous influence on the learner's motivation, which in turn is a crucial trigger of learning processes. Since the storytelling of the Medieval Age, micro-learning units and comprehensible narrative structures have been used to encourage an audience to follow a narration. Hence, MOOC participants are not likely to abandon a course or information channel when their curiosity is kept at a continuously high level. Critical aspects that warrant consideration in this regard include shorter course duration, a narrative structure with suspense peaks (following the 'storytelling' approach), and a course schedule that is diversified and stimulating, yet easy to follow. All of these criteria were observed in the design of the Dr. Internet MOOC: 1) the standard eight-week course duration was shortened to six weeks, 2) all six case studies had a special quiz format and a corresponding resolution video which was made available in the subsequent week, 3) two out of six case studies were split into serial video sequences presented over the span of two weeks, and 4) the videos were generally scheduled in a less predictable sequence. However, the statistical results from the first run of the MOOC do not indicate any strong influence on the retention rate, so we conclude with some suggestions as to why this might be and what aspects need further consideration.

Keywords: case study, Dr. internet, experience, MOOCs, design patterns

Procedia PDF Downloads 240
24716 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Forensic DNA analysis has received much attention over the last three decades due to its incredible usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that is not masked by an allele of a potential contributor and is considered an artefact presumed to arise due to miscopying or slippage during the PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, which is calculated relative to the corresponding parent allele height. Analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient and reliable interpretations. Therefore, a sound methodology for distinguishing between stutters and real alleles is essential for the accuracy of the interpretation. Sensibly, any such method has to be able to focus on modelling stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools in clustering and classification of data and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is reflected by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult even though the calculations are relatively simple. Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of choosing the number of components. The Chinese restaurant process (CRP), the stick-breaking process and the Pólya urn scheme are frequently used as Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling the stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
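
For readers unfamiliar with the CRP mentioned above, the sketch below draws cluster assignments from a Chinese restaurant process, the classic sequential view of a Dirichlet process prior. The concentration parameter and number of draws are illustrative, not values from the study.

```python
# Minimal sketch of the Chinese restaurant process (CRP) used as a Dirichlet
# process prior over cluster assignments; alpha and the number of observations
# are illustrative only.
import random

def crp(n_customers: int, alpha: float, seed: int = 1) -> list:
    """Return a table (cluster) index for each customer."""
    random.seed(seed)
    tables = []            # number of customers seated at each table
    assignments = []
    for _ in range(n_customers):
        # open a new table with probability alpha / (n + alpha);
        # otherwise join an existing table proportionally to its occupancy
        weights = tables + [alpha]
        k = random.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):
            tables.append(1)
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments

print(crp(20, alpha=1.0))   # e.g. a random partition of 20 stutter-ratio observations
```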

Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter

Procedia PDF Downloads 307
24715 A Machine Learning Framework Based on Biometric Measurements for Automatic Fetal Head Anomalies Diagnosis in Ultrasound Images

Authors: Hanene Sahli, Aymen Mouelhi, Marwa Hajji, Amine Ben Slama, Mounir Sayadi, Farhat Fnaiech, Radhwane Rachdi

Abstract:

Fetal abnormality is still a public health problem of concern for both mother and baby. Head defects are among the highest-risk fetal deformities. Fetal head categorization is a sensitive task that needs close attention from neurological experts. In this sense, biometric measurements can be extracted by gynecologists and compared with ground truth charts to identify normal or abnormal growth. The fetal head biometric measurements, such as the biparietal diameter (BPD), occipito-frontal diameter (OFD) and head circumference (HC), need to be monitored, and an expert should carry out their manual delineation. This work proposes a new approach to automatically compute BPD, OFD and HC based on morphological characteristics extracted from the head shape. The studied data, selected at the same gestational age (GA) from fetal ultrasound (US) images, are classified into two categories: normal and abnormal. The abnormal subjects include hydrocephalus, microcephaly and dolichocephaly anomalies. Using a support vector machine (SVM) method, this study achieved high classification accuracy for the automated detection of anomalies. The proposed method is promising and does not require expert intervention.
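
The sketch below mirrors the type of classifier the abstract reports, training an SVM on the three biometric features; the measurements are synthetic placeholders, not the study's data, and the feature distributions are assumed purely for illustration.

```python
# Hedged sketch (synthetic measurements, not the study's data): classifying
# fetal head cases as normal/abnormal from BPD, OFD and HC with a support
# vector machine.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# columns: BPD, OFD, HC (mm) at a fixed gestational age; labels: 0 normal, 1 abnormal
normal = rng.normal([88, 105, 310], [3, 4, 10], size=(100, 3))
abnormal = rng.normal([70, 118, 270], [6, 6, 15], size=(100, 3))
X = np.vstack([normal, abnormal])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```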

Keywords: biometric measurements, fetal head malformations, machine learning methods, US images

Procedia PDF Downloads 268
24714 The Mediating Role of Artificial Intelligence (AI) Driven Customer Experience in the Relationship Between AI Voice Assistants and Brand Usage Continuance

Authors: George Cudjoe Agbemabiese, John Paul Kosiba, Michael Boadi Nyamekye, Vanessa Narkie Tetteh, Caleb Nunoo, Mohammed Muniru Husseini

Abstract:

The smartphone industry continues to experience massive growth, evidenced by expanding markets and an increasing number of brands, models and manufacturers. As technology advances rapidly, smartphone manufacturers are consistently introducing new innovations to keep up with the latest evolving industry trends and customer demand for more modern devices. This study aimed to assess the influence of artificial intelligence (AI) voice assistants (VAs) on improving customer experience, resulting in the continued use of mobile brands. Specifically, this article assesses the role of the hedonic, utilitarian, and social benefits provided by AI VAs on customer experience and the continuance intention to use mobile phone brands. Using a primary data collection instrument, a quantitative approach was adopted to examine the study's variables. Data from 348 valid responses were analysed using structural equation modeling (SEM) with AMOS version 23. Three main factors were identified as influencing customer experience, which in turn drives the continued usage of mobile phone brands: social benefits, hedonic benefits, and utilitarian benefits. In conclusion, a significant and positive relationship exists between the factors influencing customer experience and the continued usage of mobile phone brands. The study concludes that mobile brands that invest in delivering positive user experiences are in a better position to improve usage and increase preference for their brands. The study recommends that mobile brands consider and research their prospects' and customers' social, hedonic, and utilitarian needs to provide them with the desired products and experiences.

Keywords: artificial intelligence, continuance usage, customer experience, smartphone industry

Procedia PDF Downloads 57
24713 Full Characterization of Heterogeneous Antibody Samples under Denaturing and Native Conditions on a Hybrid Quadrupole-Orbitrap Mass Spectrometer

Authors: Rowan Moore, Kai Scheffler, Eugen Damoc, Jennifer Sutton, Aaron Bailey, Stephane Houel, Simon Cubbon, Jonathan Josephs

Abstract:

Purpose: MS analysis of monoclonal antibodies (mAbs) at the protein and peptide levels is critical during development and production of biopharmaceuticals. The compositions of current generation therapeutic proteins are often complex due to various modifications which may affect efficacy. Intact proteins analyzed by MS are detected in higher charge states that also provide more complexity in mass spectra. Protein analysis in native or native-like conditions, with zero or minimal organic solvent and neutral or weakly acidic pH, decreases the charge state values, resulting in mAb detection at higher m/z ranges with more spatial resolution. Methods: Three commercially available mAbs were used for all experiments. Intact proteins were desalted online using size exclusion chromatography (SEC) or reversed phase chromatography coupled on-line with a mass spectrometer. For streamlined use of the LC-MS platform, we used a single SEC column and alternately selected specific mobile phases to perform separations in either denaturing or native-like conditions: buffer A (20 % ACN, 0.1 % FA) with buffer B (100 mM ammonium acetate). For peptide analysis, mAbs were proteolytically digested with and without prior reduction and alkylation. The mass spectrometer used for all experiments was a commercially available Thermo Scientific™ hybrid Quadrupole-Orbitrap™ mass spectrometer, equipped with the new BioPharma option, which includes a new High Mass Range (HMR) mode that allows for improved high mass transmission and mass detection up to 8000 m/z. Results: We have analyzed the profiles of three mAbs under reducing and native conditions by direct infusion with offline desalting and with on-line desalting via size exclusion and reversed phase type columns. The presence of high salt under denaturing conditions was found to influence the observed charge state envelope and impact mass accuracy after spectral deconvolution. The significantly lower charge states observed under native conditions improve the spatial resolution of protein signals and have significant benefits for the analysis of antibody mixtures, e.g. lysine variants, degradants or sequence variants. This type of analysis requires the detection of masses beyond the standard mass range, up to 6000 m/z, requiring the extended capabilities available in the new HMR mode. We have compared each antibody sample that was analyzed individually with mixtures in various relative concentrations. For this type of analysis, we observed that apparent native structures persist and that ESI benefits from the addition of low amounts of acetonitrile and formic acid in combination with the ammonium acetate-buffered mobile phase. For analyses at the peptide level, we analyzed reduced/alkylated and non-reduced proteolytic digests of the individual antibodies separated via reversed phase chromatography, aiming to retrieve as much information as possible regarding sequence coverage, disulfide bridges, post-translational modifications such as various glycans, sequence variants, and their relative quantification. All acquired data were submitted to a single software package for analysis, aiming to obtain a complete picture of the molecules analyzed. Here we demonstrate the capabilities of the mass spectrometer to fully characterize homogeneous and heterogeneous therapeutic proteins on one single platform.
Conclusion: Full characterization of heterogeneous intact protein mixtures by improved mass separation on a quadrupole-Orbitrap™ mass spectrometer with extended capabilities has been demonstrated.

Keywords: disulfide bond analysis, intact analysis, native analysis, mass spectrometry, monoclonal antibodies, peptide mapping, post-translational modifications, sequence variants, size exclusion chromatography, therapeutic protein analysis, UHPLC

Procedia PDF Downloads 345
24712 Outcome of Patients Undergoing Hemicraniectomy for Malignant Middle Cerebral Artery Infarction: A 5 Year Retrospective Study at Perpetual Succour Hospital, Cebu City, Philippines

Authors: Adelson G. Guillarte, M. D., Noel J. Belonguel, Jarungchai Anton S. Vatanagul

Abstract:

Patients with malignant middle cerebral artery (MCA) infarction (with massive brain swelling and herniation) have been reported to have a mortality rate of 80% even with appropriate conservative medical therapy. European trials (DECIMAL, DESTINY I and II, HAMLET) showed significant improvement in mortality and functional outcome with hemicraniectomy. There are no known published local studies in the region; thus, a local study is vital. This is a single-center, retrospective, descriptive, cross-sectional chart review study which includes patients ≥18 years old with malignant MCA infarction who underwent hemicraniectomy, and those who were given conservative medical therapy alone, from January 2008 to December 2012 at Perpetual Succour Hospital. Excluded were patients whose charts had insufficient data, prior MCA stroke, concomitant intracerebral hemorrhage, or other serious medical conditions or terminal illnesses. A minimum sample of 32 patients was needed. Data were presented as means, standard deviations, frequencies and percentage distributions. The Mann-Whitney U test and chi-square test were used. P-values less than the 0.05 alpha level were considered statistically significant. A total of 672 stroke patients were admitted, of whom 34 met the inclusion criteria: 9 underwent hemicraniectomy and 25 were treated with conservative medical therapy alone. Although the difference was not statistically significant (64% vs 33%, p=0.112), more patients were noted to improve in the conservative treatment group. Meanwhile, the hemicraniectomy group had a higher percentage of mortality (67%) (p=0.112). There was a decreasing trend in the average NIHSS score in both groups from admission to post-operative day 7 (p=0.198, p=0.78). A bigger multicenter prospective study is recommended to control the inherent biases and limitations of a retrospective and smaller study.
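
A minimal sketch of the two tests named in the abstract is given below, applied to invented counts and scores purely to show how such comparisons are computed; none of the numbers are the chart-review data.

```python
# Illustrative sketch (made-up numbers, not the study's data): the chi-square
# test on group outcomes and the Mann-Whitney U test on NIHSS scores.
from scipy.stats import chi2_contingency, mannwhitneyu

# mortality vs. survival in the two treatment groups (hypothetical counts)
table = [[6, 3],     # hemicraniectomy: died, survived
         [9, 16]]    # conservative:    died, survived
chi2, p_chi, dof, _ = chi2_contingency(table)

# admission NIHSS scores in the two groups (hypothetical values)
nihss_hemi = [22, 25, 19, 24, 21, 23, 26, 20, 22]
nihss_cons = [18, 21, 17, 20, 19, 23, 16, 22, 18, 20]
u_stat, p_u = mannwhitneyu(nihss_hemi, nihss_cons)

print(f"chi-square p = {p_chi:.3f}, Mann-Whitney p = {p_u:.3f}")
```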

Keywords: cerebral infarct, hemicraniectomy, ischemic stroke, malignant middle cerebral artery (MCA) infarct

Procedia PDF Downloads 300
24711 Algorithms used in Spatial Data Mining GIS

Authors: Vahid Bairami Rad

Abstract:

Extracting knowledge from spatial data such as GIS data is important for reducing the data and extracting information. Therefore, the development of new techniques and tools that support humans in transforming data into useful knowledge has been the focus of the relatively new and interdisciplinary research area of knowledge discovery in databases. Thus, we introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages. Similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and will also make them more portable. We introduce a database-oriented framework for spatial data mining based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives in a commercial DBMS are presented.
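
The sketch below illustrates the flavour of such primitives: a neighbourhood graph built from a distance-based neighbour relation, with a `neighbours` operation as one basic primitive. The coordinates and threshold are hypothetical, and the actual primitive set in this line of work is richer than shown here.

```python
# Minimal sketch of a neighbourhood-graph primitive for spatial data mining:
# objects are linked when a distance-based neighbour relation holds.
import math

points = {"a": (0, 0), "b": (1, 0.5), "c": (5, 5), "d": (1.5, 1.0)}
MAX_DIST = 2.0   # neighbour relation: Euclidean distance below this threshold

graph = {p: set() for p in points}
for p, cp in points.items():
    for q, cq in points.items():
        if p != q and math.dist(cp, cq) <= MAX_DIST:
            graph[p].add(q)

def neighbours(g: dict, node: str) -> set:
    """Primitive: direct neighbours of a spatial object in the graph."""
    return g[node]

print(neighbours(graph, "a"))   # objects reachable from 'a' in one neighbourhood step
```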

Keywords: spatial data base, knowledge discovery database, data mining, spatial relationship, predictive data mining

Procedia PDF Downloads 435
24710 Destination Management Organization in the Digital Era: A Data Framework to Leverage Collective Intelligence

Authors: Alfredo Fortunato, Carmelofrancesco Origlia, Sara Laurita, Rossella Nicoletti

Abstract:

In the post-pandemic recovery phase of tourism, the role of a Destination Management Organization (DMO) as a coordinated management system of all the elements that make up a destination (attractions, access, marketing, human resources, brand, pricing, etc.) is also becoming relevant for local territories. The objective of a DMO is to maximize the visitor's perception of value and quality while ensuring the competitiveness and sustainability of the destination, as well as the long-term preservation of its natural and cultural assets, and to catalyze benefits for the local economy and residents. In carrying out the multiple functions to which it is called, the DMO can leverage a collective intelligence that comes from the ability to pool information, explicit and tacit knowledge, and the relationships of the various stakeholders: policymakers, public managers and officials, entrepreneurs in the tourism supply chain, researchers, data journalists, schools, associations and committees, citizens, etc. The DMO potentially has at its disposal large volumes of data, many of them available at low cost, which need to be properly processed to produce value. Based on these assumptions, the paper presents a conceptual framework for building an information system to support the DMO in the intelligent management of a tourist destination, tested in an area of southern Italy. The approach adopted is data-informed and consists of four phases: (1) formulation of the knowledge problem (analysis of policy documents and industry reports; focus groups and co-design with stakeholders; definition of information needs and key questions); (2) research and metadata annotation of relevant sources (reconnaissance of official sources, administrative archives and internal DMO sources); (3) gap analysis and identification of unconventional information sources (evaluation of traditional sources with respect to the level of consistency with information needs, the freshness of information and granularity of data; enrichment of the information base by identifying and studying web sources such as Wikipedia, Google Trends, Booking.com, Tripadvisor, websites of accommodation facilities and online newspapers); (4) definition of the set of indicators and construction of the information base (specific definition of indicators and procedures for data acquisition, transformation, and analysis). The derived framework consists of 6 thematic areas (accommodation supply, cultural heritage, flows, value, sustainability, and enabling factors), each of which is divided into three domains, each gathering a specific information need represented by a scheme of questions to be answered through the analysis of available indicators. The framework is characterized by a high degree of flexibility in the European context, given that it can be customized for each destination by adapting the part related to internal sources. Application to the case study led to the creation of a decision support system that allows: • integration of data from heterogeneous sources, including through the execution of automated web crawling procedures for data ingestion of social and web information; • reading and interpretation of data and metadata through guided navigation paths framed as digital storytelling; • implementation of complex analysis capabilities through the use of data mining algorithms, such as those for the prediction of tourist flows.

Keywords: collective intelligence, data framework, destination management, smart tourism

Procedia PDF Downloads 101
24709 Data Stream Association Rule Mining with Cloud Computing

Authors: B. Suraj Aravind, M. H. M. Krishna Prasad

Abstract:

There exist emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click stream analysis, sensor data, data from satellites, etc. Data streams typically arrive continuously, at high speed, in huge amounts and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations. For this, the concept of cloud computing is used. Its inclusion may lead to additional, as yet unknown problems which need further research.
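
The sketch below shows the core bookkeeping behind frequent itemset discovery over a stream: counts maintained over a sliding window of recent transactions, which a cloud deployment could shard across workers. The transactions, window size, and support threshold are toy values, and the paper's actual algorithm is not reproduced here.

```python
# Minimal sketch (toy transactions): frequent 1- and 2-itemset counts over a
# sliding window of a transaction stream; thresholds are illustrative only.
from collections import Counter, deque
from itertools import combinations

WINDOW, MIN_SUPPORT = 4, 2
window = deque(maxlen=WINDOW)          # most recent transactions
counts = Counter()                     # itemset -> frequency within the window

def update(transaction: frozenset) -> None:
    if len(window) == WINDOW:          # evict counts of the oldest transaction
        old = window[0]
        for r in (1, 2):
            counts.subtract(frozenset(c) for c in combinations(sorted(old), r))
    window.append(transaction)         # deque drops the oldest element itself
    for r in (1, 2):                   # count 1- and 2-itemsets only, for brevity
        counts.update(frozenset(c) for c in combinations(sorted(transaction), r))

for t in [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}, {"a", "b"}]:
    update(frozenset(t))

frequent = {k: v for k, v in counts.items() if v >= MIN_SUPPORT}
print(frequent)
```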

Keywords: data stream, association rule mining, cloud computing, frequent itemsets

Procedia PDF Downloads 478
24708 Rural Tourism in Essaouira in Morocco: From the Appropriation of Space to the Sustainability of Exploitation

Authors: Hadach Mohamed

Abstract:

In Essaouira, tourism is the main economic activity, and the destination has a place in the segment of rural and sustainable tourism. The hinterland of the destination has highly attractive natural and tourist potential, but the natives still appropriate the territory and are faced with the dilemma of appropriation versus tourist exploitation. This article analyzes the determinants of the appropriation of a rural tourist space in light of massive touristification and the need to set up income-generating activities for the inhabitants. After a review of the literature, a survey was carried out among the main actors of tourism in the destination to evaluate the question of the appropriation of the tourist space and the sustainability of the destination.

Keywords: rural tourism, sustainability, appropriation, tourism destination

Procedia PDF Downloads 87
24707 Use of Cloud-Based Virtual Classroom in Connectivism Learning Process to Enhance Information Literacy and Self-Efficacy for Undergraduate Students

Authors: Kulachai Kultawanich, Prakob Koraneekij, Jaitip Na-Songkhla

Abstract:

The way of learning has changed into a new paradigm since the improvement of network and communication technology, so learners have to interact with a massive amount of information. Thus, information literacy has become a critical set of abilities required by every college and university in the world. Connectivism is considered to be an alternative way to design an information literacy course in an online learning environment, such as a Virtual Classroom (VC). With the change of learning pedagogy, VCs are employed to improve social capability by integrating cloud-based technology. This paper aims to study the use of a Cloud-Based Virtual Classroom (CBVC) in a Connectivism learning process to enhance the information literacy and self-efficacy of twenty-one undergraduate students who registered in an e-publishing course at Chulalongkorn University. The data were gathered during the 6 weeks of the study using the following instruments: (1) an information literacy test, (2) information literacy rubrics, (3) Information Literacy Self-Efficacy (ILSE) scales and (4) a questionnaire. The results indicated that students had posttest information literacy and self-efficacy mean scores higher than their pretest mean scores at the .05 level of significance after using the CBVC in the Connectivism learning process. Additionally, the study identified that the Connectivism learning process proved useful for developing an information-rich environment and a sense of community, and the CBVC proved useful for developing social connection.
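
As an aside on the pre/post comparison reported above, the sketch below shows one common way to test whether posttest scores exceed pretest scores for the same students, a paired t-test; the abstract does not name the exact test used, and all scores here are invented.

```python
# Hedged sketch (synthetic scores): a paired pre/post comparison for 21 students.
from scipy.stats import ttest_rel

pretest  = [12, 15, 10, 14, 11, 13, 16, 12, 10, 15, 14, 13, 12, 11, 16, 13, 12, 14, 10, 15, 13]
posttest = [15, 18, 14, 17, 13, 16, 19, 15, 14, 18, 16, 15, 15, 14, 19, 16, 15, 17, 13, 18, 16]

t_stat, p_two_sided = ttest_rel(posttest, pretest)
p_one_sided = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print(f"t = {t_stat:.2f}, one-sided p = {p_one_sided:.4f}")  # significant at .05 if p < .05
```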

Keywords: cloud-based, virtual classroom, connectivism, information literacy

Procedia PDF Downloads 436
24706 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of his or her data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques while at the same time highlighting their weaknesses and providing a new technique as a solution to an existing privacy-preserving data mining technique. This paper also bridges the wide gap between data mining and the web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 140
24705 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop Framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that broaden the availability of network access. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
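
As a minimal illustration of how BD sets are commonly processed on the Hadoop framework, the sketch below shows a word-count style mapper and reducer in the Hadoop Streaming spirit; the demo input is invented and is not drawn from the Romanian applications discussed in the paper.

```python
# Minimal MapReduce-style sketch (Hadoop Streaming spirit, demo data only):
# the mapper emits (key, 1) pairs and the reducer sums counts per key.
from itertools import groupby

def mapper(lines):
    """Emit (token, 1) pairs, e.g. one pair per record category in a public data set."""
    for line in lines:
        for token in line.strip().split():
            yield token, 1

def reducer(pairs):
    """Sum counts per key; Hadoop delivers pairs grouped and sorted by key."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield key, sum(v for _, v in group)

if __name__ == "__main__":
    demo = ["tax tax permit", "permit tax"]
    print(dict(reducer(mapper(demo))))   # {'permit': 2, 'tax': 3}
```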

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 288