Search results for: data security
25196 Importance of New Policies of Process Management for Internet of Things Based on Forensic Investigation
Authors: Venkata Venugopal Rao Gudlur
Abstract:
The proposed policies, referred to as standard operating procedures (SOPs), for Internet of Things (IoT) based forensic investigation of process management represent the latest revolution in saving time and delivering quick solutions for investigators. The forensic investigation process has developed over many years and has supplied the required information, but without policies governing the investigation process. This research shows that current IoT-based forensic investigation of process management is increasingly connected to devices, and that all future development of real-time information gathering and monitoring will evolve with smart sensor-based technologies connected directly to the IoT. This paper presents a conceptual framework for process management. Smart devices are leading the way in automated forensic models and frameworks established by different scholars. These models and frameworks have mostly focused on offering a roadmap for performing forensic operations with no policies in place. The proposed policies would bring a tremendous benefit to process management and to IoT forensic investigators. A policy-governed forensic investigation process can enhance security and reduce data losses and vulnerabilities.
Keywords: Internet of Things, Process Management, Forensic Investigation, M2M Framework
Procedia PDF Downloads 102
25195 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain
Authors: Zachary Blanks, Solomon Sonya
Abstract:
Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near intractable task of adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable by the user, to help learn how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area) interact with each other, in hopes of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well when compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set, and we apply these methods to improve the accuracy of the adopted prediction models (logistic regression, support vector machine, etc.). Finally, we assess the performance of the models and the accuracy of our data imputation methods by training on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching.
This research introduces ensemble methods (Random Forests and Stochastic Gradient Boosting) and applies them to real-world poaching data gathered from the Ugandan rain forest park rangers. Next, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable with a large number of missing observations. Third, we provide an alternate approach to predicting the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using Stochastic Gradient Boosting to predict observations of non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month instead of by entire season, boosting techniques produce a mean area under the curve increase of approximately 3% relative to previous prediction schedules over entire seasons.
Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection
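The predictive mean matching step described above can be sketched in a few lines; the single-predictor linear model, variable names, and donor-pool size are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def predictive_mean_matching(x_obs, y_obs, x_mis, k=3, seed=0):
    """Impute missing y values by predictive mean matching (PMM).

    Illustrative sketch: a single-predictor least-squares model stands in
    for the paper's per-variable imputation models.
    """
    rng = np.random.default_rng(seed)
    # Fit y ~ a*x + b on the fully observed rows only.
    a, b = np.polyfit(x_obs, y_obs, deg=1)
    pred_obs = a * np.asarray(x_obs) + b
    pred_mis = a * np.asarray(x_mis) + b
    imputed = []
    for p in pred_mis:
        # Donor pool: the k observed rows whose *predictions* are closest.
        donors = np.argsort(np.abs(pred_obs - p))[:k]
        # Impute a real observed value, not the model prediction itself,
        # so the imputed column keeps a plausible empirical distribution.
        imputed.append(y_obs[rng.choice(donors)])
    return np.array(imputed)
```

PMM donates an actually observed value rather than the regression prediction, which is why it preserves the empirical distribution of the imputed variable better than plain regression imputation.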
Procedia PDF Downloads 292
25194 The Learning Loops in the Public Realm Project in South Verona: Air Quality and Noise Pollution Participatory Data Collection towards Co-Design, Planning and Construction of Mitigation Measures in Urban Areas
Authors: Massimiliano Condotta, Giovanni Borga, Chiara Scanagatta
Abstract:
Urban systems are places where the various actors involved interact and come into conflict, particularly over topics such as traffic congestion and security. Air and noise pollution, because of their strong complexity, are also frequent subjects of discussion and often of clash. For air pollution, the complexity stems from the fact that atmospheric pollution is due to many factors, but above all from the difficulty of observing and measuring the amount of pollution in a transparent, mobile, and ethereal element like air. The condition perceived by the inhabitants often does not coincide with the real conditions, because it is conditioned, sometimes positively and sometimes negatively, by many other factors such as the presence or absence of natural elements like trees or rivers. The same problems arise with noise pollution, which receives even less consideration as an issue although it is just as problematic as air quality. Starting from these opposite positions, it is difficult to identify and implement valid, and at the same time shared, mitigation solutions for the problem of urban air and noise pollution. The LOOPER (Learning Loops in the Public Realm) project described in this paper builds and tests a methodology and a platform for a participatory co-design, planning, and construction process inside a learning loop. The most relevant novelties of this approach are three. The first is that citizen participation starts from the identification of problems and the air quality analysis, through participatory data collection, and continues through all process steps (design and construction). The second is that the methodology is characterized by a learning loop process.
After the first cycle of (1) problem identification, (2) planning and definition of the design solutions, and (3) construction and implementation of the mitigation measures, the effectiveness of the implemented solutions is measured and verified through a new participatory data collection campaign. In this way, it is possible to understand whether the policies and design solutions had a positive impact on the territory. As a result of the learning produced by the first loop, it will be possible to improve the design of the mitigation measures and start the second loop with new and more effective ones. The third relevant aspect is that citizen participation is carried out via Urban Living Labs that involve all stakeholders of the city (citizens, public administrators, associations of all urban stakeholders, etc.) and that last for the entire cycle of the design, planning, and construction process. The paper describes in detail the LOOPER methodology and the technical solutions adopted for the participatory data collection and for the design and construction phases.
Keywords: air quality, co-design, learning loops, noise pollution, urban living labs
Procedia PDF Downloads 365
25193 ‘The Guilt Complex’: Assessing the Guilt of Youth Returning From Terrorist Groups in the Narratives of Justice: Presentation on the Methodological Opportunities and Concerns in Operational Research
Authors: Arpita Mitra
Abstract:
The research explores the concept of ‘guilt’ as understood in relation to children and young individuals associated with terrorist groups who are exiting these groups and returning to civilian lives (‘young returnees’). The study examines young returnees’ guilt in its psychological, legal, and sociological manifestations, and how it contributes to experiences of reintegration and justice administration. Streamlining it further, the research question on assessing guilt engages with young adults, between 18 and 30 years, who were part of a terrorist organization during their formative years and have returned to civilian life. Overall, the findings of the research are intended to contribute first-hand operational research to criminological literature as well as to transitional justice mechanisms with regard to narratives on truth, justice, reparations, and institutional reform/guarantees of non-recurrence. This paper focuses on one aspect of the research: the added value of conducting operational research, and the methodological challenges encountered during this process with regard to informed consent, data protection, mental health, and security considerations for the respondents and the researcher.
Keywords: terrorism, reintegration, young returnees, criminology
Procedia PDF Downloads 59
25192 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection
Authors: Hongyu Chen, Li Jiang
Abstract:
Ubiquitous anomalies constantly endanger the security of our systems. They may bring irreversible damage to the system and cause leakage of privacy. Thus, it is of vital importance to detect these anomalies promptly. Traditional supervised methods such as decision trees and the support vector machine (SVM) are used to classify normality and abnormality. However, in some cases the abnormal status is far rarer than the normal status, which biases the decisions of these methods. The generative adversarial network (GAN) has been proposed to handle this case. With its strong generative ability, it only needs to learn the distribution of normal status, and it identifies abnormal status through the gap between a sample and the learned distribution. Nevertheless, existing GAN-based models are not suitable for processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete data sets and remarkably reduces the overhead.
Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers
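The detection rule the abstract describes, scoring a sample by its gap from the learned normal distribution, can be sketched in an AnoGAN-style form; the toy generator and discriminator callables and the weighting `lam` are assumptions, not DISGAN's actual architecture or loss:

```python
import numpy as np

def anomaly_score(x, generator, discriminator, z_candidates, lam=0.1):
    """Score a sample by how well the learned 'normal' distribution can
    reproduce it. generator and discriminator are assumed already trained;
    here they are plain callables so the scoring logic runs without a
    deep learning framework."""
    # Residual loss: best reconstruction of x over candidate latent codes.
    recon = np.array([np.abs(x - generator(z)).sum() for z in z_candidates])
    best = int(recon.argmin())
    residual = float(recon[best])
    # Discrimination loss: how "fake" the discriminator finds the best match.
    disc = 1.0 - float(discriminator(generator(z_candidates[best])))
    return (1.0 - lam) * residual + lam * disc
```

Samples the generator can reconstruct well (normal traffic) get a low score; anything far from the learned distribution scores high and is flagged.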
Procedia PDF Downloads 129
25191 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm
Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan
Abstract:
This study presents a new parallel approach to clustering GPS data; the evaluation is made by comparing the execution time of various clustering algorithms on GPS data. The paper proposes a neighborhood-based parallel K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS record represents a vehicle, and lets vehicles close to each other communicate once the vehicles are clustered. The approach has been examined on continuously changing GPS data sets of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrate that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data
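A minimal sketch of the data-parallel idea behind such an algorithm: the costly assignment step is split across workers while the centroid update stays serial. Python threads stand in for the paper's vehicle-neighborhood parallelization, and the chunking scheme is an assumption:

```python
from concurrent.futures import ThreadPoolExecutor
import math

def nearest(point, centroids):
    # index of the centroid closest to this 2-D point
    return min(range(len(centroids)),
               key=lambda k: (point[0] - centroids[k][0]) ** 2
                           + (point[1] - centroids[k][1]) ** 2)

def parallel_kmeans(points, centroids, n_iter=10, n_workers=4):
    centroids = list(centroids)
    labels = []
    for _ in range(n_iter):
        # Parallel phase: split the assignment step across workers.
        size = math.ceil(len(points) / n_workers)
        chunks = [points[i:i + size] for i in range(0, len(points), size)]
        with ThreadPoolExecutor(max_workers=n_workers) as pool:
            parts = list(pool.map(
                lambda c: [nearest(p, centroids) for p in c], chunks))
        labels = [lab for part in parts for lab in part]
        # Serial phase: recompute each centroid as the mean of its members.
        for k in range(len(centroids)):
            members = [p for p, lab in zip(points, labels) if lab == k]
            if members:
                centroids[k] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return centroids, labels
```

The assignment step dominates the cost of K-means, so it is the natural phase to parallelize; in a real deployment each worker would hold the GPS records of one vehicle neighborhood.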
Procedia PDF Downloads 222
25190 Socio-Economic Child’s Wellbeing Impasse in South Africa: Towards a Theory-Based Solution Model
Authors: Paulin Mbecke
Abstract:
Research Issue: Under economic constraints, the socio-economic conditions of households worsen, relegating child wellbeing to the bottom of many governments’ and households’ priority lists. In such situations, many governments fail to rebalance priorities in providing services such as education, housing, and social security, which are prerequisites for the wellbeing of children. Consequently, many households struggle to meet basic needs, especially those of children. Although economic conditions play a crucial role in creating prosperity or poverty in households, and therefore wellbeing or misery for children, they are not the sole cause. Research Insights: The review of the South African Index of Multiple Deprivation and the South African Child Gauge establishes the extent to which economic conditions impact the wellbeing or misery of children. The analysis of social, cultural, environmental, and structural theories demonstrates that non-economic factors contribute equally to the wellbeing or misery of children, yet they are disregarded. In addition, the assessment of a child abuse database shows a weak correlation between economic factors (prosperity or poverty) and child wellbeing or misery. Theoretical Implications: Through critical social research theory and modelling, the paper proposes a theory-based model that combines different factors to facilitate the understanding of child wellbeing or misery. Policy Implications: The proposed model assists broad policy- and decision-making and review processes in promoting child wellbeing and in preventing, intervening in, and managing child misery with regard to education, housing, and social security.
Keywords: children, child’s misery, child’s wellbeing, household’s despair, household’s prosperity
Procedia PDF Downloads 284
25189 Reflecting Socio-Political Needs in Education Policy-Making: An Exploratory Study of Vietnam's Key Education Reforms (1945-2017)
Authors: Linh Tong
Abstract:
This paper aims to contribute to the understanding of the key education reforms in Vietnam from 1945 to 2017, which reflect the evolution of the socio-political needs of the Socialist Republic of Vietnam throughout this period. It explores the contextual conditions, motivations, and ambitions influencing the formation of the education reforms in Vietnam. It also looks, from an applied practical perspective, at the influence of politics on education policy-making. The research methodology includes a content analysis of curriculum designs proposed by the Ministry of Education and Training and of relevant resolutions and executive orders passed by the National Assembly and the Prime Minister, as well as interviews with experts and key stakeholders. The results point to a particular configuration of factors that have inspired the shape and substance of these reforms and have most certainly influenced their implementation. This configuration evolves from the immediate needs to erase illiteracy and cultivate a socialist economic model at the beginning of Vietnam’s independence in 1945-1975, to a renewed urge to adopt a market-oriented economy in 1986 and to communicate cautiously with the outside world until the 2000s, and currently to a demonstrated desire to integrate fully into the global economy and tackle rising concerns about national security (the South China Sea dispute), environmental sustainability, the construction of a knowledge economy, and a rule-of-law society. Overall, the paper attempts to map Vietnam’s socio-political needs onto the changing sets of goals and expected outcomes in teaching and learning methodologies and practices as introduced in Vietnam's key education reforms.
Keywords: curriculum development, knowledge society, national security, politics of education policy-making, Vietnam's education reforms
Procedia PDF Downloads 152
25188 Cognitive Science Based Scheduling in Grid Environment
Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya
Abstract:
A grid is an infrastructure that allows large distributed data sets from multiple locations to be deployed toward a common goal. Scheduling data-intensive applications becomes challenging as the data sets grow huge. Only two solutions exist to tackle this challenging issue: either the computation that requires huge data sets is transferred to the data site, or the required data sets are transferred to the computation site. In the former scenario, the computation cannot be transferred, since the servers are storage/data servers with little or no computational capability; hence, the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires considerable network bandwidth. To mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities, and current research is mainly focused on incorporating it into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing requests and creating the knowledge base. Depending upon the link capacity, a decision is taken whether to transfer the data sets or to partition them. The agents predict the next request so that the requesting site can be served with data sets in advance, reducing both data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist the decision-making process.
Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence
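The cognitive engine's transfer-versus-partition choice might be sketched as a simple rule; the bandwidth formula, deadline, and replica count are illustrative assumptions, since the abstract does not give the agents' actual decision logic:

```python
def plan_data_delivery(dataset_gb, link_mbps, deadline_s, n_replicas=4):
    """Toy version of the cognitive engine's choice: estimate how long the
    whole data set takes over the current link; if it beats the deadline,
    ship it whole, otherwise partition it across replica sites fetched in
    parallel. All parameters and the formula are illustrative."""
    transfer_s = dataset_gb * 8000.0 / link_mbps  # GB -> megabits, then / Mbps
    if transfer_s <= deadline_s:
        return "transfer_whole", transfer_s
    # Partitioned delivery: each replica ships 1/n of the set concurrently.
    return "partition", transfer_s / n_replicas
```

A real engine would also weigh the knowledge base (past requests, replica placement) before choosing, which is what the agents' prediction step contributes.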
Procedia PDF Downloads 394
25187 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia
Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera
Abstract:
With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet are characterized by dynamic change, with new data appearing all the time. Recent years have seen the generation of large volumes of data from sources such as forums and blogs, which have expanded over time and space; together they constitute large-scale Internet data, known as Big Data. These data of technological origin, derived from the use of devices and the activity of multiple users, are becoming a source of great importance for the study of geography and tourist behavior. The study focuses on cultural heritage tourism practices in the context of Big Data, exploring the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, destination image, and perceptions in user-generated content are studied through the analysis of data from Weibo, the largest blogging social network in China. Through the analysis of the behavior of heritage tourists in the Big Data environment, this study seeks to understand the practices (activities, motivations, perceptions) of cultural tourists, and hence the needs and preferences of tourists, in order to better guide the sustainable development of tourism at heritage sites.
Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior
Procedia PDF Downloads 138
25186 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption
Authors: Darusalam, Jorish Hulstijn, Marijn Janssen
Abstract:
The media have revealed a variety of corruption cases in regional and local governments all over the world, and many governments have pursued anti-corruption reforms and created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government: the public sector is required to report the results of the programs it has implemented, so that citizens can judge whether the institution has worked economically, efficiently, and effectively. Open Data offers solutions for the implementation of good governance in organizations that want to be more transparent; in addition, Open Data can create transparency and accountability toward the community. The objective of this paper is to build a framework for using Open Data for accountability in combating corruption. The paper investigates the relationship between Open Data and accountability as part of anti-corruption initiatives, and the impact of Open Data implementation on public organizations.
Keywords: open data, accountability, anti-corruption, framework
Procedia PDF Downloads 337
25185 Effectiveness of Climate Smart Agriculture in Managing Field Stresses in Robusta Coffee
Authors: Andrew Kirabira
Abstract:
This study investigates the effectiveness of climate-smart agriculture (CSA) technologies in improving productivity by managing biotic and abiotic stresses in the coffee agroecological zones of Uganda, with the aim of enhancing farmer livelihoods. The study was initiated in response to the crop's decreasing productivity in Uganda, caused by the increasing prevalence of pests, diseases, and abiotic stresses. Despite nine years of farmers applying CSA, productivity has stagnated at 700-800 kg/ha/yr, only 26% of the 3-5 t/ha/yr that CSA is capable of delivering if properly applied. This has depressed the incomes of the 10.6 million people along the crop's value chain, and in turn the country's national income. In the 2019/20 financial year, for example, Uganda suffered a deficit of $40m from the increasing incidence of just one pest, BCTB. Such trends cripple the realization of SDGs 1 and 13, the eradication of poverty and the mitigation of climate change, respectively. In probing CSA's effectiveness in curbing this trend, the study is guided by the objectives of determining farmers' existing knowledge and perceptions of CSA in the diverse coffee agroecological zones of Uganda; examining the relationship between the use of CSA and the prevalence of selected coffee pests, diseases, and abiotic stresses; ascertaining the differences in market organization and pricing between conventionally and CSA-produced coffee; and analyzing the prevailing policy environment concerning the use of CSA in coffee production. The research design is descriptive, collecting data from farmers and agricultural extension workers in the districts of Ntungamo, Iganga, and Luweero, each representing a distinct coffee agroecological zone.
Policy custodian officers at the districts, at cooperatives, and at the crop's overseeing national authority were also interviewed.
Keywords: climate change, food security, field stresses, productivity
Procedia PDF Downloads 57
25184 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI
Authors: Brennan Lodge
Abstract:
Amidst a complex regulatory landscape, Retrieval Augmented Generation (RAG) emerges as a transformative tool for Governance, Risk and Compliance (GRC) officers. This paper details the application of RAG in synthesizing Large Language Models (LLMs) with external knowledge bases, offering GRC professionals an advanced means to adapt to rapid changes in compliance requirements. While the development of standalone LLMs is exciting, such models have their downsides: they cannot easily expand or revise their memory, they cannot straightforwardly provide insight into their predictions, and they may produce “hallucinations.” Leveraging a pre-trained seq2seq transformer and a dense vector index of domain-specific data, this approach integrates real-time data retrieval into the generative process, enabling gap analysis and the dynamic generation of compliance and risk management content. We delve into the mechanics of RAG, focusing on its dual structure that pairs parametric knowledge contained within the transformer model with non-parametric data extracted from an updatable corpus. This hybrid model enhances decision-making through context-rich insights drawn from the most current and relevant information, thereby enabling GRC officers to maintain a proactive compliance stance. Our methodology aligns with the latest advances in neural network fine-tuning, providing a granular, token-level application of retrieved information to inform and generate compliance narratives. By employing RAG, we exhibit a scalable solution that can adapt to novel regulatory challenges and cybersecurity threats, offering GRC officers a robust, predictive tool that augments their expertise. The granular application of RAG’s dual structure not only improves compliance and risk management protocols but also informs the development of compliance narratives with pinpoint accuracy.
It underscores AI’s emerging role in strategic risk mitigation and proactive policy formation, positioning GRC officers to anticipate and navigate the complexities of regulatory evolution confidently.
Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies
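The dual parametric/non-parametric structure can be sketched end to end in miniature; the bigram-hashing embedder stands in for a trained dense encoder, and the prompt format is an assumption:

```python
import numpy as np

def embed(text, dim=128):
    """Stand-in embedder: hash character bigrams into a fixed-size vector.
    A real RAG stack would use a trained encoder; this only illustrates
    the dense-retrieval mechanics."""
    v = np.zeros(dim)
    for a, b in zip(text, text[1:]):
        v[hash(a + b) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def retrieve(query, corpus, k=2):
    """Non-parametric half: rank corpus passages by cosine similarity."""
    q = embed(query)
    scores = [(float(embed(doc) @ q), doc) for doc in corpus]
    return [doc for _, doc in sorted(scores, reverse=True)[:k]]

def build_prompt(query, corpus, k=2):
    """Prepend the retrieved passages so the (parametric) generator
    conditions on the current regulatory text, not just its weights."""
    context = "\n".join(retrieve(query, corpus, k))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

Because only the corpus behind `retrieve` changes when a regulation is updated, the knowledge base can be refreshed without retraining the generator, which is the property the abstract emphasizes.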
Procedia PDF Downloads 95
25183 Assessing Denitrification-Disintegration Model’s Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial considering escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, consistent with the intricate features of Morocco's agricultural environment. To verify this hypothesis, a large body of field data was gathered, covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties. These experimental data sets serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. In conclusion, the prospective research findings add to the global conversation on climate-resilient agricultural practices while promoting sustainable agricultural models in Morocco. The prospective recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy architects and agricultural actors to make informed decisions that advance not only food security but also environmental stability.
Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 65
25182 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance detects or predicts disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance is becoming more and more popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza and the three United Arab Emirates cities of Abu Dhabi, Al Ain, and Dubai are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted for supervised classification, and an N-fold cross-validation confusion matrix is reported as the simulation result, with an overall system recall of 85.595%.
Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet Allocation (LDA), influenza
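The N-fold cross-validated confusion matrix behind a recall figure like 85.595% can be reproduced generically; the interleaved fold scheme and the pluggable `train_fn` interface are assumptions, and the LDA engine itself is not reproduced here:

```python
def kfold_confusion(data, labels, train_fn, n_folds=5):
    """N-fold cross-validated confusion matrix for a binary classifier.

    train_fn(train_x, train_y) must return a predict(x) callable, so any
    classifier (an LDA-based one included) can be plugged in."""
    cm = {"tp": 0, "fp": 0, "tn": 0, "fn": 0}
    for i in range(n_folds):
        test_idx = set(range(i, len(data), n_folds))  # interleaved folds
        train_x = [x for j, x in enumerate(data) if j not in test_idx]
        train_y = [y for j, y in enumerate(labels) if j not in test_idx]
        predict = train_fn(train_x, train_y)
        for j in sorted(test_idx):
            pred, true = predict(data[j]), labels[j]
            if pred:
                cm["tp" if true else "fp"] += 1
            else:
                cm["fn" if true else "tn"] += 1
    total_pos = cm["tp"] + cm["fn"]
    recall = cm["tp"] / total_pos if total_pos else 0.0
    return cm, recall
```

Every record is tested exactly once across the folds, so the aggregated matrix uses all the labeled data while never scoring a model on its own training records.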
Procedia PDF Downloads 116
25181 Developing Indoor Enhanced Bio Composite Vertical Smart Farming System for Climbing Food Plant
Authors: S. Mokhtar, R. Ibrahim, K. Abdan, A. Rashidi
Abstract:
The world population is growing at a very fast rate. Urban growth and development are expected to raise serious questions of food production and processing, transport, and consumption. Future smart green city policies are emerging to support new ways of visualizing, organizing, and managing the city and its flows, toward developing more sustainable cities that ensure food security while maintaining biodiversity. This survey paper analyzes the feasibility of developing a smart vertical farming system for climbing food plants, using an alternative green material, to meet food consumption needs in urban cities. The paper documents our investigation of the specific requirements for farming high-value climbing food plants suitable for vertical farming, the development of an appropriate biocomposite material composition, and design recommendations for a new smart vertical farming system inside urban buildings. Results include the determination of suitable climbing food plant species and of manufacturing processes for natural-fiber-reinforced biocomposite material. The results are intended as recommendations for developing alternative structural materials for climbing food plants, and ultimately for the development of the future smart vertical farming system. This paper contributes to supporting urban farming in cities and promotes green materials for preserving the environment, hence supporting the food security agenda, especially for developing nations.
Keywords: biocomposite, natural reinforce fiber, smart farming, vertical farming
Procedia PDF Downloads 165
25180 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, attaching the user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in urban areas. First, the study analyzes the spread of people within particular areas of the city using Twitter data. Second, existing places are matched and categorized based on the individuals visiting them. The Twitter tracking results are then combined with questionnaire data to capture the Twitter users' profiles, using distribution frequency analysis to learn the visitor percentages. To validate the hypothesis, the results are compared with the local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show a correlation between Twitter geolocation and questionnaire data; thus, integrating Twitter data and survey data can reveal the profile of the social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
Procedia PDF Downloads 31425179 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining
Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser
Abstract:
Coronary Artery Disease (CAD) is one major cause of disability in adults and one main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and support vector machines (SVM) were used to analyze CAD data. Data on 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input or predictor variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for evaluating and predicting CAD patients as compared to non-CAD ones. The application of data mining techniques to coronary artery disease is a good method for investigating the existing relationships between variables.Keywords: classification, coronary artery disease, data mining, knowledge discovery, extract
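The three comparison metrics used above are simple ratios over a 2x2 confusion matrix. A minimal sketch; the confusion counts are invented for illustration, not the study's actual results:

```python
def clf_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),        # true positive rate
        "specificity": tn / (tn + fp),        # true negative rate
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Illustrative held-out results for two classifiers on CAD vs. non-CAD patients.
svm = clf_metrics(tp=420, fp=60, tn=440, fn=80)
tree = clf_metrics(tp=380, fp=90, tn=410, fn=120)
```

Comparing dictionaries like `svm` and `tree` metric by metric is exactly the kind of head-to-head evaluation the abstract reports.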
Procedia PDF Downloads 65725178 Sensor Data Analysis for a Large Mining Major
Authors: Sudipto Shanker Dasgupta
Abstract:
One of the largest mining companies wanted to look at health analytics for their driverless trucks. These trucks were the key to their supply chain logistics. The automated trucks had multi-level sub-assemblies which would send out sensor information. The use case was to capture the sensor signals from the truck subcomponents and analyze the health of the trucks from a repair and replacement purview. Open source software was used to stream the data into a clustered Hadoop setup in the Amazon Web Services cloud, and Apache Spark SQL was used to analyze the data. All of this was achieved on a 10-node Amazon setup with 32 cores and 64 GB RAM; real-time analytics was achieved on 300 million records. To check the scalability of the system, the cluster was increased to a 100-node setup. This talk will highlight how open source software was used to achieve the above use case, with insights on the high data throughput of a cloud setup.Keywords: streaming analytics, data science, big data, Hadoop, high throughput, sensor data
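The core repair-and-replacement analysis is a group-and-aggregate over sensor records. A plain-Python sketch of that logic (in the deployment described, this would run as a Spark SQL GROUP BY over the Hadoop-streamed data; the readings and thresholds here are invented):

```python
from statistics import mean

# Toy sensor readings from truck subcomponents: (truck, subcomponent, temperature).
readings = [
    ("T1", "engine", 92.0), ("T1", "engine", 97.5), ("T1", "brake", 61.0),
    ("T2", "engine", 88.0), ("T2", "brake", 79.0), ("T2", "brake", 81.5),
]

# Hypothetical per-subcomponent repair thresholds.
LIMITS = {"engine": 95.0, "brake": 75.0}

def flag_for_repair(rows):
    """Average readings per (truck, subcomponent) and flag those over the limit."""
    groups = {}
    for truck, part, value in rows:
        groups.setdefault((truck, part), []).append(value)
    return sorted(key for key, vals in groups.items()
                  if mean(vals) > LIMITS[key[1]])

flags = flag_for_repair(readings)
```

In Spark SQL the same computation would be expressed as an aggregate query over the streamed table rather than an in-memory dictionary.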
Procedia PDF Downloads 40425177 Formation of Self Help Groups (SHGs) Protected Human Rights and Ensured Human Security of Female Sex Workers at Brothel in Bangladesh
Authors: Md. Nurul Alom Siddikqe
Abstract:
The purpose of this intervention was to describe how marginalized people, brothel-based sex workers in 6 cities, protect their rights and increase their self-dignity and self-esteem; these women are victims of trafficking who came from different peripheral areas of Bangladesh. The sex workers are tortured by pimps, clients, Mashis (so-called guardians of bonded sex workers) and Babus (so-called husbands), and are highly discriminated against, vulnerable and stigmatized due to their occupation, movement, behavior and activities, which meet social disapproval. Stigma, discrimination and violation of human rights not only bar them from accessing legal services, education for their kids, health care, movement outside the brothel, and even funerals after death, but also make them inaccessible due to their invisibility. An assessment was conducted among the brothel-based sex workers to gauge their knowledge of human rights and to document harassment and violence in their community. They were inspired to unite and assisted in forming self-help groups (SHGs). The capacity of the SHGs was developed and the leadership of their members strengthened through trainings in administration, financial management, public speaking and resource mobilization. A strategy was developed to enhance the capacity of the SHGs so that they can collectively claim their rights, and to build strategic partnerships and networks with the relevant service providers for restoring all sorts of rights. Meetings were conducted with stakeholders including duty bearers, civil society organizations, media people and local government representatives. Networks were developed with the human rights commission, local elites and religious leaders, and human rights watch committees were formed at the community level. Rallies were organized and national and international days observed along with government counterparts.
By utilizing the project resources, the members of the SHGs became capable of raising their collective voices against violence, discrimination and stigma, as well as protecting themselves from insecurity. The members of the SHGs have been participating in social programs/events, and the SHGs gained membership of the district-level NGO coordination meetings through invitations from the Deputy Commissioner, Civil Surgeon and Social Welfare Office of the Government of Bangladesh. Law enforcement agencies are ensuring their safety and security, and the government education department enrolled their children in primary education. The government provided land for a graveyard for the Muslim sex workers and the same for other religious groups. The SHGs are registered with the respective government authorities. They are working with support from different development partners and implementing different projects, sometimes as consortium leaders. Opportunities were created to take vocational training from reputed government departments. Harassment by clients reduced remarkably; Babus, Mashis and other counterparts recognized the sex workers' rights and ensured their security with government counterparts, and access to legal, health and education services increased. Indications are that the brothel-based sex workers understood their rights and became capable of ensuring their security by working meaningfully under the self-help groups.Keywords: brothel, discrimination, harassment, stigma
Procedia PDF Downloads 35825176 Safeguarding the Construction Industry: Interrogating and Mitigating Emerging Risks from AI in Construction
Authors: Abdelrhman Elagez, Rolla Monib
Abstract:
This empirical study investigates the observed risks associated with adopting Artificial Intelligence (AI) technologies in the construction industry and proposes potential mitigation strategies. While AI has transformed several industries, the construction industry is slowly adopting advanced technologies like AI, introducing new risks that lack critical analysis in the current literature. A comprehensive literature review identified a research gap, highlighting the lack of critical analysis of risks and the need for a framework to measure and mitigate the risks of AI implementation in the construction industry. Consequently, an online survey was conducted with 24 project managers and construction professionals, possessing experience ranging from 1 to 30 years (with an average of 6.38 years), to gather industry perspectives and concerns relating to AI integration. The survey results yielded several significant findings. Firstly, respondents exhibited a moderate level of familiarity (66.67%) with AI technologies, while the industry's readiness for AI deployment and current usage rates remained low at 2.72 out of 5. Secondly, the top-ranked barriers to AI adoption were identified as lack of awareness, insufficient knowledge and skills, data quality concerns, high implementation costs, absence of prior case studies, and the uncertainty of outcomes. Thirdly, the most significant risks associated with AI use in construction were perceived to be a lack of human control (decision-making), accountability, algorithm bias, data security/privacy, and lack of legislation and regulations. Additionally, the participants acknowledged the value of factors such as education, training, organizational support, and communication in facilitating AI integration within the industry. 
These findings emphasize the necessity for tailored risk assessment frameworks, guidelines, and governance principles to address the identified risks and promote the responsible adoption of AI technologies in the construction sector.Keywords: risk management, construction, artificial intelligence, technology
Procedia PDF Downloads 9925175 Kidnapping of Migrants by Drug Cartels in Mexico as a New Trend in Contemporary Slavery
Authors: Itze Coronel Salomon
Abstract:
The rise of organized crime and violence related to drug cartels in Mexico has created serious challenges for the authorities to provide security to those who live within its borders. However, a precondition for any significant improvement in security is absolute respect for fundamental human rights by the authorities. Irregular migrants in Mexico are at serious risk of abuse. Research by Amnesty International, as well as reports of the NHRC (National Human Rights Commission) in Mexico, has indicated the major humanitarian crisis faced by thousands of migrants traveling in the shadows. However, the true extent of the problem remains invisible to the general population. The fact that federal and state governments keep no proper record of abuses and do not publish reliable data contributes to ignorance and misinformation, often spread by media that portray migrants as the source of crime rather than its victims. Discrimination and intolerance against irregular migrants can generate greater hostility and exclusion. According to the modus operandi that has been recorded, criminal organizations and groups linked to drug trafficking structures deprive migrants of their liberty for forced labor and illegal activities related to drug trafficking; some have even been kidnapped to be trained as murderers. If victims or their families cannot pay the ransom, the kidnapped person may suffer torture, mutilation, amputation of limbs or death. Migrant women are also victims of sexual abuse during their abduction. In 2011, at least 177 bodies were identified in the largest mass grave found in Mexico, located in the town of San Fernando, in the border state of Tamaulipas; most of the victims were killed with blunt instruments, and most seemed to be migrants and travelers passing through the country.
With dozens of small graves discovered in northern Mexico, this may suggest a change in tactics among organized crime groups toward different means of obtaining revenue and lower-profile methods of murder. Competition and conflict over control of drug trafficking territory can provide strong incentives for organized crime groups to send signals of violence to the authorities and rival groups. However, as some Mexican organized crime groups increasingly look to vulnerable groups, such as Central American migrants, as a source of income, they seem less interested in advertising their work to the authorities and others, and more interested in evading detection and confrontation. This paper aims to analyze this new trend of kidnapping migrants for forced labor by drug cartels in Mexico as a form of contemporary slavery, and its implications.Keywords: international law, migration, transnational organized crime
Procedia PDF Downloads 41625174 Ecosystem Model for Environmental Applications
Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru
Abstract:
This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions
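A minimal sketch of the kind of fuzzy assessment such a model performs: fuzzify a crisp input into linguistic terms, fire a small rule base, and defuzzify to a decision score. The variable (pollutant load), the membership breakpoints, the two rules, and the output scores are all invented for illustration, not taken from the paper:

```python
def low(x):
    """Membership of 'low pollutant load' (full at <=20, zero at >=60)."""
    return max(0.0, min(1.0, (60 - x) / 40))

def high(x):
    """Membership of 'high pollutant load' (zero at <=40, full at >=80)."""
    return max(0.0, min(1.0, (x - 40) / 40))

def assess(load):
    """Rules: low load -> good status (score 1), high load -> poor status (score 0).
    Defuzzify with a weighted average of the rule output scores."""
    l, h = low(load), high(load)
    return (l * 1.0 + h * 0.0) / (l + h)

score = assess(50)  # a load in the overlap region yields an intermediate score
```

A real ecosystem model would use many input variables and a larger rule base, but the fuzzify / infer / defuzzify pipeline is the same.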
Procedia PDF Downloads 42025173 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation with the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with 30% of the original normal image data, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.Keywords: diffusion models, anomaly detection, data-centric, generative AI
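One way to see why enlarging the normal reference set helps one-class classification is a nearest-neighbor scoring sketch. This illustrates the data-centric idea only, not the DCADDM algorithm itself; the 2-D "feature vectors" and generated samples are invented:

```python
import math

# Toy feature vectors of normal product images, plus extra samples that a
# diffusion model might generate to diversify the normal reference set.
normal = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
generated = [(0.5, 0.5), (1.0, 1.0)]
reference = normal + generated

def anomaly_score(x, ref):
    """Distance to the nearest reference sample: small = normal-looking."""
    return min(math.dist(x, r) for r in ref)

# A sample that is actually normal but unlike the collected data scores high
# against the small set, and low once generated samples fill the gap.
before = anomaly_score((1.0, 1.0), normal)
after = anomaly_score((1.0, 1.0), reference)
```

Here augmentation reduces false alarms on under-represented normal appearances, which is the intuition behind generating extra normal images.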
Procedia PDF Downloads 8225172 The Communist Party of China’s Approach to Human Rights and the Death Penalty in China since 1979
Authors: Huang Gui
Abstract:
The issues of human rights and the death penalty always draw attention from international scholars, critics, observers, activists and Chinese scholars, but most approach these problems from a single legal or political perspective, while the real relationship between the Chinese political regime and legislation is often ignored. In accordance with the Constitution of the P.R.C., the Communist Party of China (CPC) plays a key role not merely in the political field, but in legislation and law enforcement as well. Therefore, legislation has to implement the party's theory and outlook and realize the party's policies. So does the death penalty system, though it is only a concrete punishment system. Considering this point, after introducing the relationship between the CPC and legislation, this paper explores the shifts in the CPC's outlook on human rights and the corresponding changes in the death penalty system across different eras. In the Maoist era, the issue of human rights was rejected and treated as an exclusion zone, and the death penalty was unjustifiably imposed. Human rights were politically recognized and accepted in the Deng era, but the CPC had its own viewpoints on them: it emphasized national security and stability, and individual human rights were not correspondingly and reasonably taken account of. The death penalty was abused and deemed an important measure to control crime. In the post-Deng era, human rights were gradually developed and recognized. The phrase 'the state respects and protects human rights' is contained in the Constitution of the P.R.C., and individual human rights are gradually valued, but the CPC still focuses on state security, development, and stability, and the individual right to life has not been valued as much as the right to subsistence. Although steps toward reforming the death penalty are being taken, there are still 46 crimes punishable by death.
The CPC should change its outlook, pay more attention to the right to life, and try to abolish the death penalty de facto and de jure.Keywords: criminal law, communist party of China, death penalty, human rights, China
Procedia PDF Downloads 41625171 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which affects the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits, becoming available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to an upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial code limitations inherent in all parallel applications, interleaving it with lower-level vector operations.Keywords: memory organization, parallel processors, serial code, vector processing
Procedia PDF Downloads 27025170 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places, and they are constantly monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding. Unlike other machine learning algorithms that force the data into one-hot-encoding-type schemes, RA works directly with the data as encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined.Keywords: reconstructability analysis, machine learning, landslides, raster analysis
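The bin-then-tabulate core of such a session can be sketched directly: discretize each continuous layer, then report P(landslide) for every observed combination of IV states. The layers, bin boundary, and cell values below are invented for illustration; a real RA run would also search over which IV combinations are most informative:

```python
from collections import Counter

# Toy raster cells: (slope_degrees, bedrock_type, landslide_occurred).
cells = [
    (12.0, "basalt", 0), (35.0, "basalt", 1), (40.0, "shale", 1),
    (8.0,  "shale",  0), (33.0, "shale",  1), (15.0, "shale", 0),
]

def bin_slope(s):
    """Continuous slope must be binned before RA can use it."""
    return "steep" if s >= 30 else "gentle"

def state_probabilities(rows):
    """P(landslide | IV-state combination) for every observed combination."""
    totals, hits = Counter(), Counter()
    for slope, rock, dv in rows:
        key = (bin_slope(slope), rock)
        totals[key] += 1
        hits[key] += dv
    return {k: hits[k] / totals[k] for k in totals}

probs = state_probabilities(cells)
```

The resulting table is exactly the transparent report described above: each row is an IV state combination with its observed landslide probability.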
Procedia PDF Downloads 6625169 Marine Ecosystem Mapping of Taman Laut Labuan: The First Habitat Mapping Effort to Support Marine Parks Management in Malaysia
Authors: K. Ismail, A. Ali, R. C. Hasan, I. Khalil, Z. Bachok, N. M. Said, A. M. Muslim, M. S. Che Din, W. S. Chong
Abstract:
The marine ecosystem in Malaysia holds invaluable potential in terms of economics, food security, pharmaceutical components and protection from natural hazards. Although the oil and gas industry and fisheries are active within Malaysian waters, knowledge of the seascape and of the ecological functioning of benthic habitats is still extremely poor in the marine parks around Malaysia due to the lack of detailed seafloor information. Consequently, it is difficult to manage marine resources effectively, protect ecologically important areas and set legislation to safeguard the marine parks. The limited baseline data hinder the scientific linkage needed to support effective marine spatial management in Malaysia. This became the main driver behind the first seabed mapping effort at the national level. Taman Laut Labuan (TLL) is located off the west coast of Sabah, to the east of the South China Sea. The total area of the TLL is approximately 158.15 km2; it comprises three islands, namely Pulau Kuraman, Rusukan Besar and Rusukan Kecil, and is characterized by shallow fringing reef with a few submerged shallow reefs. The unfamiliar rocky shorelines limited the multibeam echosounder survey to areas with depths greater than 10 m, whereas singlebeam and side scan sonar systems were used to acquire data for areas with depths less than 10 m. By integrating multibeam bathymetry and backscatter data with singlebeam bathymetry and side scan sonar images, we produced a substrate map and a coral coverage map for the TLL using i) a marine landscape mapping technique and ii) the RSOBIA ArcGIS toolbar (developed by T. Le Bas). We took the initiative to explore the ability of an aerial drone and satellite imagery (WorldView-3) to derive depths and substrate types within the intertidal and subtidal zones, where acoustic mapping was not possible.
Although the coverage was limited, the outcome showed a promising technique to be incorporated towards establishing a guideline that facilitates a standard practice for efficient marine spatial management in Malaysia.Keywords: habitat mapping, marine spatial management, South China Sea, national seabed mapping
Procedia PDF Downloads 22425168 Data Analytics in Hospitality Industry
Authors: Tammy Wee, Detlev Remy, Arif Perdana
Abstract:
In recent years, data analytics has become the buzzword in the hospitality industry. The hospitality industry is another example of a data-rich industry that has yet to fully benefit from the insights of data analytics. Effective use of data analytics can change how hotels operate, market and position themselves competitively. However, at the moment, the data obtained by individual hotels remain under-utilized. This is preliminary research on data analytics in the hospitality industry, using an in-depth face-to-face interview at one hotel as the start of a multi-level research project. The main case study of this research, hotel A, is part of an international hotel chain that has been systematically gathering and collecting data on its own customers for the past five years. The data collection points begin from the moment a guest books a room until the guest leaves the hotel premises, and include room reservation, spa booking, and catering. Although hotel A has been gathering data intelligence on its customers for some time, it has yet to utilize the data to its fullest potential, and it is aware of this limitation as well as the potential of data analytics. Currently, the utilization of data analytics in hotel A is limited to the area of customer service improvement, namely enhancing the personalization of service for each individual customer. Hotel A is able to utilize the data to improve and enhance its service, which in turn encourages repeat customers. According to hotel A, 50% of its guests returned to the hotel, and 70% extended nights because of the personalized service. Apart from using data analytics to enhance customer service, hotel A also uses the data in marketing: it uses data analytics to predict or forecast changes in consumer behavior and demand by tracking its guests' booking preferences, payment preferences and demand shifts between properties.
However, hotel A admitted that the data it has been collecting is not fully utilized due to two challenges. The first challenge is that the data is not clean: at the moment, the data collected for one guest profile is meaningful only for one department in the hotel but meaningless for another. Cleaning up the data and getting standards correct for usage by different departments are some of the main concerns of hotel A. The second challenge is the non-integrated internal systems: the internal systems used by hotel A do not integrate with each other well, limiting the ability to collect data systematically. Hotel A is considering another system to replace the current one for more comprehensive data collection. Hotel proprietors recognize the potential of data analytics, as reported in this research; however, the current challenges of implementing a system to collect data come with a cost. This research has identified the current utilization of data analytics and the challenges faced when it comes to implementation.Keywords: data analytics, hospitality industry, customer relationship management, hotel marketing
Procedia PDF Downloads 18025167 Determinants of Food Insecurity Among Smallholder Farming Households in Southwest Area of Nigeria
Authors: Adesomoju O. A., E. A. Onemolease, G. O. Igene
Abstract:
The study analyzed the determinants of food insecurity among smallholder farming households in the southwestern part of Nigeria, with Ondo and Osun States in focus. A multi-stage sampling procedure was employed to gather data from 389 farming households (194 from Ondo State and 195 from Osun State) spread across 4 agricultural zones, 8 local government areas, and 24 communities. The data were analyzed using descriptive statistics, ordinal regression, and the Friedman test. Results revealed that the average age of the respondents was 47 years, with the majority being male (63.75%) and married (82.26%), and with an average household size of 6. Most household heads were educated (94.09%), had engaged in farming for about 19 years, and did not belong to cooperatives (73.26%). Respondents derived income from both farming and non-farm activities, with average farm income of N216,066.8/annum and non-farm income of about N360,000/annum. Multiple technologies were adopted by respondents, such as application of herbicides (77.63%), pesticides (73.26%) and fertilizers (66.58%). Using the FANTA Cornell model, food insecurity was found to be prevalent in the study area, with the majority (61.44%) of households being severely food insecure and 35.73% moderately food insecure, while 1.80% and 1.03% were food secure and mildly food insecure, respectively. The most significant constraints to food security among the farming households were the inability to access credit (mean rank = 8.78), poor storage infrastructure (8.57), inadequate capital (8.56), and the high cost of farm chemicals (8.35). Significant factors related to food insecurity were age (b = -0.059), education (b = -0.376), family size (b = 0.197), adoption of technology (b = -0.198), farm income (b = -0.335), association membership (b = -0.999), engagement in non-farm activities (b = -1.538), and access to credit (b = -0.853).
Linking farmers' groups to credit institutions and input suppliers was proposed.Keywords: food insecurity, FANTA Cornell, Ondo, Osun, Nigeria, Southwest, livelihood
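For readers unfamiliar with how ordinal-regression coefficients map onto ordered food-insecurity categories, here is a sketch of the ordered-logit link. The cutpoints, the intercept-free linear predictor, and the example household are invented for illustration; only the two coefficient values (education b = -0.376, family size b = 0.197) echo the abstract:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def ordered_logit_probs(eta, cuts):
    """Category probabilities for an ordered outcome given a linear
    predictor eta and ascending cutpoints: P(Y<=k) = sigmoid(cut_k - eta)."""
    cum = [sigmoid(c - eta) for c in cuts]
    probs, prev = [], 0.0
    for c in cum:
        probs.append(c - prev)
        prev = c
    probs.append(1.0 - prev)  # last (highest-severity) category
    return probs

# Hypothetical household: 12 years of schooling, household size of 6.
eta = -0.376 * 12 + 0.197 * 6
# Hypothetical cutpoints separating the four ordered categories
# (food secure, mildly, moderately, severely insecure).
p = ordered_logit_probs(eta, cuts=[-4.0, -3.0, -1.0])
```

The sign pattern reported above then reads naturally: a negative coefficient (e.g. education) lowers the predictor and shifts probability mass toward the food-secure end of the scale.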
Procedia PDF Downloads 30