Search results for: mining collection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3912

2952 An Intelligence-Led Methodology for Detecting Dark Actors in Human Trafficking Networks

Authors: Andrew D. Henshaw, James M. Austin

Abstract:

Introduction: Human trafficking is an increasingly serious transnational criminal enterprise and human security issue. Despite ongoing efforts to mitigate the phenomenon and a significant expansion of security scrutiny over past decades, it is not receding. This is true for many nations in Southeast Asia, widely recognized as the global hub for trafficked persons, including men, women, and children. Human trafficking is difficult to address because numerous drivers, causes, and motivators allow it to persist, including non-military and non-traditional security challenges, e.g., climate change, displacement driven by global warming, and natural disasters, which make displaced persons and refugees particularly vulnerable. The issue is so large that conservative estimates place its value at around $150 billion per year (Niethammer, 2020), spanning sexual slavery and exploitation, forced labor in construction, mining, and conflict roles, and the forced marriage of girls and women. Coupled with corruption throughout military, police, and civil authorities around the world, and the active hands of powerful transnational criminal organizations, it is likely that such figures are grossly underestimated, as human trafficking is misreported, under-detected, and deliberately obfuscated to protect those profiting from it. For example, the 2022 UN report on human trafficking shows a 56% reduction in convictions in that year alone (UNODC, 2022). Our Approach: To better understand this, our research utilizes a bespoke methodology. Applying a JAM (Juxtaposition Assessment Matrix), which we previously developed to detect flows of dark money around the globe (Henshaw & Austin, 2021), we now focus on the human trafficking paradigm. Applying the JAM methodology has identified key indicators of human trafficking not previously explored in depth. 
As a set of structured analytical techniques that provide panoramic interpretations of the subject matter, this iteration of the JAM further incorporates behavioral and driver indicators, including the employment of Open-Source Artificial Intelligence (OS-AI) across multiple collection points. The extracted behavioral data was then applied to identify non-traditional indicators as they contribute to human trafficking. Furthermore, because the JAM OS-AI analyses data from the inverted position, i.e., the viewpoint of the traffickers, it examines the behavioral and physical traits the traffickers require to succeed. This transposed examination of the requirements of success delivers potential leverage points for exploitation in the fight against human trafficking in a novel way. Findings: Our approach identified innovative datasets that have previously been overlooked or, at best, undervalued. For example, the JAM OS-AI approach identified critical 'dark agent' lynchpins within human trafficking networks; our preliminary data suggest these agents have been difficult to detect in extant research and harder still to connect directly to the actors and organizations in such networks. Our research demonstrates that new investigative techniques such as the OS-AI-aided JAM introduce a powerful toolset to increase understanding of human trafficking and transnational crime and to illuminate networks that have, to date, avoided global law enforcement scrutiny.

Keywords: human trafficking, open-source intelligence, transnational crime, human security, international human rights, intelligence analysis, JAM OS-AI, Dark Money

Procedia PDF Downloads 89
2951 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure in itself. Traditionally, data infrastructure has been managed by individuals in different industries and saved on personal work tools such as laptops. This hinders data sharing and Sustainable Development Goal (SDG) 9 on sustainable infrastructure across all countries and regions. At the same time, there is constant demand for data across different agencies and ministries from investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as land and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. It applies the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal, a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. The portal gives users access to datasets of interest at any time at no cost. The skeletal infrastructure of this data portal is built on open-source technology such as the PostgreSQL database, GeoServer, GeoNetwork, and CKAN. These tools make the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6 August 2021, 8,192 user accounts had been created, 2,262 datasets had been downloaded, and 817 maps had been created on the platform. 
This paper demonstrates how the rapid development and adoption of technologies facilitates data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper details new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it shows the importance of cross-sectoral data infrastructure for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 93
2950 Twitter's Impact on Print Media with Respect to Real World Events

Authors: Basit Shahzad, Abdullatif M. Abdullatif

Abstract:

Recent advancements in Information and Communication Technologies (ICT) and easy access to the Internet have made social media the first choice for sharing information about important events and news. On Twitter, a trend is a common feature that quantifies the level of popularity of a certain news item or event. In this work, we examine the impact of Twitter trends on real-world events by hypothesizing that Twitter trends influence print media in Pakistan. For this, Twitter is used as a platform and Twitter trends as a baseline. We first collect data from two sources (Twitter trends and print media) over the period May to August 2016. The data obtained from the two sources are analyzed, and it is observed that social media significantly influences the print media: the majority of news items printed in newspapers are posted on Twitter earlier.
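The timing comparison underlying this finding can be sketched as follows. This is a minimal illustration with a hypothetical data model (pairs of matched tweet/print timestamps), not the authors' actual pipeline:

```python
from datetime import datetime

def fraction_tweeted_first(matched_items):
    """Fraction of news items whose earliest tweet precedes the print date.

    `matched_items` is a list of (tweet_time, print_time) pairs for stories
    matched across both sources (hypothetical data model).
    """
    if not matched_items:
        return 0.0
    earlier = sum(1 for tweet, printed in matched_items if tweet < printed)
    return earlier / len(matched_items)

pairs = [
    (datetime(2016, 5, 1, 9), datetime(2016, 5, 2)),   # tweeted the day before print
    (datetime(2016, 5, 3, 8), datetime(2016, 5, 3)),   # printed first (midnight edition)
    (datetime(2016, 6, 10), datetime(2016, 6, 11)),
]
print(fraction_tweeted_first(pairs))
```

A fraction well above 0.5 on a large matched sample would support the hypothesis that Twitter leads print coverage.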

Keywords: twitter trends, text mining, effectiveness of trends, print media

Procedia PDF Downloads 256
2949 A Study on the Nostalgia Contents Analysis of Hometown Alumni in the Online Community

Authors: Heejin Yun, Juanjuan Zang

Abstract:

This study aims to analyze the terms posted in an online community of people from the same hometown and to understand the topics and trends of nostalgia composed online. For this purpose, the study collected 144 posts that natives of Yeongjong Island, Incheon, South Korea had published in an online community, and analyzed the association relations among their terms. The results show, first, that defining nostalgia simply as 'a mind longing for hometown' is not a sufficient explanation; second, that texts composed online tend toward abstraction rather than individual personal stories. Through literature research and association rules, the study identified the terms constituting nostalgia that have the strongest and closest mutual associations. The result of this study is characterized by its summary of the core terms and emotions related to nostalgia.
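Association rule mining of the kind described can be sketched by direct support/confidence counting over term sets. The terms and thresholds below are hypothetical illustrations, not the study's data:

```python
from itertools import combinations

def association_rules(transactions, min_support=0.5, min_confidence=0.7):
    """Mine simple one-to-one rules (A -> B) from term sets.

    Each transaction is the set of terms in one posting. Support and
    confidence are computed by direct counting, as in the classic
    Apriori formulation.
    """
    n = len(transactions)
    terms = set().union(*transactions)
    rules = []
    for a, b in combinations(sorted(terms), 2):
        for ant, con in ((a, b), (b, a)):
            both = sum(1 for t in transactions if ant in t and con in t)
            ant_count = sum(1 for t in transactions if ant in t)
            support = both / n
            if ant_count and support >= min_support:
                confidence = both / ant_count
                if confidence >= min_confidence:
                    rules.append((ant, con, round(support, 2), round(confidence, 2)))
    return rules

posts = [
    {"hometown", "sea", "childhood"},
    {"hometown", "sea"},
    {"hometown", "school"},
    {"sea", "childhood"},
]
for rule in association_rules(posts):
    print(rule)
```

On these toy posts only the rule childhood -> sea clears both thresholds; on the real corpus, such rules would reveal which term pairs co-occur most strongly within nostalgia-related writing.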

Keywords: nostalgia, cultural memory, data mining, association rule

Procedia PDF Downloads 228
2948 Fractional, Component and Morphological Composition of Ambient Air Dust in the Areas of Mining Industry

Authors: S.V. Kleyn, S.Yu. Zagorodnov, А.А. Kokoulina

Abstract:

Technogenic emissions from mining and processing complexes are characterized by a high content of chemical components and solid dust particles. However, each industrial enterprise and its surrounding area have features that require refinement and parameterization. Numerous studies have shown the negative impact of fine dust PM10 and PM2.5 on health, as well as the possibility that dust particles absorb toxic components, including heavy metals. The target of the study was the quantitative assessment of the fractional and particle size composition of ambient air dust in the area impacted by a primary magnesium production complex. We also describe the morphological features of the dust particles. Study methods: To identify the dust emission sources, the production process was analyzed. The particulate composition of the emissions was measured using a Microtrac S3500 laser particle analyzer (covering a particle size range of 0.02 to 2,000 µm). Particle morphology and component composition were established by high-resolution scanning electron microscopy (magnification of 5 to 300,000 times) with an S3400N 'HITACHI' X-ray fluorescence device. The chemical composition was identified by X-ray analysis of the samples using an XRD-700 'Shimadzu' X-ray diffractometer. The dust pollution level was determined using model calculations of emission dispersion in the atmosphere, verified by instrumental studies. Results of the study: The results demonstrated that the dust emissions of different technical processes are heterogeneous and their fractional structure is complicated. The percentage of particle sizes up to 2.5 µm inclusive ranged from 0.00 to 56.70%; particle sizes up to 10 µm inclusive, from 0.00 to 85.60%; and particle sizes greater than 10 µm, from 14.40 to 100.00%. During microscopy, nanoscale particles were also detected. 
The studied dust particles have round, irregular, cubic, and integral shapes. The composition of the dust includes magnesium, sodium, potassium, calcium, iron, and chlorine. On the basis of the obtained results, model calculations of dust emission dispersion were performed and the areas of fine dust PM10 and PM2.5 distribution were established. It was found that the fine fractions PM10 and PM2.5 are dispersed over large distances, beyond the border of the industrial site of the enterprise. The population living near the enterprise is exposed to the risk of diseases associated with dust exposure. The data were transferred to the economic entity to support decisions on measures to minimize the risks. The exposure and health risk indicators are used to provide targeted health and preventive care to citizens living in the area of negative impact of the facility.
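The fractional shares reported above can be computed directly from measured particle diameters. The diameters below are hypothetical, purely to illustrate the calculation:

```python
def size_fractions(diameters_um):
    """Share of particles (by count) in the PM2.5 and PM10 fractions."""
    n = len(diameters_um)
    pm25 = sum(1 for d in diameters_um if d <= 2.5) / n * 100
    pm10 = sum(1 for d in diameters_um if d <= 10.0) / n * 100
    return round(pm25, 1), round(pm10, 1)

# hypothetical diameters (micrometres) from a laser particle analyzer
sample = [0.8, 1.2, 2.5, 3.0, 5.5, 9.0, 12.0, 18.0, 25.0, 40.0]
print(size_fractions(sample))
```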

Keywords: dust emissions, exposure assessment, PM10, PM2.5

Procedia PDF Downloads 253
2947 Three or Four Tonics and a Wave: The Trajectory of Health Insurance Regulation in Brazil

Authors: João Boaventura Branco De Matos

Abstract:

Currently, in Brazil, there is a considerable collection of publications on the supplementary health sector, but the vast majority is limited to retrospective examination of the sector. The present contribution starts from the diagnosis of an overwhelming change in the role of the State and its institutions, as well as an accelerated and no less forceful change in the way goods and services are produced, resulting in a clash between these different waves (state and market). This shock produces a unique energy, capable of imposing major changes in the most varied sectors. Based on this diagnosis, this paper offers a prospective, propositional study of regulatory measures relevant to the future conduct and performance of the sector.

Keywords: private health regulation, state and market, forecasts in Brazilian regulation, political economy

Procedia PDF Downloads 146
2946 Effectiveness of ATMS (Advanced Transport Management Systems) in Asuncion, Paraguay

Authors: Sung Ho Oh

Abstract:

Advanced traffic lights, a system for collecting and providing traffic information, CCTVs for traffic control, and a traffic information center were installed in Asuncion, the capital of Paraguay. A pre-post comparison of the installation found significant changes: even though traffic volumes increased, travel speeds were higher, so travel time from origin to destination decreased. The savings in travel time, fuel cost, and environmental cost amount to about 47 million US dollars per year. Satisfaction survey results for the installation are presented with statistical significance analysis.

Keywords: advanced transport management systems, effectiveness, Paraguay, traffic lights

Procedia PDF Downloads 348
2945 Analysis and Forecasting of Bitcoin Price Using Exogenous Data

Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka

Abstract:

Extracting and interpreting information from Big Data will remain a stake for years to come in several sectors, such as finance. Currently, numerous methods (such as technical analysis) are used to try to understand and anticipate market behavior, with mixed results, because it still seems impossible to exactly predict a financial trend. The increase in data available on the Internet, and its diversity, represent a great opportunity for the financial world: along with standard financial data, it is possible to focus on exogenous data that capture more macroeconomic factors. Coupling the interpretation of these data with standard methods could yield more precise trend predictions. In this paper, in order to observe the influence of exogenous data on price, independent of the other usual effects occurring in classical markets, the behavior of Bitcoin users is introduced into a model reconstituting Bitcoin value, which is elaborated and tested for prediction purposes.
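The idea of regressing price on an exogenous signal can be sketched with a closed-form least-squares fit. The "user activity index" and prices below are hypothetical numbers, not the paper's model or data:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x by the closed-form formulas."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# hypothetical daily series: an exogenous user-activity index vs. price
activity = [10, 12, 15, 20, 26]
price = [105, 109, 115, 125, 137]
a, b = fit_line(activity, price)
print(round(a, 2), round(b, 2))
next_day = a + b * 30        # forecast from the exogenous signal alone
print(round(next_day, 2))
```

A real model would combine several such exogenous regressors with standard market features; this sketch only isolates the exogenous component.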

Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance

Procedia PDF Downloads 352
2944 Knowledge Development: How New Information System Technologies Affect Knowledge Development

Authors: Yener Ekiz

Abstract:

Knowledge development is a proactive process covering the collection, analysis, storage, and distribution of information that contributes to an understanding of the environment. To transfer knowledge correctly and quickly, new and emerging information system technologies must be used. Actionable knowledge is only of value if it is understandable and usable by target users. The purpose of the paper is to show how technology eases and affects the process of knowledge development. In preparing the paper, literature review, survey, and interview methodologies will be used. The hypothesis is that technology and knowledge development are inseparable and that technology will formalize the DIKW hierarchy anew. Today there are huge volumes of data; these data must be classified precisely and quickly.

Keywords: DIKW hierarchy, knowledge development, technology

Procedia PDF Downloads 435
2943 Uplift Modeling Approach to Optimizing Content Quality in Social Q/A Platforms

Authors: Igor A. Podgorny

Abstract:

TurboTax AnswerXchange is a social Q/A system supporting users working on federal and state tax returns. Content quality and popularity in the AnswerXchange can be predicted with propensity models using attributes of the question and answer. Using uplift modeling, we identify features of questions and answers that can be modified during the question-asking and question-answering experience in order to optimize AnswerXchange content quality. We demonstrate that adding details to questions always increases question popularity, which can be used to promote good quality content. Responding to close-ended questions assertively improves content quality in the AnswerXchange in 90% of cases. Answering knowledge questions with web links increases the likelihood of receiving a negative vote from 60% of askers. Our findings provide a rationale for employing the uplift modeling approach in AnswerXchange operations.
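The core uplift quantity is the difference in outcome probability with and without an intervention. This is a minimal two-group sketch with hypothetical vote outcomes, not the AnswerXchange models:

```python
def uplift(treated, control):
    """Simple uplift estimate from binary outcomes.

    `treated`: outcomes for items that received the intervention (e.g. a
    prompt to add details to the question); `control`: outcomes without it.
    Uplift = P(good outcome | treated) - P(good outcome | control).
    """
    p_t = sum(treated) / len(treated)
    p_c = sum(control) / len(control)
    return p_t - p_c

# hypothetical vote outcomes (1 = positively voted content)
with_details = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]     # 80% positive
without_details = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0]  # 40% positive
print(uplift(with_details, without_details))
```

In practice, each group's probability would come from a propensity model conditioned on question/answer attributes, and the uplift would be estimated per feature that can be modified in the user experience.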

Keywords: customer relationship management, human-machine interaction, text mining, uplift modeling

Procedia PDF Downloads 243
2942 SNP g.1007A>G within the Porcine DNAL4 Gene Affects Sperm Motility Traits

Authors: I. Wiedemann, A. R. Sharifi, A. Mählmeyer, C. Knorr

Abstract:

A requirement for sperm motility is a morphologically intact flagellum with a central axoneme. The flagellar beating is caused by the varying activation and inactivation of dynein molecules located in the axoneme. DNAL4 (dynein, axonemal, light chain 4) is regarded as a possible functional candidate gene, encoding a small subunit of the dyneins. In the present study, 5,814 bp of the porcine DNAL4 (GenBank Acc. No. AM284696.1, 6,097 bp, 4 exons) were comparatively sequenced using three boars with high motility (>68%) and three with low motility (<60%). Primers were self-designed except for those covering exons 1, 2, and 3. Prior to sequencing, the PCR products were purified. Sequencing was performed with an ABI PRISM 3100 Genetic Analyzer using the BigDye Terminator v3.1 Cycle Sequencing Kit. In total, 23 SNPs were described and genotyped for 82 AI boars representing the breeds Piétrain, German Large White, and German Landrace. The genotypes were used to assess possible associations with standard spermatological parameters (ejaculate volume, density, and sperm motility undiluted (Motud) and 24 h (Mot1) and 48 h (Mot2) after semen collection) that were regularly recorded at the AI station. The analysis included a total of 8,833 spermatological data sets, ranging from 2 to 295 sets per boar over five years. Only SNP g.1007A>G had a significant effect. 
Finally, the gene substitution effect was calculated using the following statistical model: Yijk = µ + αi + βj + αβij + b1Sijk + b2Aijk + b3Tijk + b4Vijk + b5(α*A)ijk + b6(β*A)ijk + b7(A*T)ijk + Uijk + eijk, where Yijk is the semen characteristic, µ is the general mean, α is the main effect of breed, β is the main effect of season, S is the effect of the SNP (g.1007A>G), A is the effect of age at semen collection, V is the effect of the diluter, αβ, α*A, β*A, and A*T are interactions between the fixed effects, b1-b7 are regression coefficients between Y and the respective covariate, U is the random effect of repeated observations on an animal, and e is the random error. The single marker regression analysis revealed highly significant effects (p < 0.0001) of SNP g.1007A>G on Mot1 and on Mot2, resulting in marked reductions of 11.4% and 15.4%, respectively. Furthermore, a loss of 4.6% in Motud was detected (p < 0.0178). Considering SNP g.1007A>G as a main factor (dominant-recessive model), significant differences exist between genotypes AA and AG as well as AA and GG for Mot1 and Mot2. For Motud there was a significant difference between AA and GG.
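The genotype contrast at the heart of the dominant-recessive comparison can be sketched as a per-genotype mean. This toy calculation uses invented motility values and deliberately omits the fixed and random effects of the full mixed model above:

```python
def genotype_means(records):
    """Mean motility per genotype; a toy version of the single-marker step.

    `records` is a list of (genotype, motility) pairs. The full analysis in
    the study is a mixed model with breed, season, age, diluter and random
    animal effects; this sketch only illustrates the marker contrast.
    """
    sums, counts = {}, {}
    for geno, mot in records:
        sums[geno] = sums.get(geno, 0.0) + mot
        counts[geno] = counts.get(geno, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

data = [("AA", 72.0), ("AA", 70.0), ("AG", 61.0),
        ("AG", 59.0), ("GG", 56.0), ("GG", 54.0)]
means = genotype_means(data)
print(means)
print(round(means["AA"] - means["GG"], 1))   # crude AA vs GG contrast
```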

Keywords: association, DNAL4, porcine, sperm traits

Procedia PDF Downloads 453
2941 Entrepreneurs’ Perceptions of the Economic, Social and Physical Impacts of Tourism

Authors: Oktay Emir

Abstract:

The objective of this study is to determine how entrepreneurs perceive the economic, social, and physical impacts of tourism. The study was conducted in the city of Afyonkarahisar, Turkey, which is rich in thermal tourism resources and investments. A survey was used as the data collection method, with the questionnaire administered to 472 entrepreneurs selected by simple random sampling. Independent samples t-tests and ANOVA tests were used to analyse the data obtained. Statistically significant differences (p<0.05) were found based on the participants’ demographic characteristics regarding their opinions about the social, economic, and physical impacts of tourism activities.

Keywords: tourism, perception, entrepreneurship, entrepreneurs, structural equation modelling

Procedia PDF Downloads 449
2940 Study of the Landslide and Stability of Open Pit Quarry: Case of the Open Pit Quarry of Chouf Amar, M'sila, Algeria

Authors: Saadoun Abd Errazak, Hafssaoui Abdallah, Fredj Mohamed

Abstract:

Open-pit mining operations induce instability risks that can cause landslides and collapse of the bench slopes. These risks may occur both during and after the operation phase. The magnitude of these risks depends on the mechanical and physical characteristics of the rock mass, the geometrical dimensions of the ore bodies, their spatial arrangement, and the state of the operated area. If safety and technological measures are not taken accordingly, the environment will be affected. The main objective of this work is to assess these risks by analytical and numerical methods. The study is based on the geological, hydrogeological, and geotechnical characterization of the rock mass of the Chouf Amar open-pit quarry, M'sila. The results obtained yielded an acceptable factor of safety and a stability assessment of the open pit.
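As an illustration of the factor-of-safety concept evaluated by such analytical methods, the textbook infinite-slope limit-equilibrium formula for a dry slope can be computed as below. The parameter values are hypothetical, and this is not the site-specific numerical model used in the study:

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, h, beta_deg):
    """Textbook infinite-slope factor of safety (dry slope).

    FS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta))
    c: cohesion (kPa), phi: friction angle (deg), gamma: unit weight
    (kN/m^3), h: slip depth (m), beta: slope angle (deg).
    """
    phi = math.radians(phi_deg)
    beta = math.radians(beta_deg)
    resisting = c + gamma * h * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * h * math.sin(beta) * math.cos(beta)
    return resisting / driving

# illustrative bench: 20 kPa cohesion, 35 deg friction angle, 10 m slip depth
fs = infinite_slope_fs(c=20.0, phi_deg=35.0, gamma=26.0, h=10.0, beta_deg=35.0)
print(round(fs, 2))   # FS > 1 indicates stability under these assumptions
```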

Keywords: stability, land sliding, numerical modeling, safety factor, open-pit quarry

Procedia PDF Downloads 366
2939 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things (IoT) is a concept of a large-scale ecosystem of wireless actuators. The actuators are the 'things' in the IoT: those that contribute or produce data in the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. In order to address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for authenticating messages before they propagate into IoT networks.
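A common building block for message authentication of the kind discussed is an HMAC tag appended to each payload. The sketch below uses Python's standard `hmac` module; the key and payload are hypothetical, and real IoT deployments would add key management, nonces, and replay protection:

```python
import hashlib
import hmac

def sign(key: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so receivers can authenticate the message."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify(key: bytes, message: bytes) -> bool:
    """Check the trailing 32-byte tag in constant time."""
    payload, tag = message[:-32], message[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

key = b"shared-device-key"            # hypothetical pre-shared sensor key
msg = sign(key, b'{"sensor": 7, "temp": 21.5}')
print(verify(key, msg))                           # untampered message
print(verify(key, msg.replace(b"21.5", b"99.9"))) # tampered payload fails
```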

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 379
2938 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico

Authors: Ismene Ithai Bras-Ruiz

Abstract:

Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field for supporting students and professors. However, not all academic programs apply LA with the same goals or the same tools. In fact, LA comprises five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can not only inform academic authorities about the situation of a program, but also detect at-risk students, professors with needs, or general problems. At the highest level, Artificial Intelligence techniques are applied to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. Each academic program is expected to decide which field to utilize on the basis of its academic interests, but also its capacities in terms of professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED, in Spanish) of the National Autonomous University of Mexico (UNAM) has operated for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM has one of the largest networks of higher education programs: twenty-six academic programs in different faculties. This means that every faculty works with heterogeneous populations and academic problems, and each program has developed its own learning analytics techniques to improve academic issues. In this context, an investigation was carried out to determine the state of the application of LA across the academic programs in the different faculties. 
The premise of the study was that not all faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. Consequently, not all programs know about LA, but this does not mean they do not work with LA in a veiled or less explicit sense. It is very important to know the degree of knowledge about LA for two reasons: 1) it allows the administration's work to improve the quality of teaching to be appreciated, and 2) it shows whether other LA techniques could be adopted. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretary, systems manager or data analyst, and coordinator of the program). The final report showed that almost all the programs work with basic statistical tools and techniques, which helps the administration only to know what is happening inside the academic program; they are not ready to move up to the next level, that is, applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is not related to knowledge of LA but to the clarity of long-term goals.

Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise

Procedia PDF Downloads 126
2937 Hybrid Hierarchical Clustering Approach for Community Detection in Social Network

Authors: Radhia Toujani, Jalel Akaichi

Abstract:

Social networks generally present a hierarchy of communities. To determine these communities and the relationships between them, detection algorithms should be applied. Most of the existing algorithms proposed for hierarchical community identification are based on either agglomerative clustering or divisive clustering. In this paper, we present a hybrid hierarchical clustering approach for community detection based on both bottom-up (agglomerative) and top-down (divisive) clustering. Our approach provides a more relevant community structure than hierarchical methods that consider only divisive or agglomerative clustering to identify communities. Moreover, we performed comparative experiments to enhance the quality of the clustering results and to show the effectiveness of our algorithm.
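The agglomerative half of such a scheme can be sketched as repeatedly merging the communities joined by the strongest remaining tie. This toy graph and stopping rule are illustrative stand-ins, not the paper's hybrid algorithm:

```python
def agglomerate(nodes, edges, k):
    """Bottom-up community merging: fuse the two communities joined by the
    strongest remaining edge until only k communities are left.

    `edges` is a list of (u, v, weight) tuples; each node starts as its
    own singleton community.
    """
    community = {n: {n} for n in nodes}
    for u, v, _ in sorted(edges, key=lambda e: -e[2]):
        cu, cv = community[u], community[v]
        # count distinct community objects still alive
        if cu is not cv and len({id(c) for c in community.values()}) > k:
            merged = cu | cv
            for n in merged:
                community[n] = merged
    return {frozenset(c) for c in community.values()}

nodes = ["a", "b", "c", "d", "e"]
edges = [("a", "b", 0.9), ("b", "c", 0.8), ("d", "e", 0.7), ("c", "d", 0.1)]
print(agglomerate(nodes, edges, k=2))
```

A hybrid method would complement this bottom-up pass with a top-down split of weakly connected communities, keeping whichever partition scores better on a quality measure such as modularity.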

Keywords: agglomerative hierarchical clustering, community structure, divisive hierarchical clustering, hybrid hierarchical clustering, opinion mining, social network, social network analysis

Procedia PDF Downloads 360
2936 Treatment of Acid Mine Drainage with Metallurgical Slag

Authors: Sukla Saha, Alok Sinha

Abstract:

Acid mine drainage (AMD) refers to the production of acidified water from abandoned and active mines. It is generated by the oxidation of pyrite present in the rocks in and around mining areas; Thiobacillus ferrooxidans, a sulfur-oxidizing bacterium, assists in the oxidation process. AMD is extremely acidic (pH 2-3), with high concentrations of several trace and heavy metals, such as Fe, Al, Zn, Mn, Cu, and Co, and anions such as chloride and sulfate. AMD has several detrimental effects on aquatic organisms and the environment and can directly or indirectly contaminate groundwater and surface water. The present study considered the treatment of AMD with metallurgical slag, a waste material. The slag raised the pH of the AMD from 1.5 to 8.62, with 99% removal of trace metals such as Fe, Al, Mn, Cu, and Co. Metallurgical slag thus proved to be an efficient neutralizing material for the treatment of AMD.
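The removal percentages cited are computed from influent and effluent concentrations. The concentrations below are hypothetical, only to show the calculation:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a dissolved metal after slag treatment."""
    return 100.0 * (c_in - c_out) / c_in

# hypothetical influent/effluent concentrations (mg/L) for an AMD sample
metals = {"Fe": (950.0, 4.5), "Al": (120.0, 0.9), "Mn": (35.0, 0.3)}
for metal, (c_in, c_out) in metals.items():
    print(metal, round(removal_efficiency(c_in, c_out), 1))
```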

Keywords: acid mine drainage, heavy metals, metallurgical slag, neutralization

Procedia PDF Downloads 184
2935 Audit on Antibiotic Prophylaxis and Post-Procedure Complication Rate for Patients Undergoing Transperineal Template Biopsies of the Prostate

Authors: W. Hajuthman, R. Warner, S. Rahman, M. Abraham, H. Helliwell, D. Bodiwala

Abstract:

Context: Prostate cancer is a prevalent cancer in males in Europe and the US, with diagnosis primarily relying on PSA testing, mpMRI, and subsequent biopsies. However, this diagnostic strategy may lead to complications for patients. Research Aim: The aim of this study is to assess compliance with trust guidelines for antibiotic prophylaxis in patients undergoing transperineal template biopsies of the prostate and to evaluate the rate of post-procedure complications. Methodology: This study was conducted retrospectively over an 8-month period. Data collection included patient demographics, compliance with trust guidelines, associated risk factors, and post-procedure complications such as infection, haematuria, and urinary retention. Findings: The audit includes 100 patients with a median age of 66.11. Compliance with pre-procedure antibiotics was 98%, while compliance with the antibiotic prophylaxis recommended by trust guidelines was 68%. Among the patients, 3% developed post-procedure sepsis, with 2 requiring admission for intravenous antibiotics; no evident risk factors were identified in these cases. Additionally, post-procedure urinary retention occurred in 3% of patients and post-procedure haematuria in 2%. Theoretical Importance: This study highlights the increasing use of transperineal template biopsies across UK centres and suggests that a standardized protocol and compliance with guidelines can reduce confusion, ensure appropriate administration of antibiotics, and mitigate post-procedure complications. Data Collection and Analysis Procedures: Data were collected retrospectively, involving the extraction and analysis of relevant information from patient records over the specified 8-month period. Questions Addressed: This study addresses the following research questions: (1) What is the compliance rate with trust guidelines for antibiotic prophylaxis in transperineal template biopsies of the prostate? 
(2) What is the rate of post-procedure complications, such as infection, haematuria, and urinary retention? Conclusion: Transperineal template biopsies are becoming increasingly prevalent in the UK. Implementing a standardized protocol and ensuring compliance with guidelines can reduce confusion, ensure proper administration of antibiotics, and potentially minimize post-procedure complications. Additionally, given that some studies show no difference in outcomes when prophylactic antibiotics are not used, the reminder to follow trust guidelines may prompt a re-evaluation of antibiotic prescribing practices.

Keywords: prostate, transperineal template biopsies of prostate, antibiotics, complications, microbiology, guidelines

Procedia PDF Downloads 72
2934 Spectroscopic Autoradiography of Alpha Particles on Geologic Samples at the Thin Section Scale Using a Parallel Ionization Multiplier Gaseous Detector

Authors: Hugo Lefeuvre, Jerôme Donnard, Michael Descostes, Sophie Billon, Samuel Duval, Tugdual Oger, Herve Toubon, Paul Sardini

Abstract:

Spectroscopic autoradiography is a method of interest for geological sample analysis, where researchers face issues such as radioelement identification and quantification in the field of environmental studies. Imaging gaseous ionization detectors have found a place in the geosciences for dedicated radioactivity measurements that improve the monitoring of natural processes using naturally occurring radioactive tracers, as well as in the nuclear industry linked to the mining sector. In geological samples, locating and identifying the radioactive-bearing minerals at the thin-section scale remains a major challenge, because the detection limit of the usual elemental microprobe techniques is far higher than the concentration of most natural radioactive decay products. For uranium in a geomaterial, the spatial distribution of each decay product is of interest for relating radionuclide concentrations to the mineralogy. The present study provides a spectroscopic autoradiography analysis method for measuring the initial energy of alpha particles with a parallel ionization multiplier gaseous detector. The analysis method was developed using Geant4 modelling of the detector. The tracks of alpha particles recorded in the gas detector allow the simultaneous measurement of the initial point of emission and the reconstruction of the initial particle energy through a selection based on the linear energy distribution. This spectroscopic autoradiography method successfully reproduced the alpha spectra of the 238U decay chain on a geological sample at the thin-section scale. The measurement achieves an energy spectrum resolution of 17.2% (FWHM) at 4647 keV and a spatial resolution of at least 50 µm.
Even though the efficiency of energy spectrum reconstruction is low (4.4%) compared with that of a simple autoradiograph (50%), this novel measurement approach makes it possible to select areas on an autoradiograph and perform an energy spectrum analysis within them. This opens up possibilities for the detailed analysis of heterogeneous geological samples containing natural alpha emitters such as uranium-238 and radium-226, and will allow the spatial distribution of uranium and its daughter products in geomaterials to be studied by coupling the method with scanning electron microscope characterizations. The direct application of this dual (energy-position) analysis modality will be the subject of future developments; the measurement of the radioactive equilibrium state of heterogeneous geological structures and the quantitative mapping of 226Ra radioactivity are now being actively studied.

Keywords: alpha spectroscopy, digital autoradiography, mining activities, natural decay products

Procedia PDF Downloads 146
2933 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks

Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher

Abstract:

Rainfall is a highly non-linear phenomenon, whose accurate prediction requires powerful supervised data mining techniques. In this study, the Artificial Neural Network (ANN) technique is used to predict mean monthly rainfall from 31 years of historical data (1977-2006) collected at the BENINA station in Benghazi, and the results are compared against the observed values. The specific objective was to determine the best combination of weather variables to use as inputs to the ANN model. Several statistical parameters were calculated, and an uncertainty analysis of the results is also presented. The best ANN model was then applied to one year of data (2007) as a case study in order to evaluate the performance of the model. Simulation results reveal that the ANN technique is promising and can provide reliable estimates of rainfall.
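As a rough, self-contained illustration of the kind of supervised ANN used here, the sketch below trains a one-hidden-layer feed-forward network by stochastic gradient descent on toy data. The architecture, learning rate, and toy inputs are assumptions made for illustration; the study's actual network configuration and weather variables are not specified in the abstract.

```python
import math
import random

def train_ann(X, y, hidden=4, epochs=3000, lr=0.05, seed=42):
    """Tiny one-hidden-layer network (tanh hidden units, linear output)
    trained by stochastic gradient descent on squared error."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return h, sum(w * hi for w, hi in zip(W2, h)) + b2

    for _ in range(epochs):
        for x, t in zip(X, y):
            h, out = forward(x)
            err = out - t                          # d(loss)/d(out) for 0.5*err^2
            for j in range(hidden):
                gh = err * W2[j] * (1 - h[j] ** 2)  # backprop through tanh
                W2[j] -= lr * err * h[j]
                b1[j] -= lr * gh
                for i in range(n_in):
                    W1[j][i] -= lr * gh * x[i]
            b2 -= lr * err

    return lambda x: forward(x)[1]
```

For monthly rainfall prediction, each row of `X` would hold the chosen weather variables for a month and `y` the observed rainfall for that month.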

Keywords: neural networks, rainfall, prediction, climatic variables

Procedia PDF Downloads 484
2932 A New Approach for Improving Accuracy of Multi Label Stream Data

Authors: Kunal Shah, Swati Patel

Abstract:

Many real-world problems involve data that can be considered multi-label data streams. Efficient methods exist for multi-label classification in non-streaming scenarios; however, learning in evolving streaming scenarios is more challenging, as the learner must be able to adapt to change using limited time and memory. Classification is used to predict the class of an unseen instance as accurately as possible. Multi-label classification is a variant of single-label classification in which a set of labels is associated with a single instance; it is used by modern applications such as text classification, functional genomics, image classification, and music categorization. This paper introduces the task of multi-label classification, methods for multi-label classification, and evaluation measures for multi-label classification. A comparative analysis of multi-label classification methods was also carried out, first on the basis of theoretical study and then by simulation on various data sets.
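The binary relevance strategy named in the keywords can be sketched as follows: the multi-label task is decomposed into one independent binary classifier per label. The 1-nearest-neighbour base learner and the toy data below are assumptions for illustration, not the learners evaluated in the paper.

```python
class OneNN:
    """1-nearest-neighbour binary classifier, used as a toy base learner."""
    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict(self, x):
        dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in self.X]
        return self.y[dists.index(min(dists))]

class BinaryRelevance:
    """Decompose a multi-label task into one binary classifier per label."""
    def fit(self, X, Y, labels):
        self.labels = labels
        self.models = {lab: OneNN().fit(X, [int(lab in ys) for ys in Y])
                       for lab in labels}
        return self

    def predict(self, x):
        return {lab for lab in self.labels if self.models[lab].predict(x) == 1}
```

In a streaming setting, the per-label models would be replaced by incremental learners retrained over a sliding window, in the spirit of the "multiple windows with buffer" approach in the keywords.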

Keywords: binary relevance, concept drift, data stream mining, MLSC, multiple window with buffer

Procedia PDF Downloads 580
2931 The Practice and Research of Computer-Aided Language Learning in China

Authors: Huang Yajing

Abstract:

Context: Computer-aided language learning (CALL) in China has undergone significant development over the past few decades, with distinct stages marking its evolution. This paper aims to provide a comprehensive review of the practice and research in this field in China, tracing its journey from the early stages of audio-visual education to the current multimedia network integration stage. Research Aim: The study aims to analyze the historical progression of CALL in China, identify key developments in the field, and provide recommendations for enhancing CALL practices in the future. Methodology: The research employs document analysis and literature review to synthesize existing knowledge on CALL in China, drawing on a range of sources to construct a detailed overview of the evolution of CALL practices and research in the country. Findings: The review highlights the significant advancements in CALL in China, showcasing the transition from traditional audio-visual educational approaches to the current integrated multimedia network stage. The study identifies key milestones, technological advancements, and theoretical influences that have shaped CALL practices in China. Theoretical Importance: The evolution of CALL in China reflects not only technological progress but also shifts in educational paradigms and theories. The study underscores the significance of cognitive psychology as a theoretical underpinning for CALL practices, emphasizing the learner's active role in the learning process. Data Collection and Analysis Procedures: Data collection involved extensive review and analysis of documents and literature related to CALL in China. The analysis was carried out systematically to identify trends, developments, and challenges in the field. 
Questions Addressed: The study addresses the historical development of CALL in China, the impact of technological advancements on teaching practices, the role of cognitive psychology in shaping CALL methodologies, and the future outlook for CALL in the country. Conclusion: The review provides a comprehensive overview of the evolution of CALL in China, highlighting key stages of development and emerging trends. The study concludes by offering recommendations to further enhance CALL practices in the Chinese context.

Keywords: English education, educational technology, computer-aided language teaching, applied linguistics

Procedia PDF Downloads 49
2930 Learning Grammars for Detection of Disaster-Related Micro Events

Authors: Josef Steinberger, Vanni Zavarella, Hristo Tanev

Abstract:

Natural disasters cause tens of thousands of victims and massive material damage. We refer to the individual consequences of natural disasters, such as damage to people, infrastructure, vehicles, services, and resource supplies, as micro events. This paper addresses the problem of micro-event detection in online media sources. We present a natural language grammar learning algorithm, based on distributional clustering and the detection of word collocations, and apply it to online news. We also explore the extraction of micro-events from social media and describe a Twitter mining robot, which uses combinations of keywords to detect tweets that talk about the effects of disasters.
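The keyword-combination step of such a robot can be sketched as follows. The event types and keyword sets here are invented for illustration; they are not the grammars learned by the authors' algorithm.

```python
import re

# Each micro-event type is detected when all keywords in at least one of its
# combinations occur in the tweet (names and rules are illustrative assumptions).
EVENT_RULES = {
    "infrastructure_damage": [{"bridge", "collapsed"}, {"road", "blocked"}],
    "casualties": [{"people", "killed"}, {"injured"}],
}

def detect_micro_events(tweet):
    """Return the event types whose keyword combinations match the tweet."""
    tokens = set(re.findall(r"[a-z]+", tweet.lower()))
    return sorted(event for event, combos in EVENT_RULES.items()
                  if any(combo <= tokens for combo in combos))
```

A learned grammar would generalize these hand-written keyword sets, e.g. by clustering distributionally similar words into the same slot.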

Keywords: online news, natural language processing, machine learning, event extraction, crisis computing, disaster effects, Twitter

Procedia PDF Downloads 476
2929 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception

Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu

Abstract:

Opinion mining (OM) is one of the natural language processing (NLP) problems of determining the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM are usually collected from various social media platforms. In an era where social media has considerable control over companies’ futures, it is worth understanding social media and taking action accordingly. OM comes to the fore here as the scale of the discussion about companies increases and it becomes unfeasible to gauge opinion at the individual level, so companies opt to automate this process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, commonly used in computer vision (CV) problems, has lately started to shape NLP approaches and language models (LM). This gave a sudden rise to the usage of pretrained language models (PTM), which contain language representations obtained by training on large datasets with self-supervised learning objectives. The PTMs are further fine-tuned on a specialized downstream-task dataset to produce efficient models for various NLP tasks such as OM, NER (Named-Entity Recognition), Question Answering (QA), and so forth. In this study, traditional and modern NLP approaches have been evaluated for OM using a sizable corpus belonging to a large private company, containing about 76,000 comments in Turkish: an SVM with a bag of n-grams, and two chosen pre-trained models, the multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT).
The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks; it uses masked language modelling and next-sentence prediction tasks that allow bidirectional training of the transformers. During the training phase, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since experiments showed that their contribution to model performance was insignificant even though Turkish is a highly agglutinative and inflective language. The results show that deep learning methods with pre-trained, fine-tuned models achieve an improvement of about 11 percentage points over the SVM for OM: the BERT model achieved around 94% prediction accuracy, the MUSE model around 88%, and the SVM around 83%. The MUSE multilingual model shows better results than the SVM, but it still performs worse than the monolingual BERT model.
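As a rough illustration of the traditional baseline, the sketch below extracts bag-of-n-gram features and trains a linear classifier on them. A simple perceptron stands in for the SVM, and the toy documents and labels are invented for the example; the actual study trained an SVM on roughly 76,000 Turkish comments.

```python
from collections import Counter

def ngrams(text, n_max=2):
    """Bag of word uni- and bigrams, the classic OM/SA feature set."""
    words = text.lower().split()
    feats = Counter(words)
    for n in range(2, n_max + 1):
        feats.update(" ".join(words[i:i + n])
                     for i in range(len(words) - n + 1))
    return feats

def train_linear(docs, labels, epochs=10):
    """Perceptron over n-gram features; labels must be +1 or -1."""
    w = Counter()
    for _ in range(epochs):
        for doc, y in zip(docs, labels):
            feats = ngrams(doc)
            score = sum(w[f] * c for f, c in feats.items())
            if y * score <= 0:                 # mistake-driven update
                for f, c in feats.items():
                    w[f] += y * c
    return w

def predict(w, doc):
    return 1 if sum(w[f] * c for f, c in ngrams(doc).items()) > 0 else -1
```

The fine-tuned PTM approach replaces these sparse, hand-crafted features with dense contextual representations learned during pretraining.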

Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish

Procedia PDF Downloads 139
2928 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process, which makes data imputation an essential matter. In this work, the methods described in prior work are used to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximum Overlapping Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
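Tukey's method flags values outside the fences [Q1 - 1.5·IQR, Q3 + 1.5·IQR]. A minimal sketch follows; the toy price series is invented, and the MODWT stage used alongside it in the study is omitted here.

```python
def tukey_outliers(values, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences)."""
    xs = sorted(values)

    def quantile(q):
        # Linear interpolation between order statistics.
        pos = q * (len(xs) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return [x for x in values if x < lower or x > upper]
```

A detected outlier could then be imputed, for example, by the median of its neighbours before further analysis.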

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 79
2927 Cardiovascular Disease Prediction Using Machine Learning Approaches

Authors: P. Halder, A. Zaman

Abstract:

It is estimated that heart disease accounts for one in ten deaths worldwide, and according to the World Health Organization it is among the leading causes of death in the United States. Cardiovascular diseases (CVDs) account for one in four U.S. deaths, according to the Centers for Disease Control and Prevention (CDC). According to statistics, women are more likely than men to die from heart disease as a result of strokes, while a 50% increase in men's mortality was reported by the World Health Organization in 2009. The consequences of cardiovascular disease are severe, and its causes include diabetes, high blood pressure, high cholesterol, abnormal pulse rates, etc. Machine learning (ML) can be used to make predictions and decisions in the healthcare industry, and scientists have therefore turned to modern technologies like machine learning and data mining to predict diseases. The disease prediction here is based on four algorithms; compared with the other algorithms, AdaBoost is considerably more accurate.
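A minimal sketch of boosting with decision stumps illustrates why AdaBoost can outperform a single weak learner. The toy features (age, cholesterol) and thresholds below are assumptions for illustration; the abstract does not describe the study's actual feature set or implementation.

```python
import math

def stump_predict(x, feat, thr, sign):
    """Axis-aligned decision stump: sign if x[feat] <= thr, else -sign."""
    return sign if x[feat] <= thr else -sign

def fit_stump(X, y, w):
    """Weighted-error-minimizing stump (error, feature, threshold, polarity)."""
    best = (1.0, 0, 0.0, 1)
    for feat in range(len(X[0])):
        for thr in sorted({x[feat] for x in X}):
            for sign in (1, -1):
                err = sum(wi for x, yi, wi in zip(X, y, w)
                          if stump_predict(x, feat, thr, sign) != yi)
                if err < best[0]:
                    best = (err, feat, thr, sign)
    return best

def adaboost(X, y, rounds=10):
    """Classic AdaBoost over decision stumps; labels must be +1 or -1."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, feat, thr, sign = fit_stump(X, y, w)
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, feat, thr, sign))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(x, feat, thr, sign))
             for x, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]

    def predict(x):
        total = sum(a * stump_predict(x, f, t, s) for a, f, t, s in ensemble)
        return 1 if total > 0 else -1
    return predict
```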

Keywords: heart disease, cardiovascular disease, coronary artery disease, feature selection, random forest, AdaBoost, SVM, decision tree

Procedia PDF Downloads 148
2926 Value Analysis of Islamic Banking and Conventional Banking to Measure Value Co-Creation

Authors: Amna Javed, Hisashi Masuda, Youji Kohda

Abstract:

This study examines the value analysis of Islamic and conventional banking services in Pakistan. Many scholars have focused on the co-creation of value in services, but mainly on economic rather than non-economic values. Islamic banking is based on Islamic principles that are more concerned with non-economic values (well-being, partnership, fairness, trustworthiness, and justice) than with economic value in the form of interest. This study is important for understanding the providers' point of view on the co-created values, which may be more sustainable and appropriate for today’s unpredictable socioeconomic environment. Data were collected from 4 banks (2 Islamic and 2 conventional banks). A text mining technique was applied for data analysis, and values with 100% occurrence in Islamic banking were chosen. The results reflect that Islamic banking is centred more on non-economic than economic values and that it promotes the teamwork and partnership concept by applying the Islamic spirit and the concept of trustworthiness.

Keywords: economic values, Islamic banking, non-economic values, value system

Procedia PDF Downloads 457
2925 The Importance of Self-Efficacy and Collective Competence Beliefs in Managerial Competence of Sports Managers'

Authors: Şenol Yanar, Sinan Çeli̇kbi̇lek, Mehmet Bayansalduz, Yusuf Can

Abstract:

Managerial competence is defined as the skills that managers in managerial positions have in relation to managerial responsibilities and managerial duties. Today's organizations, operating in a competitive environment, want to work with effective managers in order to hold a more advantageous position than the organizations they compete with. The self-efficacy and collective competence beliefs that determine managers' capacity to assume managerial responsibility are therefore of special importance. In this framework, the aim of this study is to examine the effects of sports managers' perceptions of self-efficacy and collective competence on their perceptions of managerial competence. The study also analyzes whether there is a significant difference between the self-efficacy, collective competence, and managerial competence levels of sports managers in terms of gender, age, duty status, years of service, and level of education. 248 sports managers, working at least at the level of chief in the central and field organization of the department of sports services, were chosen by random sampling and voluntarily participated in the study. The self-efficacy scale developed by Schwarzer and Jerusalem (1995), the collective competence scale developed by Goddard, Hoy, and Woolfolk-Hoy (2000), and the managerial competence scale developed by Cetinkaya (2009) were used as data collection tools. The questionnaire also includes a personal information form consisting of 5 questions on gender, age, duty status, years of service, and level of education.
In the study, Pearson correlation analysis was used to determine the correlations among sports managers' self-efficacy, collective competence beliefs, and managerial competence levels, and regression analysis was used to determine the effect of self-efficacy and collective competence beliefs on the perception of managerial competence. T-tests for binary groupings and ANOVA for more than two groups were used to determine whether there is any significant difference in the levels of self-efficacy, collective competence, and managerial competence in terms of the participants' duty status, years of service, and level of education. According to the results, there is a positive correlation between sports managers' self-efficacy, collective competence beliefs, and managerial competence levels. The regression analysis shows that managers' perceptions of self-efficacy and collective competence beliefs significantly predict the perception of managerial competence. The results also show no significant difference in the self-efficacy, collective competence, and managerial competence levels of sports managers in terms of duty status, years of service, or level of education.
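The correlation step of such an analysis can be sketched in a few lines. The toy scores below are invented for illustration and do not come from the study's data.

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value near +1 would correspond to the positive association reported between self-efficacy and managerial competence scores.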

Keywords: sports manager, self-efficacy, collective competence, managerial competence

Procedia PDF Downloads 229
2924 A Development of Science Instructional Model Based on Stem Education Approach to Enhance Scientific Mind and Problem Solving Skills for Primary Students

Authors: Prasita Sooksamran, Wareerat Kaewurai

Abstract:

STEM is an integrated teaching approach promoted by the Ministry of Education in Thailand; STEM Education integrates the teaching of Science, Technology, Engineering, and Mathematics, and Thai teachers have questioned how to integrate it into the classroom. The main objective of this study is therefore to develop a science instructional model based on the STEM approach to enhance the scientific mind and problem-solving skills of primary students. This participatory action research followed two steps: 1) develop the model and 2) seek the advice of experts regarding the teaching model. Developing the instructional model began with the collection and synthesis of information from relevant documents, related research, and other sources in order to create a prototype instructional model, followed by an examination of the validity and relevance of the model by a panel of nine experts. The findings were as follows: 1. The developed instructional model comprised principles, an objective, content, operational procedures, and learning evaluation. There were five principles: 1) learning based on the natural curiosity of primary school children, leading to knowledge inquiry, understanding, and knowledge construction; 2) learning based on the interrelation between people and the environment; 3) learning based on concrete learning experiences, exploration, and the seeking of knowledge; 4) learning based on the self-construction of knowledge, creativity, and innovation; and 5) learning that relates findings to real life and the solving of real-life problems. The objective of the model is to enhance the scientific mind and problem-solving skills, with children evaluated according to their achievements. Lesson content is based on science as a core subject, integrated with technology and mathematics at the grade 6 level according to The Basic Education Core Curriculum 2008 guidelines.
The operational procedures consisted of 6 steps: 1) Curiosity, 2) Collection of data, 3) Collaborative planning, 4) Creativity and innovation, 5) Criticism, and 6) Communication and service. The learning evaluation is an authentic assessment based on continuous evaluation of all the material taught. 2. The experts agreed that the science instructional model based on the STEM Education approach had an excellent level of validity and relevance (mean 4.67, S.D. 0.50).

Keywords: instructional model, STEM education, scientific mind, problem solving

Procedia PDF Downloads 188
2923 The Perception of Teacher Candidates' on History in Non-Educational TV Series: The Magnificent Century

Authors: Evren Şar İşbilen

Abstract:

As is known, movies and TV series occupy a large part of the daily lives of adults and children in our era. In the present study, the most popular historical TV series of recent years in Turkey, “Muhteşem Yüzyıl” (The Magnificent Century), was selected as the sample for data collection in order to explore university students' perception of history. The data collected were analyzed both qualitatively and quantitatively. The findings are discussed in relation to the possible educative effects of historical non-educational TV series and movies on students' perceptions of history. Additionally, suggestions are made regarding the positive utilization of non-educational TV series and movies in education.

Keywords: education, history, movies, teacher candidates

Procedia PDF Downloads 331