Search results for: WEKA data mining tool
26242 Research on Construction of Subject Knowledge Base Based on Literature Knowledge Extraction
Authors: Yumeng Ma, Fang Wang, Jinxia Huang
Abstract:
Researchers put forward higher requirements for efficient acquisition and utilization of domain knowledge in the big data era. As literature is an effective way for researchers to quickly and accurately understand the research situation in their field, knowledge discovery based on literature has become a new research method. As a tool to organize and manage knowledge in a specific domain, the subject knowledge base can be used to mine and present the knowledge behind the literature to meet users' personalized needs. This study designs the construction route of the subject knowledge base for specific research problems. An information extraction method based on knowledge engineering is adopted. Firstly, the subject knowledge model is built through the abstraction of the research elements. Then, under the guidance of the knowledge model, extraction rules for knowledge points are compiled to analyze, extract, and correlate entities, relations, and attributes in the literature. Finally, a database platform based on this structured knowledge is developed that can provide a variety of services, such as knowledge retrieval, knowledge browsing, knowledge Q&A, and correlation visualization. Taking construction practices in the field of activating blood circulation and removing stasis as an example, this study analyzes how to construct a subject knowledge base based on literature knowledge extraction. As the system functional test shows, this subject knowledge base can realize the expected service scenarios, such as quick knowledge queries, related discovery of knowledge and literature, and knowledge organization.
As this study enables the subject knowledge base to help researchers locate and acquire deep domain knowledge quickly and accurately, it provides a transformation mode for knowledge resource construction and personalized precision knowledge services in the data-intensive research environment.
Keywords: knowledge model, literature knowledge extraction, precision knowledge services, subject knowledge base
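The rule-based extraction step described above can be illustrated with a minimal sketch: a hand-written pattern pulls (entity, relation, entity) triples out of abstract-style sentences. The rule, the relation name, and the herb/condition examples are all hypothetical placeholders, not rules or data from the paper.

```python
import re

# Hypothetical extraction rule: capture "<herb> treats <condition>" patterns.
# The rule and example sentences are illustrative, not from the study.
RULES = [
    ("treats", re.compile(r"(\w[\w\s]*?)\s+treats\s+(\w[\w\s]*?)(?:[.,]|$)")),
]

def extract_triples(text):
    """Apply each rule and return (subject, relation, object) triples."""
    triples = []
    for relation, pattern in RULES:
        for m in pattern.finditer(text):
            triples.append((m.group(1).strip(), relation, m.group(2).strip()))
    return triples

sentence = "Danshen treats blood stasis. Chuanxiong treats headache."
print(extract_triples(sentence))
```

In a real system such rules would be compiled per knowledge-model slot and the resulting triples loaded into the database platform.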
Procedia PDF Downloads 163
26241 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0
Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini
Abstract:
Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating different business models and revenue streams. This paper focuses on the issue of data being stored in data silos with different schemas and structures. Conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. Here, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company's data and utilizes a knowledge graph for data storage and exploration.
Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling
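The core mapping idea can be sketched with a tiny in-memory triple store: fields from different silos are mapped onto one shared semantic property, so queries resolve both to the same concept. The `mappedTo` predicate and the ECLASS-style identifier below are made-up placeholders, not the paper's actual AAS model.

```python
# Minimal in-memory knowledge graph for semantically lifted company data.
# Predicate names and the ECLASS-style IRI are illustrative placeholders.
class KnowledgeGraph:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the given pattern (None = wildcard)."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

kg = KnowledgeGraph()
# Map raw column names from two data silos to one shared semantic property.
kg.add("erp:motor_temp", "mappedTo", "eclass:0173-1#02-AAB663")   # placeholder ID
kg.add("mes:temp_sensor_1", "mappedTo", "eclass:0173-1#02-AAB663")
# Both silo fields now resolve to the same semantic concept:
print(kg.query(predicate="mappedTo", obj="eclass:0173-1#02-AAB663"))
```

A production system would use an RDF store and real AAS submodels, but the interoperability pattern is the same: silo fields become interchangeable once they share a dictionary entry.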
Procedia PDF Downloads 94
26240 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes
Authors: Nadarajah I. Ramesh
Abstract:
Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). 
Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model
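The DSPP construction above can be sketched as a two-state Markov-modulated Poisson process: bucket-tip times arrive with an intensity that switches with a hidden Markov state. All parameter values below are illustrative, not fitted values from the paper.

```python
import random

# Minimal simulation of a two-state doubly stochastic Poisson process:
# tip times N(t) arrive at a rate governed by a hidden Markov state X(t).
# Parameter values are illustrative only.
def simulate_mmpp(rates, switch_rates, t_end, seed=1):
    rng = random.Random(seed)
    t, state, events = 0.0, 0, []
    while t < t_end:
        # Sojourn in the current hidden state is exponentially distributed.
        t_switch = rng.expovariate(switch_rates[state])
        # Homogeneous Poisson arrivals within the sojourn (rates must be > 0).
        t_local = rng.expovariate(rates[state])
        while t_local < t_switch and t + t_local < t_end:
            events.append(t + t_local)
            t_local += rng.expovariate(rates[state])
        t += t_switch
        state = 1 - state
    return events

# State 0: near-dry (0.1 tips per unit time); state 1: wet (5.0 tips).
tips = simulate_mmpp(rates=(0.1, 5.0), switch_rates=(0.5, 1.0), t_end=100.0)
print(len(tips))
```

Fitting, as the abstract notes, would maximize the likelihood obtained by conditioning on the hidden state path rather than simulating it.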
Procedia PDF Downloads 278
26239 Transport Related Air Pollution Modeling Using Artificial Neural Network
Authors: K. D. Sharma, M. Parida, S. S. Jain, Anju Saini, V. K. Katiyar
Abstract:
Air quality models form one of the most important components of an urban air quality management plan. Various statistical modeling techniques (regression, multiple regression, and time series analysis) have been used to predict air pollution concentrations in the urban environment. These models calculate pollution concentrations from observed traffic, meteorological, and pollution data after an appropriate relationship has been obtained empirically between these parameters. The artificial neural network (ANN) is increasingly used as an alternative tool for modeling pollutants from vehicular traffic, particularly in urban areas. In the present paper, an attempt has been made to model traffic air pollution, specifically CO concentration, using neural networks. For CO concentration, two scenarios were considered: the first with only classified traffic volume as input, and the second with both classified traffic volume and meteorological variables. The results showed that CO concentration can be predicted with good accuracy using an artificial neural network (ANN).
Keywords: air quality management, artificial neural network, meteorological variables, statistical modeling
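The second scenario can be sketched with a toy feedforward network mapping traffic volume and one meteorological variable to CO concentration. The data, the linear response, and the network size are synthetic assumptions; nothing here reproduces the study's model.

```python
import numpy as np

# Toy ANN: [traffic volume, wind speed] -> CO concentration.
# Synthetic data stand in for the study's observations.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))    # [traffic volume, wind speed]
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5     # synthetic CO response

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr, n = 0.1, len(y)
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                      # forward pass
    err = (h @ W2 + b2).ravel() - y
    dh = (err[:, None] * W2.T) * (1.0 - h ** 2)   # backprop through tanh
    W2 -= lr * (h.T @ err[:, None]) / n
    b2 -= lr * err.mean()
    W1 -= lr * (X.T @ dh) / n
    b1 -= lr * dh.mean(axis=0)

pred = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(round(rmse, 3))
```

A real application would hold out a validation set and normalize inputs; this sketch only shows the scenario-two input structure.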
Procedia PDF Downloads 524
26238 Barriers to Participation in Sport for Children without Disability: A Systematic Review
Authors: S. Somerset, D. J. Hoare
Abstract:
Participation in sport is linked to better mental and physical health in children and adults. Studies have shown that children who participate in sports benefit from improved social skills, self-confidence, communication skills, and a better quality of life. Children who participate in sports from a young age are also more likely to continue to have active lifestyles during adulthood. This is an important consideration in a nation where physical activity levels are declining and the incidence of obesity is rising. Getting children active and keeping them active can provide long-term health benefits to the individual but also a potential reduction in health costs in the future. This systematic review aims to identify the barriers to participation in sport for children aged up to 18 years and encompasses both qualitative and quantitative studies. The bibliographic databases EMBASE, Medline, CINAHL, and SportDiscus were searched. Additional hand searches were carried out on review articles found in the searches to identify any studies that may have been missed. Studies involving children up to 18 years without additional needs, focusing on barriers to participation in sport, were included. Randomised control trials, policy guidelines, studies with sport as an intervention, and studies focusing on the female athlete triad, tobacco abuse, alcohol abuse, drug abuse, pre-exercise testing, and cardiovascular disease were excluded. Abstract review, full paper review, and quality appraisal were conducted by two researchers. A consensus meeting took place to resolve any differences at the abstract, full text, and data extraction / quality appraisal stages. The CASP qualitative studies appraisal tool and the CASP cohort studies tool (excluding questions 3 and 4, which refer to interventions) were used for quality appraisal in this review. The review identified several salient barriers to participation in sport for children.
These barriers ranged from the uniform worn during school physical education lessons to the weather during participation in sport. The most commonly identified barriers in the review include parental support, time allocation, location of the activity, and the cost of the activity. Therefore, it would be beneficial for greater provision to be made within the school environment for children to participate in sport. This could reduce the cost and time commitment required from parents to encourage participation, which would help to increase children's activity levels, and that ultimately can only be a good thing.
Keywords: barrier, children, participation, sport
Procedia PDF Downloads 361
26237 A Comparative Human Rights Analysis of Expulsion as a Counterterrorism Instrument: An Evaluation of Belgium
Authors: Louise Reyntjens
Abstract:
Where criminal law used to be the traditional response to the terrorist threat, European governments are increasingly relying on administrative paths. The reliance on immigration law fits into this trend. Terrorism is seen as a civilization menace emanating from abroad. In this context, the expulsion of dangerous aliens, immigration law's core task, is put forward as a key security tool. Governments all over Europe are focusing on removing dangerous individuals from their territory rather than bringing them to justice. This research reflects on the consequences for the expelled individuals' fundamental rights. For this, the author selected four European countries for a comparative study: Belgium, France, the United Kingdom, and Sweden. All these countries face similar social and security issues, igniting the recourse to immigration law as a counterterrorism tool. Yet, they adopt very different approaches: the United Kingdom positions itself on the repressive side of the spectrum. Sweden, on the other hand, also 'securitized' its immigration policy after the recent terrorist hit in Stockholm, but remains on the tolerant side of the spectrum. Belgium and France are situated in between. This paper addresses the situation in Belgium. In 2017, the Belgian parliament introduced several legislative changes by which it considerably expanded and facilitated the possibility to expel unwanted aliens. First, the expulsion measure was subjected to new and questionable definitions: a serious attack on the nation's safety used to be required to expel certain categories of aliens. Presently, mere suspicions suffice to fulfil the new definition of a 'serious threat to national security'. This definition fails to respond to the principle of legality; neither the law nor the preparatory works clarify what is meant by 'a threat to national security'.
This creates the risk of submitting this concept's interpretation almost entirely to the discretion of the immigration authorities. Secondly, in the name of intervening more quickly and efficiently, the automatic suspensive appeal for expulsions was abolished. The European Court of Human Rights nonetheless requires such an automatic suspensive appeal under Articles 13 and 3 of the Convention. Whether this procedural reform will stand to endure is thus questionable. This contribution also raises questions regarding expulsion's efficacy as a key security tool. In a globalized and mobilized world, particularly in a European Union with no internal boundaries, questions can be raised about the usefulness of this measure. Even more so, by simply expelling a dangerous individual, States avoid their responsibility and shift the risk to another State. Criminal law might in these instances be more capable of providing a conclusive and long-term response. This contribution explores the human rights consequences of expulsion as a security tool in Belgium. It also offers a critical view on its efficacy for protecting national security.
Keywords: Belgium, counter-terrorism and human rights, expulsion, immigration law
Procedia PDF Downloads 127
26236 Economic Characteristics of Bitcoin: "An Analytical Study"
Authors: Abdelhalem Shahen
Abstract:
The world is now experiencing a digital revolution and greatly accelerated technological developments, in addition to the transition from the economy in its traditional form to the digital economy, which has resulted in the emergence of new tools appropriate to those developments. Against this background, this paper attempts to explore the economic characteristics of the Bitcoin currency that has circulated recently, due to the many advantages that distinguish it from money in its traditional forms, which have a range of economic effects. The study found that Bitcoin is among the technological innovations that contain a set of characteristics worth studying, characteristics that make it the focus of attention: it is a digital currency with a peer-to-peer architecture, lower and faster transaction costs, transparency, decentralized control, privacy, and protection against double-spending, as well as cryptographic security, and finally mining.
Keywords: bitcoin, digital currencies, digital economics, features of bitcoin
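The mining and cryptographic-security characteristics mentioned above can be illustrated with a toy proof-of-work loop: find a nonce whose hash meets a difficulty target. This is a didactic simplification; real Bitcoin mining uses double SHA-256 over a binary block header and a far higher difficulty.

```python
import hashlib

def mine(block_data: str, difficulty: int = 3) -> int:
    """Toy Bitcoin-style proof of work: find a nonce such that
    SHA-256(data + nonce) starts with `difficulty` zero hex digits."""
    nonce = 0
    prefix = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

nonce = mine("block-1", difficulty=3)
print(nonce)
```

Verification is cheap (one hash) while finding the nonce is expensive, which is what decentralizes control over who may append blocks.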
Procedia PDF Downloads 138
26235 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence
Authors: Muhammad Bilal Shaikh
Abstract:
Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. 
Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.
Keywords: multimodal AI, computer vision, NLP, mineral processing, mining
Procedia PDF Downloads 68
26234 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the Cloud without decrypting the encrypted data; it thereby works towards a model of computing over encrypted data that can enhance the privacy, confidentiality, and availability of users' big data. The cryptographic model applied for the computational process on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in Cloud computing, with detailed theoretical mathematical concepts for the fully homomorphic encryption models. This contribution supports the full implementation of a big-data-analytics-based cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme
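The core idea of computing on ciphertexts can be shown at toy scale with the Paillier cryptosystem. Note the hedge: Paillier is only *additively* homomorphic, not fully homomorphic as in the paper's scheme, and the tiny primes below make it completely insecure; it serves only to demonstrate that ciphertexts can be combined so that decryption yields the sum of the plaintexts.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic). NOT secure: tiny primes,
# illustrative only. Multiplying ciphertexts mod n^2 adds the plaintexts.
p, q = 11, 13
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1                          # standard simple choice of generator

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be coprime to n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    def L(u):
        return (u - 1) // n
    mu = pow(L(pow(g, lam, n2)), -1, n)
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 17, 25
c = (encrypt(a) * encrypt(b)) % n2   # multiply ciphertexts...
print(decrypt(c))                    # ...to add plaintexts: prints 42
```

A fully homomorphic scheme additionally supports multiplication of plaintexts under encryption (via bootstrapping, as the keywords note), which is what enables arbitrary computation in the Cloud.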
Procedia PDF Downloads 380
26233 Role of Agricultural Journalism in Diffusion of Farming Technologies
Authors: Muhammad Luqman, Mujahid Karim
Abstract:
Agricultural journalism is considered an effective tool in the diffusion of agricultural technologies among members of farming communities. Various forms of agricultural journalism are used by different organizations in order to address community problems and provide solutions to them. The present study was conducted to analyze the role of agricultural journalism in the dissemination of agricultural information. The universe of the study was district Sargodha, from which a sample of 100 was collected through a validated and pre-tested questionnaire. Statistical analysis of the collected data was done with the help of SPSS. It was concluded that the majority (64.6%) of the respondents were middle-aged (31-50 years); the data also indicate a high (73.23%) literacy rate above middle-level education, and most (78.3%) of the respondents were engaged in farming. Among the various forms of agricultural journalism, radio/TV/FM is used by 99.4% of the respondents, mobile phones by 96%, magazines/newspapers/periodicals by 66.4%, and social media by 60.9%. Regarding the major areas focused on by agricultural journalism, 'helping farmers to enhance their productivity' ranked highest, with a mean of 3.98/5.00. The regression model of farmers' education on the various forms of agricultural journalism used was found to be significant.
Keywords: agricultural information, journalism, farming community, technology diffusion and adoption
Procedia PDF Downloads 195
26232 Urban Energy Demand Modelling: Spatial Analysis Approach
Authors: Hung-Chu Chen, Han Qi, Bauke de Vries
Abstract:
Energy consumption in the urban environment has attracted much research in recent decades. However, it is comparatively rare to find studies that investigate 3D spatial analysis of urban energy demand modelling. In order to analyze the spatial correlation between urban morphology and energy demand comprehensively, this paper investigates their relation using spatial regression tools, namely the ordinary least squares (OLS) and geographically weighted regression (GWR) models. The Normalized Difference Built-up Index (NDBI), Normalized Difference Vegetation Index (NDVI), and building volume describe urban morphology and act as independent variables of the Energy-land use (E-L) model. NDBI and NDVI are used as indices to describe five types of land use: urban area (U), open space (O), artificial green area (G), natural green area (V), and water body (W). Accordingly, annual electricity, gas demand, and energy demand are the dependent variables of the E-L model. The analytical results of the E-L model revealed that energy demand and urban morphology are closely connected, and the possible causes and practical uses are discussed. In addition, the OLS and GWR spatial analysis methods are compared.
Keywords: energy demand model, geographically weighted regression, normalized difference built-up index, normalized difference vegetation index, spatial statistics
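The OLS-versus-GWR contrast can be sketched on synthetic points: OLS fits one global coefficient, while GWR refits a kernel-weighted local regression at each location, letting the NDBI coefficient drift across space. The variables, coefficients, and bandwidth below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Synthetic demo: energy demand whose sensitivity to an NDBI-like index
# drifts from west to east. All numbers are illustrative.
rng = np.random.default_rng(42)
coords = rng.uniform(0, 10, size=(100, 2))         # point locations
ndbi = rng.uniform(-1, 1, 100)
y = (1.0 + 0.3 * coords[:, 0]) * ndbi + rng.normal(0, 0.05, 100)

X = np.column_stack([np.ones(100), ndbi])

# Global OLS: a single coefficient vector for the whole study area.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

def gwr_beta(at, bandwidth=2.0):
    """Local weighted least squares at location `at` (Gaussian kernel)."""
    d2 = np.sum((coords - at) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

west = gwr_beta(np.array([1.0, 5.0]))
east = gwr_beta(np.array([9.0, 5.0]))
print(round(beta_ols[1], 2), round(west[1], 2), round(east[1], 2))
```

The global OLS slope averages over the study area, whereas the two local GWR slopes bracket it, which is exactly the kind of spatial non-stationarity GWR is built to expose.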
Procedia PDF Downloads 148
26231 Protecting Privacy and Data Security in Online Business
Authors: Bilquis Ferdousi
Abstract:
With the exponential growth of online business, the threat to consumers' privacy and data security has become a serious challenge. This literature-review-based study focuses on achieving a better understanding of those threats and of the legislative measures that have been taken to address them. Research shows that people are increasingly involved in online business using different digital devices and platforms, although this practice varies across age groups. The threat to consumers' privacy and data security is a serious hindrance to developing trust among consumers in online businesses. Some legislative measures have been taken at the federal and state levels to protect consumers' privacy and data security. The study was based on an extensive review of the current literature on protecting consumers' privacy and data security and on the legislative measures that have been taken.
Keywords: privacy, data security, legislation, online business
Procedia PDF Downloads 106
26230 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm
Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan
Abstract:
This study presents a new parallel approach to clustering GPS data. The evaluation was made by comparing the execution times of various clustering algorithms on GPS data. This paper proposes a parallel, neighborhood-based K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS record represents a vehicle, and that vehicles close to each other communicate after they are clustered. This approach has been examined on continuously changing GPS data of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrate that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data
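The basic parallelization idea (split the expensive assignment step across workers) can be sketched as follows. This is not the paper's neighborhood-based algorithm; it is a generic chunked K-means over synthetic "vehicle" positions, using threads merely to show where the parallelism lives.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Two synthetic "vehicle" clusters stand in for streaming GPS points.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal((0.0, 0.0), 0.1, (50, 2)),
                 rng.normal((5.0, 5.0), 0.1, (50, 2))])

def assign_chunk(chunk, centers):
    """Label each point in the chunk with its nearest center."""
    d = np.linalg.norm(chunk[:, None, :] - centers[None, :, :], axis=2)
    return np.argmin(d, axis=1)

def parallel_kmeans(points, k=2, iters=10, workers=4):
    # Simple deterministic seeding: evenly strided points.
    centers = points[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        chunks = np.array_split(points, workers)
        with ThreadPoolExecutor(max_workers=workers) as ex:
            labels = np.concatenate(list(ex.map(assign_chunk, chunks,
                                                [centers] * workers)))
        centers = np.array([points[labels == j].mean(axis=0)
                            for j in range(k)])
    return centers, labels

centers, labels = parallel_kmeans(pts)
print(np.sort(np.round(centers[:, 0], 1)))
```

In a production setting the chunks would go to processes or machines rather than threads, and streaming data would be re-assigned incrementally rather than re-clustered from scratch.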
Procedia PDF Downloads 222
26229 An Analysis of Privacy and Security for Internet of Things Applications
Authors: Dhananjay Singh, M. Abdullah-Al-Wadud
Abstract:
The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the things in the IoT: those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. In order to address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for authenticating messages before they propagate into IoT networks.
Keywords: Internet of Things (IoT), message authentication, privacy, security
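A minimal message-authentication sketch for an IoT reading is shown below using HMAC-SHA256, a standard construction; the paper's exact mechanism may differ, and the key and payload here are made-up examples.

```python
import hmac
import hashlib

# Symmetric key provisioned on both the device and the gateway (example value).
SHARED_KEY = b"device-42-secret"

def make_packet(payload: bytes) -> bytes:
    """Append a 32-byte HMAC-SHA256 tag so the gateway can authenticate."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_packet(packet: bytes) -> bytes:
    """Recompute the tag; constant-time compare thwarts timing attacks."""
    payload, tag = packet[:-32], packet[-32:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return payload

pkt = make_packet(b'{"temp": 21.5}')
print(verify_packet(pkt))   # prints b'{"temp": 21.5}'
```

Any tampering with the payload in transit changes the expected tag and makes verification fail at the gateway.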
Procedia PDF Downloads 382
26228 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models
Authors: V. Mantey, N. Findlay, I. Maddox
Abstract:
The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to manually examine. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they are overfitting the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires fewer input data. The model developed for this study has also been fine-tuned using existing, open-source, building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. 
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
Keywords: building detection, disaster relief, mask-RCNN, satellite mapping
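The deliberate-overfitting idea can be reduced to its extreme with a 1-nearest-neighbour "model" that memorises its training set, achieving zero training error by construction. This stands in only conceptually for the fine-tuned Mask R-CNN above, which cannot be reproduced here; the feature data are synthetic.

```python
import numpy as np

# Extreme overfitting demo: a 1-NN classifier memorises the training set.
# Useful when deployment data look almost exactly like the training region.
rng = np.random.default_rng(7)
train_X = rng.uniform(0, 1, (30, 2))           # e.g. image-patch features
train_y = (train_X[:, 0] > 0.5).astype(int)    # "building" / "not building"

def predict(x):
    # Return the label of the closest memorised training point.
    return train_y[np.argmin(np.linalg.norm(train_X - x, axis=1))]

train_acc = np.mean([predict(x) == y for x, y in zip(train_X, train_y)])
print(train_acc)   # 1.0 on its own training data, by construction
```

The trade-off is exactly the one the abstract describes: such a model generalises poorly elsewhere, but when the region of interest matches the training region, it is fast to build and accurate enough for time-sensitive use.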
Procedia PDF Downloads 169
26227 The Influence of the Regional Sectoral Structure on the Socio-Economic Development of the Arkhangelsk Region
Authors: K. G. Sorokozherdyev, E. A. Efimov
Abstract:
The socio-economic development of regions and countries is an important research issue. Today, in the face of many negative events in the global and regional economies, it is especially important to identify those areas that can serve as sources of economic growth and the basis for the well-being of the population. This study aims to identify the most important sectors of the economy of the Arkhangelsk region that can contribute to the socio-economic development of the region as a whole. The Arkhangelsk region was taken as a typical Russian region that has neither significant reserves of hydrocarbons nor any large industrial complexes located on its territory. In this regard, the question of possible origins of economic growth seems especially relevant. The basis of this study is an autoregressive distributed lag (ADL) regression model developed by the authors, built on quarterly data on the socio-economic development of the Arkhangelsk region for the period 2004-2016. As a result, we obtained three equations reflecting the dynamics of three indicators of the socio-economic development of the region: the average wage, the regional GRP, and the birth rate. The influencing factors are the shares in GRP of such sectors as agriculture, mining, manufacturing, construction, wholesale and retail trade, hotels and restaurants, as well as the financial sector. The study showed that the greatest influence on the socio-economic development of the region is exerted by wholesale and retail trade, construction, and the industrial sectors. The study can be the basis for forecasting and modeling the socio-economic development of the Arkhangelsk region in the short and medium term. It can also be helpful in analyzing the effectiveness of measures aimed at stimulating particular industries of the region.
The model can be used in developing a regional development strategy.Keywords: regional economic development, regional sectoral structure, ADL model, Arkhangelsk region
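The ADL specification used in the abstract above can be sketched in a few lines of ordinary least squares. This is a minimal illustration, not the authors' model: the lag orders, coefficients, and the toy "wage vs. trade share of GRP" series below are all hypothetical.

```python
import numpy as np

def fit_adl(y, x, p=1, q=0):
    """Fit an autoregressive distributed lag model ADL(p, q) by OLS:
    y_t = c + sum_{i=1..p} a_i * y_{t-i} + sum_{j=0..q} b_j * x_{t-j}.
    Returns the coefficient vector [c, a_1..a_p, b_0..b_q]."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    m = max(p, q)
    rows = []
    for t in range(m, len(y)):
        row = [1.0]
        row += [y[t - i] for i in range(1, p + 1)]   # autoregressive lags
        row += [x[t - j] for j in range(0, q + 1)]   # distributed lags of x
        rows.append(row)
    beta, *_ = np.linalg.lstsq(np.array(rows), y[m:], rcond=None)
    return beta

# toy quarterly series: wage depends on its own lag and the trade share of GRP
rng = np.random.default_rng(0)
trade_share = rng.uniform(10, 20, 60)
wage = np.zeros(60)
for t in range(1, 60):
    wage[t] = 1.0 + 0.5 * wage[t - 1] + 0.3 * trade_share[t]
beta = fit_adl(wage, trade_share, p=1, q=0)
print(np.round(beta, 2))  # recovers [1.0, 0.5, 0.3] on this noise-free toy
```

In the paper's setting, y would be one of the three indicators (average wage, GRP, birth rate) and x the sectoral GRP shares.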
Procedia PDF Downloads 100
26226 Development and Verification of the Idom Shielding Optimization Tool
Authors: Omar Bouhassoun, Cristian Garrido, César Hueso
Abstract:
The radiation shielding design is an optimization problem with multiple, possibly constrained, objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience. The result is therefore an empirical solution, but not an optimal one, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.Keywords: optimization, shielding, nuclear, genetic algorithm
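The genetic-optimization loop that a tool like ISOT automates can be illustrated with a toy scalarized version. This sketch is not the ISOT implementation: the MCNP/Serpent transport calculation is replaced by a simple exponential-attenuation surrogate, and the material data and weight-penalty coefficient are invented for illustration.

```python
import random
import math

# hypothetical attenuation coefficient (1/cm) and density (g/cm^3) per material
MATERIALS = {"lead": (0.77, 11.3), "steel": (0.46, 7.9), "concrete": (0.15, 2.3)}

def evaluate(thickness):
    """Scalarized objective: transmitted dose fraction plus a weight penalty.
    Stand-in for a real transport-code calculation of dose rate and mass."""
    dose = math.exp(-sum(MATERIALS[m][0] * t for m, t in thickness.items()))
    weight = sum(MATERIALS[m][1] * t for m, t in thickness.items())  # per unit area
    return dose + 0.001 * weight

def mutate(ind):
    m = random.choice(list(ind))
    child = dict(ind)
    child[m] = max(0.0, child[m] + random.gauss(0, 0.5))  # perturb one layer
    return child

def genetic_search(generations=200, pop_size=20):
    pop = [{m: random.uniform(0, 5) for m in MATERIALS} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)
        survivors = pop[: pop_size // 2]          # elitist selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return min(pop, key=evaluate)

random.seed(1)
best = genetic_search()
print({m: round(t, 2) for m, t in best.items()}, round(evaluate(best), 4))
```

A true multi-objective version would keep a Pareto front of (dose, weight, price) trade-offs instead of collapsing them into one score.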
Procedia PDF Downloads 110
26225 Chip Morphology and Cutting Forces Investigation in Dry High Speed Orthogonal Turning of Titanium Alloy
Authors: M. Benghersallah, L. Boulanouar, G. List, G. Sutter
Abstract:
The present work is an experimental study of the dry high-speed turning of Ti-6Al-4V titanium alloy. The objective is to examine, at high cutting speeds, how wear occurs on the insert face and how the cutting forces and chip formation evolve. The cutting speeds tested were 600, 800, 1000 and 1200 m/min in orthogonal turning with an uncoated H13A carbide insert on a cylindrical titanium alloy part. Investigation of the worn inserts with a 3D scanning microscope revealed that crater formation is instantaneous and that chip adhesion (welded chip) causes detachment of carbide particles. In these experiments, the chip shape was systematically investigated at each cutting condition using optical microscopy. The chips produced were collected and polished to measure the thicknesses t2max and t2min, the distance dch between segments, and the inclination angle ɸseg. As described in the introduction, the shear angle ɸ and the inclination angle of a segment ɸseg are distinguished. The angle ɸseg is actually measured on the collected chips, while the shear angle ɸ cannot be. The angle ɸ represents the initial shear, similar to the one that describes the formation of a continuous chip in the primary shear zone. The cutting forces increase and then stabilize before the tool is withdrawn. The chip reaches a very high temperature.Keywords: dry high speed, orthogonal turning, chip formation, cutting speed, cutting forces
Procedia PDF Downloads 276
26224 Frequent Pattern Mining for Digenic Human Traits
Authors: Atsuko Okazaki, Jurg Ott
Abstract:
Some genetic diseases (‘digenic traits’) are due to the interaction between two DNA variants. For example, certain forms of Retinitis Pigmentosa (a genetic form of blindness) occur in the presence of two mutant variants, one in the ROM1 gene and one in the RDS gene, while the occurrence of only one of these mutant variants leads to a completely normal phenotype. Detecting such digenic traits by genetic methods is difficult. A common approach to finding disease-causing variants is to compare hundreds of thousands of variants between individuals with a trait (cases) and those without the trait (controls). Such genome-wide association studies (GWASs) have been very successful but hinge on genetic effects of single variants; that is, there should be a difference in allele or genotype frequencies between cases and controls at a disease-causing variant. Frequent pattern mining (FPM) methods offer an avenue for detecting digenic traits even in the absence of single-variant effects. The idea is to enumerate pairs of genotypes (genotype patterns), with the two genotypes originating from different variants that may be located at very different genomic positions. What is needed is for genotype patterns to be significantly more common in cases than in controls. Let Y = 2 refer to cases and Y = 1 to controls, with X denoting a specific genotype pattern. We are seeking association rules, ‘X → Y’, with high confidence, P(Y = 2|X), significantly higher than the proportion of cases, P(Y = 2), in the study. Clearly, generally available FPM methods are very suitable for detecting disease-associated genotype patterns. We used fpgrowth as the basic FPM algorithm and built a framework around it to enumerate high-frequency digenic genotype patterns and to evaluate their statistical significance by permutation analysis. Application to a published dataset on opioid dependence furnished results that could not be found with classical GWAS methodology. 
There were 143 cases and 153 healthy controls, each genotyped for 82 variants in eight genes of the opioid system. The aim was to find out whether any of these variants were disease-associated. The single-variant analysis did not lead to significant results. Application of our FPM implementation resulted in one significant (p < 0.01) genotype pattern, with both genotypes in the pattern being heterozygous and originating from two variants on different chromosomes. This pattern occurred in 14 cases and none of the controls. Thus, the pattern seems quite specific to this form of substance abuse and is also rather predictive of disease. An algorithm called Multifactor Dimensionality Reduction (MDR) was developed some 20 years ago and has been in use in human genetics ever since. MDR and our algorithm share some properties, but they are also very different in other respects. The main difference is that our algorithm focuses on patterns of genotypes, while the main object of inference in MDR is the 3 × 3 table of genotypes at two variants.Keywords: digenic traits, DNA variants, epistasis, statistical genetics
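The confidence computation at the heart of the approach above can be sketched without the full fpgrowth machinery: enumerate digenic genotype patterns and report P(Y = 2 | X) for each. This toy re-implementation omits the authors' fpgrowth framework and permutation analysis, and the variant names and genotypes below are invented.

```python
from itertools import combinations
from collections import Counter

def digenic_patterns(genotypes, labels, min_support=5):
    """Enumerate pairs of (variant, genotype) items and report the confidence
    P(case | pattern) for every pattern seen in at least min_support samples.
    genotypes: list of dicts {variant_id: genotype}; labels: 1=control, 2=case."""
    counts, case_counts = Counter(), Counter()
    for sample, label in zip(genotypes, labels):
        items = sorted(sample.items())
        for pair in combinations(items, 2):   # every digenic genotype pattern
            counts[pair] += 1
            if label == 2:
                case_counts[pair] += 1
    return {pair: case_counts[pair] / n
            for pair, n in counts.items() if n >= min_support}

# toy data: pattern (('v1','Aa'), ('v2','Bb')) occurs only in cases
cases = [{"v1": "Aa", "v2": "Bb"} for _ in range(6)]
controls = [{"v1": "AA", "v2": "Bb"} for _ in range(6)]
conf = digenic_patterns(cases + controls, [2] * 6 + [1] * 6)
print(conf[(("v1", "Aa"), ("v2", "Bb"))])  # 1.0: the pattern is case-specific
```

In a real analysis, the confidence of each pattern would then be compared against P(Y = 2) and its significance assessed by permuting the case/control labels.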
Procedia PDF Downloads 122
26223 Evaluation and Comparison of Male and Female Students’ Life Skills of Theoretical, Technical-Vocational and Job and Knowledge Branches of Secondary High School Period
Authors: Khalil Aryanfar, Shahrzad Sanjari, Elmira Hafez, Pariya Gholipor
Abstract:
The aim of this study was to evaluate and compare the life skills of male and female students in the theoretical, technical-vocational, and job and knowledge branches of the secondary high school period. The research method is a descriptive survey. The research population was 5,892 students from three high schools in Tehran; the sample size was determined as 342 according to Morgan’s table, using stratified random sampling. The data collection tool was a questionnaire designed by the researchers, with a reliability greater than 0.85. Data were analyzed using the Kruskal-Wallis and Mann-Whitney U tests. Across the three branches (theoretical, technical-vocational, and job and knowledge), no significant differences were found in the variables of academic achievement, importance of organization, problem solving, knowledge seeking, good habits, mental and physical self-concept, family orientation, and future orientation, while significant differences were found in cooperative behavior and readiness for change. Between boys and girls, variables such as academic achievement, knowledge seeking, good habits, mental and physical self-concept, future orientation, and cooperative behavior differed significantly at a confidence level of at least 0.95, as did readiness for change; however, importance of organization, problem solving, self-concept, and family orientation were not significantly different.Keywords: life skills, high school, theoretical, technical-vocational, job and knowledge
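The nonparametric comparisons described above can be reproduced in outline with scipy (assumed available). The score distributions below are invented for illustration and do not come from the study's data.

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(0)
# hypothetical life-skill scores for the three branches
theoretical = rng.normal(70, 10, 100)
technical_vocational = rng.normal(70, 10, 100)
job_knowledge = rng.normal(78, 10, 100)          # one shifted group

# Kruskal-Wallis: do the three branches differ on this variable?
h, p_branches = kruskal(theoretical, technical_vocational, job_knowledge)
print(f"Kruskal-Wallis across branches: H={h:.1f}, p={p_branches:.4f}")

# Mann-Whitney U: do boys and girls differ on one variable?
boys = rng.normal(70, 10, 100)
girls = rng.normal(70, 10, 100)
u, p_gender = mannwhitneyu(boys, girls)
print(f"Mann-Whitney U boys vs girls: U={u:.0f}, p={p_gender:.4f}")
```

Each life-skill variable in the study would get its own pair of tests of this form.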
Procedia PDF Downloads 385
26222 Cognitive Science Based Scheduling in Grid Environment
Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya
Abstract:
Grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging as the data sets involved are very large. Only two solutions exist to tackle this challenging issue. First, the computation that requires huge data sets can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing requests and creating the knowledge base. Depending upon the link capacity, a decision is taken whether to transfer the data sets or to partition them. The agents also predict the next request in order to serve the requesting site with data sets in advance. This reduces the data availability time and the data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence
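The transfer-or-partition decision attributed to the cognitive engine can be sketched as a simple rule on estimated transfer time. The rule, thresholds, and parameter names below are hypothetical stand-ins; the paper's agents presumably use richer knowledge-base reasoning.

```python
def schedule_transfer(dataset_gb, link_gbps, compute_site_load, threshold_s=600):
    """Decide whether to ship the whole data set to the computation site or to
    partition it across sites, based on estimated transfer time over the link."""
    transfer_s = dataset_gb * 8 / link_gbps        # GB -> Gb, divided by Gb/s
    if transfer_s <= threshold_s and compute_site_load < 0.8:
        return "transfer-whole"
    return "partition"

print(schedule_transfer(50, 10, 0.3))    # 40 s on a 10 Gb/s link
print(schedule_transfer(5000, 1, 0.3))   # 40000 s: partition instead
```

Prediction of the next request would then prefetch data sets for sites whose decision came out as "transfer-whole".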
Procedia PDF Downloads 394
26221 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia
Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera
Abstract:
With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet are characterized by dynamic change, with new data appearing all the time. Recent years have been characterized by the generation of large volumes of data from sources such as forums and blogs, which have expanded over time and space; together they constitute large-scale Internet data, known as Big Data. These data of technological origin, derived from the use of devices and the activity of multiple users, are becoming a source of great importance for the study of geography and tourist behavior. The study will focus on cultural heritage tourism practices in the context of Big Data. The research will explore the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, destination image, and perceptions in user-generated content will be studied through the analysis of data from Weibo, the largest microblogging social network in China. Through the analysis of the behavior of heritage tourists in the Big Data environment, this study will seek to understand the practices (activities, motivations, perceptions) of cultural tourists and, in turn, the needs and preferences of tourists, in order to better guide the sustainable development of tourism in heritage sites.Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior
Procedia PDF Downloads 138
26220 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge
Authors: M. F. Yilmaz, B. Ö. Çağlayan
Abstract:
The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components; it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) must be determined, and the relation between IMs and EDPs must be derived. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time-history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed. PGA, Sa(0.2s) and Sa(1s), the most commonly used IMs for fragility curves in the literature, are taken into consideration in terms of efficiency, practicality, and sufficiency.Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures
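The derivation of a two-parameter lognormal fragility curve from IM-EDP pairs can be sketched as a cloud analysis: regress ln(EDP) on ln(IM), take the residual dispersion as the lognormal standard deviation, and evaluate the exceedance probability against a limit state. The synthetic PGA-drift data and the drift limit below are illustrative assumptions, not results from the bridge study.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, im_records, edp_records, limit):
    """Cloud-analysis fragility: regress ln(EDP) on ln(IM), then evaluate
    P(EDP >= limit | IM) under a two-parameter lognormal assumption."""
    ln_im, ln_edp = np.log(im_records), np.log(edp_records)
    b, a = np.polyfit(ln_im, ln_edp, 1)           # ln(EDP) = a + b*ln(IM)
    resid = ln_edp - (a + b * ln_im)
    beta = resid.std(ddof=2)                      # lognormal dispersion
    return norm.cdf((a + b * np.log(im) - np.log(limit)) / beta)

# synthetic "60 time-history analyses": drift grows with PGA plus scatter
rng = np.random.default_rng(0)
pga = rng.uniform(0.05, 1.0, 60)                  # IM: PGA in g
drift = 0.02 * pga ** 1.2 * rng.lognormal(0, 0.3, 60)  # EDP: drift ratio
p_low = fragility(0.1, pga, drift, limit=0.02)
p_high = fragility(1.0, pga, drift, limit=0.02)
print(round(p_low, 3), round(p_high, 3))  # exceedance probability rises with IM
```

The same fit would be repeated with Sa(0.2s) and Sa(1s) as the IM; the residual dispersion beta is one ingredient in judging each IM's efficiency.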
Procedia PDF Downloads 357
26219 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption
Authors: Darusalam, Jorish Hulstijn, Marijn Janssen
Abstract:
The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether the institution has been working economically, efficiently, and effectively. Open Data offers solutions for the implementation of good governance in organizations that want to be more transparent. In addition, Open Data can create transparency and accountability toward the community. The objective of this paper is to build a framework for using open data for accountability in combating corruption. This paper will investigate the relationship between open data and accountability as part of anti-corruption initiatives, and the impact of open data implementation on public organizations.Keywords: open data, accountability, anti-corruption, framework
Procedia PDF Downloads 336
26218 Health State Utility Values Related to COVID-19 Pandemic Using EQ-5D: A Systematic Review and Meta-Analysis
Authors: Xu Feifei
Abstract:
The prevalence of COVID-19 is currently the biggest challenge to improving people's quality of life. Its impact on health-related quality of life (HRQoL) is highly uncertain and has not been summarized so far. The aim of the present systematic review was to assess and provide an up-to-date analysis of the impact of the COVID-19 pandemic on the HRQoL of participants who had been infected, those not infected but isolated, frontline workers, patients with different diseases, and the general population. An electronic search of the literature in the PubMed database was performed for the period from 2019 to July 2022. The PRISMA guideline methodology was employed, and data regarding HRQoL were extracted from eligible studies. Articles were included if they met the following inclusion criteria: (a) reports on the collection of health state utility value (HSUV) data related to COVID-19 from 2019 to 2021; (b) English-language and peer-reviewed journals; (c) original HSUV data; and (d) use of the EQ-5D tool to quantify HRQoL. To identify effects related to COVID-19, data on the overall HSUVs of participants who had each outcome were collected and analyzed using a one-group meta-analysis. As a result, thirty-two studies fulfilled the inclusion criteria and were included in the systematic review. A total of 45,295 participants were included, providing 219 mean HSUVs during COVID-19. Utility values ranged from 0.224 to 1. The studies included participants from Europe (n=16), North America (n=4), Asia (n=10), South America (n=1), and Africa (n=1). Twelve articles reported the HRQoL of participants who had been infected with COVID-19 (overall HSUVs ranging from 0.6125 to 0.863). Two studies reported on the population of frontline workers (overall HSUVs ranging from 0.82 to 0.93). 
Seven of the articles studied participants who had not been infected with COVID-19 but suffered from other morbidities during the pandemic (overall HSUVs ranging from 0.5 to 0.96). Thirteen studies reported the HRQoL of respondents who had not been infected with COVID-19 and had no morbidities (overall HSUVs ranging from 0.64 to 0.964). Moreover, eighteen articles reported overall HSUVs during the COVID-19 pandemic for different population groups. The estimated overall HSUV of the population with direct COVID-19 experience (n=1,333) was 0.751 (95% CI 0.670-0.832, I² = 98.64%); the estimate for the frontline population (n=610) was 0.906 (95% CI 0.854-0.957, I² = 98.61%); for participants with different diseases (n=132) it was 0.768 (95% CI 0.515-1.021, I² = 99.26%); and for the general population without infection history (n=29,892) it was 0.825 (95% CI 0.766-0.885, I² = 99.69%). In conclusion, these results suggest that COVID-19 has a negative impact on the HRQoL of the infected population and of populations with illness. The review provides practical value for cost-effectiveness model analyses of health states related to COVID-19.Keywords: COVID-19, health-related quality of life, meta-analysis, systematic review, utility value
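One-group random-effects pooling of the kind used for the overall HSUV estimates can be sketched with the DerSimonian-Laird method. The per-study means and standard errors below are invented for illustration and are not the review's data.

```python
import math

def pool_random_effects(means, ses):
    """DerSimonian-Laird random-effects pooled estimate for one-group means.
    Returns the pooled mean and its 95% confidence interval."""
    w = [1 / se ** 2 for se in ses]                # inverse-variance weights
    fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))  # Cochran's Q
    df = len(means) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * m for wi, m in zip(w_star, means)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical per-study mean utilities and standard errors
means = [0.72, 0.78, 0.75, 0.70, 0.80]
ses = [0.02, 0.03, 0.025, 0.02, 0.03]
pooled, ci = pool_random_effects(means, ses)
print(round(pooled, 3), tuple(round(v, 3) for v in ci))
```

The I² statistic reported in the review follows from the same Q and df as I² = max(0, (Q - df) / Q).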
Procedia PDF Downloads 82
26217 Using the Transtheoretical Model to Investigate Stages of Change in Regular Volunteer Service among Seniors in Community
Authors: Pei-Ti Hsu, I-Ju Chen, Jeu-Jung Chen, Cheng-Fen Chang, Shiu-Yan Yang
Abstract:
Taiwan is now an aging society. Research on the elderly should not be confined to caring for seniors but should also focus on ways to improve their health and quality of life. Senior citizens who participate in volunteer services can become less lonely, have new growth opportunities, and regain a sense of accomplishment. Thus, the question of how to get the elderly to participate in volunteer service is worth exploring. This study applies the Transtheoretical Model to understand the stages of change in regular volunteer service and voluntary service behaviour among seniors. A total of 1,525 adults over the age of 65 from the Renai district of Keelung City were interviewed. The research tool was a self-constructed questionnaire, and individual interviews were conducted to collect data. The data were then processed and analyzed using IBM SPSS Statistics 20 for Windows. In the past six months, the research subjects averaged 9.92 days of volunteer service. A majority of these elderly individuals had no intention of changing their regular volunteer services. We discovered that during the maintenance stage, self-efficacy for volunteer services was higher than during all other stages, while self-perceived barriers were lower during the preparation and action stages. Self-perceived benefits were found to have important predictive power for regular volunteer service behaviors in the earlier stages, and self-efficacy was found to have important predictive power in the later stages. The results support the conclusion that community nursing staff should group elders based on their stage of change in regular volunteer services and design appropriate behavioral change strategies.Keywords: seniors, stages of change in regular volunteer services, volunteer service behavior, self-efficacy, self-perceived benefits
Procedia PDF Downloads 426
26216 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, attaching the user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. It aims to explore the relation between Twitter geolocation data and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize existing places based on visits by the same individuals. We then combine the Twitter tracking data with questionnaire data to capture the Twitter user profile. To do this, we used frequency distribution analysis to determine the percentage of visitors. To validate the hypothesis, we compared the results with local population statistics and the land use map released by the city planning department of the Makassar local government. The results show a correlation between the Twitter geolocation data and the questionnaire data. Thus, integrating Twitter data with survey data can reveal the profile of social media users.Keywords: geolocation, Twitter, distribution analysis, human mobility
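The frequency-distribution step can be sketched as counting geotagged tweets per district. The district bounding boxes and tweet coordinates below are hypothetical, not the study's Makassar data.

```python
from collections import Counter

def district_frequencies(tweets, districts):
    """Assign each geotagged tweet to the first district whose bounding box
    contains it, then report the share of tweets per district.
    tweets: list of (lat, lon); districts: {name: (lat0, lat1, lon0, lon1)}."""
    counts = Counter()
    for lat, lon in tweets:
        for name, (lat0, lat1, lon0, lon1) in districts.items():
            if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
                counts[name] += 1
                break
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}

# hypothetical bounding boxes for two Makassar districts and a few tweets
districts = {"Tamalate": (-5.20, -5.15, 119.40, 119.45),
             "Ujung Pandang": (-5.14, -5.12, 119.40, 119.42)}
tweets = [(-5.18, 119.42), (-5.16, 119.41), (-5.13, 119.41), (-5.17, 119.44)]
freqs = district_frequencies(tweets, districts)
print(freqs)  # {'Tamalate': 0.75, 'Ujung Pandang': 0.25}
```

The resulting shares are what would be compared against the official population statistics and land-use map.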
Procedia PDF Downloads 314
26215 The Use of Videoconferencing in a Task-Based Beginners' Chinese Class
Authors: Sijia Guo
Abstract:
The development of new technologies and the falling cost of high-speed Internet access have made it easier for institutes and language teachers to adopt different ways of communicating with students at a distance. The emergence of web-conferencing applications, which integrate text, chat, audio/video and graphic facilities, offers great opportunities for language learning through a multimodal environment. This paper reports on data elicited from a Ph.D. study of the use of web-conferencing in the teaching of a first-year Chinese class in order to promote learners’ collaborative learning. Firstly, a comparison of four desktop videoconferencing (DVC) tools was conducted to determine the pedagogical value of the videoconferencing tool Blackboard Collaborate. Secondly, the evaluation of 14 campus-based Chinese learners who conducted five one-hour online sessions in the multimodal environment reveals the users’ choice of modes and their learning preferences. The findings show that the tasks designed for the web-conferencing environment contributed to the learners’ collaborative learning and second language acquisition.Keywords: computer-mediated communication (CMC), CALL evaluation, TBLT, web-conferencing, online Chinese teaching
Procedia PDF Downloads 309
26214 Perceptual Learning with Hand-Eye Coordination as an Effective Tool for Managing Amblyopia: A Prospective Study
Authors: Anandkumar S. Purohit
Abstract:
Introduction: Amblyopia is a serious condition resulting in monocular impairment of vision. Although traditional treatment improves vision, this study evaluated the results of perceptual learning. Methods: The prospective cohort study included all patients with amblyopia who underwent perceptual learning. The presenting data on vision, stereopsis, and contrast sensitivity were documented in a pretested online format, and the pre‑ and post‑treatment information was compared using descriptive, cross‑tabulation, and comparative methods in SPSS 22. Results: The cohort consisted of 47 patients (23 females and 24 males) with a mean age of 14.11 ± 7.13 years. A significant improvement in visual acuity was detected after the PL sessions, and the median follow‑up period was 17 days. Stereopsis improved significantly in all age groups. Conclusion: PL with hand-eye coordination is an effective method for managing amblyopia. This approach can improve vision in all age groups.Keywords: amblyopia, perceptual learning, hand-eye coordination, visual acuity, stereopsis, contrast sensitivity, ophthalmology
Procedia PDF Downloads 26
26213 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with only 30% of the original normal-image dataset, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.Keywords: diffusion models, anomaly detection, data-centric, generative AI
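The data-centric augmentation idea can be illustrated with a minimal one-class scorer. In this sketch, Gaussian feature vectors stand in for image features, fresh samples from the same distribution stand in for diffusion-generated normals, and a k-nearest-neighbor distance stands in for the anomaly model; none of this is the DCADDM implementation.

```python
import numpy as np

def anomaly_scores(train_feats, test_feats, k=3):
    """One-class scoring: distance to the k-th nearest normal training feature.
    Larger scores mean the sample is farther from the normal distribution."""
    d = np.linalg.norm(test_feats[:, None, :] - train_feats[None, :, :], axis=2)
    return np.sort(d, axis=1)[:, k - 1]

rng = np.random.default_rng(0)
normal_small = rng.normal(0, 1, (30, 8))      # reduced (30%) set of normals
generated = rng.normal(0, 1, (10, 8))         # stand-in for diffusion samples
train = np.vstack([normal_small, generated])  # augmented normal training set

test_normal = rng.normal(0, 1, (20, 8))
test_anom = rng.normal(4, 1, (20, 8))         # shifted distribution = anomalous
s_norm = anomaly_scores(train, test_normal)
s_anom = anomaly_scores(train, test_anom)
print(s_norm.mean() < s_anom.mean())  # True: anomalies score higher
```

Thresholding the score then yields the normal/anomalous decision; in the paper's setting the features would come from images and the augmentation from a trained diffusion model.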
Procedia PDF Downloads 82