Search results for: open data

24522 A Proof for Goldbach's Conjecture

Authors: Hashem Sazegar

Abstract:

In 1937, the Russian mathematician Vinogradov proved that every sufficiently large odd number can be written as the sum of three primes. In 1973, Chen Jingrun proved that every sufficiently large even number can be written as the sum of a prime and a number with at most two prime factors. In this article, we state a proof for Goldbach's conjecture. Introduction: Bertrand's postulate states that for every positive integer n, there is always at least one prime p such that n < p < 2n. This was first proved by Chebyshev in 1850, which is why the postulate is also called the Bertrand-Chebyshev theorem. Legendre's conjecture states that there is a prime between n² and (n+1)² for every positive integer n, which is one of the four Landau's problems. The rest of these four basic problems are: (i) Twin prime conjecture: there are infinitely many primes p such that p+2 is a prime. (ii) Goldbach's conjecture: every even integer n > 2 can be written as the sum of two primes. (iii) Are there infinitely many primes p such that p−1 is a perfect square? Problems (i), (ii), and (iii) remain open to date.
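
The statement of the conjecture (not the proof claimed above) is straightforward to check computationally for small even numbers. A minimal Python sketch, assuming SymPy is available for primality testing and intended purely as an illustration of the statement:

```python
from sympy import isprime

def goldbach_pair(n: int):
    """Return primes (p, q) with p + q == n, or None if no pair exists.

    Illustrates the statement of Goldbach's conjecture by brute force;
    this is not part of the paper's claimed proof.
    """
    assert n > 2 and n % 2 == 0, "the conjecture concerns even n > 2"
    for p in range(2, n // 2 + 1):
        if isprime(p) and isprime(n - p):
            return p, n - p
    return None

# Exhaustively confirm the conjecture for small even numbers.
assert all(goldbach_pair(n) is not None for n in range(4, 10_000, 2))
print(goldbach_pair(100))  # e.g. (3, 97)
```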

Keywords: Bertrand-Chebyshev theorem, Landau’s problems, twin prime, Legendre’s conjecture, Oppermann’s conjecture

Procedia PDF Downloads 403
24521 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network

Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour

Abstract:

Software-defined networking has in recent years attracted the attention of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes reside together within a single device in the network infrastructure such as switches and routers, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and the loss of network service for legitimate users. Thus, protecting this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis. It concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks. The framework entails an efficient defence and monitoring mechanism against DDoS attacks, employing state-of-the-art machine learning techniques.
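
The abstract names Apache Kafka and Apache Spark as the streaming substrate (see the keywords); as a simplified stand-in, the detection step can be sketched with scikit-learn on synthetic per-flow features. The feature set and traffic statistics below are hypothetical, chosen only to illustrate supervised flow classification:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical per-flow features gathered at the SDN controller:
# packets/s, bytes/s, flow duration (s), source-IP entropy.
rng = np.random.default_rng(0)
benign = rng.normal([100, 5e4, 30, 2.0], [30, 1e4, 10, 0.3], (500, 4))
ddos = rng.normal([900, 4e5, 2, 5.5], [200, 8e4, 1, 0.4], (500, 4))
X = np.vstack([benign, ddos])
y = np.array([0] * 500 + [1] * 500)  # 0 = legitimate flow, 1 = DDoS flow

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```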

Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network

Procedia PDF Downloads 169
24520 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Knowledge automation through knowledge-based systems can therefore significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgment. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competing welding processes.

Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank

Procedia PDF Downloads 168
24519 Capital Mobility in Savings and Investment across China and the ASEAN-5: Evidence from Recursive Cointegration

Authors: Chang Lee Shu-Jung, Mei-Se Chien, Chien-Chiang Lee, Hui-Ting Hu

Abstract:

This paper applies recursive cointegration analysis to examine the dynamic changes in Feldstein-Horioka saving-investment (S-I) coefficients across China and the ASEAN-5 countries over time. To the extent that the S-I coefficients measure international capital mobility, the main empirical results are as follows. The recursive trace statistics show that the investment-savings nexus varies across these six countries. There is no cointegration between investment and savings in three countries (China, Malaysia, and Singapore), which means that capital-market mobility in these three is high and that their domestic investment is financed by the global pool of capital. As for the other three countries (Indonesia, Thailand, and the Philippines), cointegration between investment and savings holds for part of the sample period: before 2002 for Thailand, before 2001 for Indonesia, and before 2002 for the Philippines. This shows that these three countries achieved highly mobile and open capital markets only later.
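
Recursive cointegration amounts to re-running a cointegration test on an expanding sample window and tracking the trace statistic over time. A minimal Python sketch of the idea, using the Johansen test from statsmodels on synthetic savings-investment series (the data are simulated, not the paper's):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical annual savings and investment rates (% of GDP) for one country,
# cointegrated by construction.
rng = np.random.default_rng(1)
n = 40
savings = np.cumsum(rng.normal(0, 1, n)) + 25
investment = savings + rng.normal(0, 0.5, n)
data = pd.DataFrame({"S": savings, "I": investment})

# Expanding-window (recursive) Johansen trace statistics.
for end in range(15, n + 1, 5):
    res = coint_johansen(data.iloc[:end], det_order=0, k_ar_diff=1)
    trace, cv95 = res.lr1[0], res.cvt[0, 1]  # H0: rank = 0; 95% critical value
    verdict = "cointegrated" if trace > cv95 else "no cointegration"
    print(f"window 1..{end:2d}: trace={trace:6.2f}  cv95={cv95:5.2f}  {verdict}")
```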

Keywords: investment, savings, recursive cointegration test, ASEAN, China

Procedia PDF Downloads 552
24518 Non-Invasive Characterization of the Mechanical Properties of Arterial Walls

Authors: Bruno Ramaël, Gwenaël Page, Catherine Knopf-Lenoir, Olivier Baledent, Anne-Virginie Salsac

Abstract:

No routine technique currently exists for clinicians to measure the mechanical properties of vascular walls non-invasively. Most of the data available in the literature come from traction or dilatation tests conducted ex vivo on native blood vessels. The objective of the study is to develop a non-invasive characterization technique based on Magnetic Resonance Imaging (MRI) measurements of the deformation of vascular walls under pulsating blood flow conditions. The goal is to determine the mechanical properties of the vessels by inverse analysis, coupling imaging measurements and numerical simulations of the fluid-structure interactions. The hyperelastic properties are identified using SolidWorks and ANSYS Workbench (ANSYS Inc.) to solve an optimization problem. The vessel of interest targeted in the study is the common carotid artery. In vivo MRI measurements of the vessel anatomy and inlet velocity profiles were acquired along the facial vascular network in a cohort of 30 healthy volunteers: - The time-evolution of the blood vessel contours and, thus, of the cross-section surface area was measured by 3D imaging angiography sequences of phase-contrast MRI. - The blood flow velocity was measured using a 2D CINE MRI phase-contrast (PC-MRI) method. Reference arterial pressure waveforms were simultaneously measured in the brachial artery using a sphygmomanometer. The three-dimensional (3D) geometry of the arterial network was reconstructed by first creating an STL file from the raw MRI data using the open-source imaging software ITK-SNAP. The resulting geometry was then transformed with SolidWorks into volumes compatible with the ANSYS software suite. Tetrahedral meshes of the wall and fluid domains were built using the ANSYS Meshing software, with near-wall mesh refinement in the case of the fluid domain to improve the accuracy of the fluid flow calculations. ANSYS Structural was used for the numerical simulation of the vessel deformation and ANSYS CFX for the simulation of the blood flow. The fluid-structure interaction simulations showed that the systolic and diastolic blood pressures of the common carotid artery could be taken as reference pressures to identify the mechanical properties of the different arteries of the network. The coefficients of the hyperelastic law were identified for the common carotid using the ANSYS design tools. Under large deformations, a stiffness of 800 kPa is measured, which is of the same order of magnitude as the Young's modulus of collagen fibers. Areas of maximum deformation were highlighted near bifurcations. This study is a first step towards patient-specific characterization of the mechanical properties of the facial vessels. The method is currently being applied to patients suffering from facial vascular malformations and to patients scheduled for facial reconstruction. Information on the blood flow velocity as well as on the vessel anatomy and deformability will be key to improving surgical planning in the case of such vascular pathologies.

Keywords: identification, mechanical properties, arterial walls, MRI measurements, numerical simulations

Procedia PDF Downloads 319
24517 Photo-Reflective Mulches For Saving Water in Agriculture

Authors: P. Mormile, M. Rippa, G. Bonanomi, F. Scala, Changrong Yan, L. Petti

Abstract:

Within the panorama of agricultural films, photo-reflective films represent valid support for spring and summer cultivation, both in the open field and under greenhouse. Thanks to the high reflectivity of these films, the thermal aggression that causes serious problems to plants when traditional black mulch films are used is avoided. Yellow or silver photo-reflective films protect plants from damage, assure the mulching effect, give valid support to Integrated Pest Management and, according to recent trials, contribute greatly to saving water. This further advantage is determined by the high water condensation under the mulch film, which reduces the need for irrigation. Saving water also means saving energy in the electric system of water circulation. Trials performed in different geographic and climatic contexts confirm that the use of photo-reflective mulch films during the hot season saves up to 30% of irrigation water.

Keywords: photo-selective mulches, saving water, water circulation, irrigation

Procedia PDF Downloads 516
24516 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach

Authors: Kayode Balogun, Femi Ayoola

Abstract:

Crime is at an alarming rate in this part of the world, and many factors contribute to this antisocial behaviour among both the young and the old. In this work, principal component analysis (PCA) was used as a tool to reduce the dimensionality of the data, while retaining as much of the information as possible, and to identify the variables most associated with crime in the study region. Data were collected on twenty-eight crime variables from the National Bureau of Statistics (NBS) databank for a period of fifteen years. We use PCA in this study to determine the number of major variables and contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained using the scree plot and loading plot, which implies that an eight-equation solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, house breaking, false pretence, unlawful arms possession and breach of public peace.
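
The retained-components decision described above can be reproduced in outline with scikit-learn. A minimal sketch on simulated stand-in data (the NBS data themselves are not reproduced here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for the NBS data: 15 yearly records x 28 crime variables.
rng = np.random.default_rng(2)
X = rng.poisson(lam=50, size=(15, 28)).astype(float)

pca = PCA().fit(StandardScaler().fit_transform(X))
cum = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cum, 0.9381)) + 1  # components covering ~93.81% of variance
print(f"{k} components explain {cum[k - 1]:.2%} of the total variation")
print("PC1 loadings:", np.round(pca.components_[0], 2))
```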

Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables

Procedia PDF Downloads 445
24515 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and to reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is essential for deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that a calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
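
As an outline of the approach, variable selection followed by a supervised probabilistic regressor can be sketched in a few lines of scikit-learn. The data, the choice of Bayesian ridge regression as the probabilistic model, and the selection scheme below are illustrative assumptions, not the paper's exact methods:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical historical reference data: 200 samples x 30 process variables,
# of which only the first 5 are informative about the fault indicator y.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 30))
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)

# Variable selection excludes non-informative variables before calibration.
model = make_pipeline(SelectKBest(f_regression, k=5), BayesianRidge())
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```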

Keywords: calibration model, monitoring, quality improvement, feature selection

Procedia PDF Downloads 357
24514 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data; the data are converted into some other, gibberish form before being transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold cat map transformation. This provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using scrambling-degree measurement parameters, i.e., Gray Difference Degree (GDD) and correlation coefficient.
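
The Arnold cat map itself is compact enough to state in code. A minimal NumPy sketch of the scrambling step, (x, y) -> (x + y, x + 2y) mod N on a square image (the cellular-automata encryption step is omitted):

```python
import numpy as np

def arnold_cat_map(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Scramble a square grayscale image with the Arnold cat map."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the map is defined on square images"
    xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(xs + ys) % n, (xs + 2 * ys) % n] = out[xs, ys]
        out = scrambled
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
assert not np.array_equal(arnold_cat_map(img, 3), img)
# The map is periodic: enough iterations return the original image,
# which is why it is combined with an encryption layer rather than used alone.
```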

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 378
24513 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today’s heterogeneous network environment, there is a growing need for mutually distrusting clients to jointly execute secure network protocols to guard against malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or which security methodologies or standards are adopted. Security is a crucial concern in computer science, and the main aim of computer security is to keep information safe on the network. There is no need to wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, anti-malware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to a target. This paper proposes implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: network worms, malware infection, propagating malicious code, virus, security, VPN

Procedia PDF Downloads 358
24512 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-Zoubi, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. IoT active endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings considerable concerns regarding data storage, analysis, manipulation and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has in fact accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions for logistic reasons such as supply chain interruptions, shortage of shipping containers and port congestion. An IoT-blockchain system is proposed to handle, in an interactive manner, big data generated by a distributed network of sensors and controllers. The system is designed on the Ethereum platform, using smart contracts programmed in Solidity to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain in which a number of distributed IoT devices can communicate and interact within a controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical space in order to examine its feasibility, performance and costs. Initial results indicated that big IoT data retrieval and storage is feasible and interactivity is possible, provided that certain conditions of cost, speed and throughput are met.

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 150
24511 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System

Authors: Abdul-Rahman Al-Ali

Abstract:

As the number of mobile devices grows exponentially, it is estimated that about 50 billion devices will be connected to the Internet by the year 2020, and by the end of this decade an average of eight connected devices per person is expected worldwide. These 50 billion devices are not only mobile phones and data-browsing gadgets, but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) concept is one of the emerging technologies of recent years. Within smart grid technologies, smart home appliances, Intelligent Electronic Devices (IED) and Distributed Energy Resources (DER) are major IoT objects that can be made addressable using IPv6. These objects are called the smart grid Internet of Things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, high-speed processing, and massive data storage. Building a large data center infrastructure takes a long time and requires widespread communication networks and huge capital investment. Maintaining and upgrading control and data centers' infrastructure and communication networks, as well as updating and renewing software licenses, collectively incurs additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler to replace the legacy of utilities' data centers. The talk will highlight the role of IoT and cloud computing services and their development models within smart grid technologies.

Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances

Procedia PDF Downloads 324
24510 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test to determine whether an individual is positive or negative for tuberculosis. This study applies statistical analysis to the clinical interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels for the subjects over a period of six months, and hence to infer whether the anti-tuberculous treatment is effective.
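
The core test described here is a paired comparison of the same subjects at two time points. A minimal SciPy sketch on simulated values (the clinical data are not reproduced; the distributions below are assumptions):

```python
import numpy as np
from scipy import stats

# Hypothetical interferon-gamma levels (IU/mL) for 73 subjects at the start
# of treatment and after six months, with a simulated decline.
rng = np.random.default_rng(4)
baseline = rng.gamma(shape=4.0, scale=2.0, size=73)
month6 = baseline * rng.uniform(0.4, 0.9, size=73)

t_stat, p_value = stats.ttest_rel(baseline, month6, alternative="greater")
print(f"paired t = {t_stat:.2f}, one-sided p = {p_value:.2e}")
# A small p-value supports a significant decline, i.e. treatment effectiveness.
```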

Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection

Procedia PDF Downloads 306
24509 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process: about the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can form a key input for institutions' decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using the part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision tree, random forest, and naive Bayes. The proposed framework has two tasks: first, to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets, one tailored to assessment practices (assessment related) and the other non-assessment related; second, to use the same algorithms to build an optimal model, for the whole data set and the new data subsets, that automatically detects their sentiment. The significance of this paper is in comparing the performance of the above four algorithms using the part-of-speech feature against the performance of the same algorithms using n-gram features. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models: understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithms, interpreting mined patterns, and consolidating the discovered knowledge. The experiments show that models using either feature set performed very well on the first task. On the second task, however, models using the part-of-speech feature underperformed in comparison with models using unigrams and bigrams.
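
The part-of-speech feature replaces each token with its grammatical tag before vectorization, so the classifier learns from syntactic patterns rather than vocabulary. A minimal sketch with NLTK and scikit-learn, on invented feedback snippets and labels:

```python
import nltk
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def pos_features(text: str) -> str:
    """Map a sentence to its PoS-tag sequence,
    e.g. 'the exam was hard' -> 'DT NN VBD JJ'."""
    return " ".join(tag for _, tag in nltk.pos_tag(nltk.word_tokenize(text)))

feedback = ["the exam questions were far too hard",
            "coursework deadlines are well spaced out",
            "lectures were engaging and fun",
            "the group assignment weighting feels unfair"]
labels = [1, 1, 0, 1]  # 1 = assessment related, 0 = not

vec = CountVectorizer()
X = vec.fit_transform(pos_features(t) for t in feedback)
clf = LinearSVC().fit(X, labels)
print(clf.predict(vec.transform([pos_features("the quiz was too long")])))
```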

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 142
24508 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using the t-test, yielding a p-value of 7.5686E-176, far below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
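
The detection step reduces to comparing power spectra of cover and suspect streams. A minimal NumPy/SciPy sketch of the FFT-plus-t-test idea, with simulated frames standing in for captured VoIP traffic:

```python
import numpy as np
from scipy import stats

def power_spectrum(frames: np.ndarray) -> np.ndarray:
    """Per-frame power spectrum of windowed frames via the FFT."""
    return np.abs(np.fft.rfft(frames * np.hanning(frames.shape[1]), axis=1)) ** 2

rng = np.random.default_rng(5)
cover = rng.normal(size=(200, 512))  # clean frames (simulated)
stego = cover + rng.choice([-0.05, 0.05], size=cover.shape)  # crude embedding proxy

p = stats.ttest_ind(power_spectrum(cover).ravel(),
                    power_spectrum(stego).ravel(),
                    equal_var=False).pvalue
print(f"two-sample t-test p-value: {p:.3e}")  # a small p flags a stego stream
```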

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 147
24507 Horizon Scanning of Disruptive Technology Trends in Marine for 2030 Horizon

Authors: Jose Gonzalez, Fai Cheng, Ivy Fan

Abstract:

Shipping has a mature and ever-expanding worldwide market. The future of the marine industry is not only irrevocably linked with the global economic, social, and political landscape; it is also subject to technological developments in different fields, some of which may never have been linked to the marine industry before. Companies in the marine sector are becoming more dependent on technologies to achieve competitive advantage in an increasingly open market. Technologies can be fused across different business functions and geopolitical influences. A successful marine business should be prepared to embrace such potential changes that lie ahead. The present paper intends to articulate long-term marine technology strategies from an industrial perspective. The methodology and current developments are introduced. The paper also provides insight into future technological trends in demand for major commercial ship types. It may assist different stakeholders in tailoring their long-term strategies to achieve a sea change and to unlock opportunity.

Keywords: commercial sector, marine, trends, technology

Procedia PDF Downloads 409
24506 Privacy-Preserving Model for Social Network Sites to Prevent Unwanted Information Diffusion

Authors: Sanaz Kavianpour, Zuraini Ismail, Bharanidharan Shanmugam

Abstract:

Social Network Sites (SNSs) serve as an invaluable platform for transferring information across large numbers of individuals. A substantial component of communicating and managing information is identifying which individuals will influence others in propagating information, and whether information will be disseminated in the absence of social signals about it. Classifying the final audience of social data is difficult, as it is not fully possible to control the social contexts in which data are transferred among individuals. Hence, undesirable diffusion of information to an unauthorized individual on SNSs can threaten individuals' privacy. This paper highlights information diffusion in SNSs and, moreover, emphasizes the most significant privacy issues for individuals on SNSs. The goal of this paper is to propose a privacy-preserving model that treats individuals' data with care, in order to control the availability of data and improve privacy by providing access to the data for appropriate third parties without compromising the advantages of information sharing through SNSs.

Keywords: anonymization algorithm, classification algorithm, information diffusion, privacy, social network sites

Procedia PDF Downloads 321
24505 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model data regarding the time leading up to an event when censored cases exist, whereas the logistic regression model is mostly applicable when the independent variables consist of numerical as well as nominal values and the response variable is binary (dichotomous). The arguments and findings of many researchers have focused on overviews of the Cox and logistic regression models and their different applications in different areas. In this work, the analysis is done on secondary data from the SPSS exercise data set on breast cancer, with a sample size of 1121 women, where the main objective is to show the difference in application between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some analysis (e.g., on lymph node status) was done manually, while the SPSS software was used to analyze the remaining data. This study found that there is an application difference between the Cox and logistic regression models: the Cox regression model is used if one wishes to analyze data that also include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic regression model. A similarity between the two models is that both are applicable in predicting the outcome of a categorical variable, i.e., a variable that can accommodate only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models are suitable methods for analyzing data and can be applied in many other studies, but the Cox regression model is the more recommended.
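
The contrast drawn here, follow-up time and hazard ratio versus binary outcome and odds ratio, can be made concrete in a few lines. A minimal Python sketch using the lifelines and scikit-learn packages on simulated data (the variable names and distributions are assumptions, not the SPSS data set):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Hypothetical breast-cancer-style data: positive lymph nodes as the covariate,
# follow-up time in months, and a death indicator (0 = censored).
rng = np.random.default_rng(6)
n = 300
nodes = rng.poisson(3, n)
time = rng.exponential(60 / (1 + 0.2 * nodes))  # higher risk -> shorter time
event = (rng.uniform(size=n) < 0.7).astype(int)
df = pd.DataFrame({"nodes": nodes, "time": time, "event": event})

# Cox model: uses follow-up time; association measured by the hazard ratio.
cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print("hazard ratio:", float(np.exp(cox.params_["nodes"])))

# Logistic model: ignores time; association measured by the odds ratio.
logit = LogisticRegression().fit(df[["nodes"]], df["event"])
print("odds ratio:", float(np.exp(logit.coef_[0][0])))
```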

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 456
24504 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis

Authors: Sidi Yang, Haiyi Zhang

Abstract:

Twitter is a microblogging platform where millions of users daily share their attitudes, views, and opinions. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to show the emotions and sentiments found in each tweet and an efficient way to summarize the results in a manner that is clearly understood. The primary goal of this paper is to explore text mining and to extract and analyze useful information from unstructured text using two approaches, LDA topic modelling and sentiment analysis, applied to Twitter plain-text data in English. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.
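
Both steps are available off the shelf. A minimal sketch combining scikit-learn's LDA with NLTK's VADER sentiment scorer, on four invented tweets:

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

nltk.download("vader_lexicon", quiet=True)

tweets = ["love the new phone camera",
          "traffic this morning was awful",
          "phone battery life is amazing",
          "stuck in traffic again, terrible"]

# LDA: discover latent topics from term counts.
vec = CountVectorizer(stop_words="english")
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(vec.fit_transform(tweets))
terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    print(f"topic {k}:", [terms[i] for i in comp.argsort()[-3:]])

# Sentiment: VADER compound score per tweet (-1 = negative .. +1 = positive).
sia = SentimentIntensityAnalyzer()
for t in tweets:
    print(f"{sia.polarity_scores(t)['compound']:+.2f}  {t}")
```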

Keywords: text mining, Twitter, topic model, sentiment analysis

Procedia PDF Downloads 179
24503 Navigating Government Finance Statistics: Effortless Retrieval and Comparative Analysis through Data Science and Machine Learning

Authors: Kwaku Damoah

Abstract:

This paper presents a methodology and software application (App) designed to empower users in accessing, retrieving, and comparatively exploring data within the hierarchical network framework of the Government Finance Statistics (GFS) system. It examines the ease of navigating the GFS system and identifies the gaps filled by the new methodology and App. The GFS embodies a complex Hierarchical Network Classification (HNC) structure, encapsulating institutional units, revenues, expenses, assets, liabilities, and economic activities. Navigating this structure demands specialized knowledge, experience, and skill, posing a significant challenge for effective analytics and fiscal policy decision-making. Many professionals encounter difficulties deciphering these classifications, hindering confident utilization of the system. This accessibility barrier prevents a vast number of professionals, students, policymakers, and the public from leveraging the abundant data and information within the GFS. Leveraging the R programming language, data science analytics, and machine learning, an efficient methodology was developed that enables users to access, navigate, and conduct exploratory comparisons. The machine learning Fiscal Analytics App (FLOWZZ) democratizes access to advanced analytics through its user-friendly interface, breaking down expertise barriers.

Keywords: data science, data wrangling, drilldown analytics, government finance statistics, hierarchical network classification, machine learning, web application

Procedia PDF Downloads 70
24502 The Attitudes of Pre-Service Teachers towards Analytical Thinking Skill Development Based on Miller’s Model

Authors: Thassanant Unnanantn, Suttipong Boonphadung

Abstract:

This research study aimed to survey and analyze the attitudes of pre-service teachers towards analytical thinking development based on Miller's Model. The informants of this study were 22 third-year teacher students majoring in Thai. The course in which the instruction was conducted was English for Academic Purposes in Thai Language 2. The instrument of this research was an open-ended questionnaire with two dimensions of questions: an academic dimension and a satisfaction dimension. The investigation revealed positive attitudes. In the academic dimension, the largest group, 12 students (54.54%), reflected that the method of teaching analytical thinking and language simultaneously was new knowledge to them, and a similar percentage said the same of text cohesion in writing. For satisfaction, the highest frequency count came from 17 students (77.27%), who favored the openness and friendliness of the teacher.

Keywords: analytical thinking development, Miller’s Model, attitudes, pre-service teachers

Procedia PDF Downloads 309
24501 Development of an Intervention Program for Moral Education of Undergraduate Students of Sport Sciences and Physical Education

Authors: Najia Zulfiqar

Abstract:

Imparting moral education is a need of the times, considering the evident moral decline in society: recent research shows a decline in moral competence among university students. The main objective of the present study was to develop moral development intervention strategies for undergraduate students of Sports and Physical Education. Using an interpretative phenomenological approach, insight into field-specific moral issues was gained through interviews with 7 subject experts and a focus-group discussion session with 8 students. Two research assistants trained in qualitative interviewing collected and transcribed the data and analyzed them in the MAXQDA software using content and discourse analyses. The moral issues identified in Sports and Physical Education were sports gambling and betting, pay-for-play, doping, coach misconduct, tampering, cultural bias, gender equity/nepotism, bullying/discrimination, and harassment. Next, intervention modules were developed for each moral issue based on hypothetical situations, followed by guided reflection and dilemma discussion questions. The third moral development strategy was community service, which included posture screening, diet plans for different age groups, open-ground fitness training, physical fitness exercise camps, balanced-diet awareness camps, gymnastics camps, shoe assessment per health standards, and volunteering for public awareness at playgrounds, gymnasiums, stadiums, parks, etc. The intervention modules were given for expert validation to four subject specialists from different backgrounds within Sport Sciences. Upon refinement and finalization, four students were presented with the intervention modules and questioned about accuracy, relevance, comprehension, and content organization. Iterative changes were made to the content of the intervention modules to tailor them to the moral development needs of undergraduate students. This intervention will strengthen positive moral values and foster mature decision-making about right and wrong acts. As the intervention is easy to apply as a remedial tool, academicians and policymakers can use it to promote students' moral development.

Keywords: community service, dilemma discussion, morality, physical education, university students

Procedia PDF Downloads 72
24500 Value Chain Based New Business Opportunity

Authors: Seonjae Lee, Sungjoo Lee

Abstract:

Discovering new business opportunities is necessary to remain competitive in the current business environment: companies survive rapidly changing industry conditions by adopting new business strategies and reducing technology challenges. Traditionally, two methods are used to discover new businesses. The first is qualitative analysis of expert opinion, through which opportunities are gathered; the second is the discovery of new technologies through quantitative analysis of patent data. The second method increases time and cost, and patent data are of restricted use for the purpose of discovering business opportunities. This study presents new business opportunities in a form customized to a company's characteristics (sector, size, etc.) by taking a value chain perspective, thereby contributing to the creation of new business opportunities through the proposed model. It utilizes the trademark database of the Korean Intellectual Property Office (KIPO) and the proprietary company information database of Korea Enterprise Data (KED). These data are key to discovering new business opportunities through analysis of the trademarks of competitors and advanced businesses (Module 1) and trading analysis of competitors found in the KED (Module 2).

Keywords: value chain, trademark, trading analysis, new business opportunity

Procedia PDF Downloads 373
24499 The Volume–Volatility Relationship Conditional to Market Efficiency

Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese

Abstract:

The relation between stock price volatility and trading volume represents a controversial issue which has received remarkable attention over the past decades. An extensive literature shows a positive relation between price volatility and trading volume in financial markets, but the causal relationship that originates this association is an open question, from both a theoretical and an empirical point of view. In this regard, various models, which can be considered complementary rather than competitive, have been introduced to explain the relationship. They include the long-debated Mixture of Distributions Hypothesis (MDH), the Sequential Arrival of Information Hypothesis (SAIH), the Dispersion of Beliefs Hypothesis (DBH), and the Noise Trader Hypothesis (NTH). In this work, we analyze whether stock market efficiency can explain the diversity of results achieved over the years. For this purpose, we propose an alternative measure of market efficiency, based on the pointwise regularity of a stochastic process: the Hurst–Hölder dynamic exponent. In particular, we model the stock market by means of the multifractional Brownian motion (mBm), which displays the property of a time-changing regularity. Such models mostly have in common the fact that they locally behave as a fractional Brownian motion, in the sense that their local regularity at time t0 (measured by the local Hurst–Hölder exponent in a neighborhood of t0) equals the exponent of a fractional Brownian motion of parameter H(t0). Assuming that the stock price follows an mBm, we introduce and theoretically justify the Hurst–Hölder dynamical exponent as a measure of market efficiency. This allows us to measure, at any time t, the market's departure from the martingale property, i.e. from efficiency as stated by the Efficient Market Hypothesis. This approach is applied to financial markets; using data for the S&P 500 index from 1978 to 2017, we find that when efficiency is not accounted for, a positive contemporaneous relationship emerges and is stable over time, whereas it disappears as soon as efficiency is taken into account. In particular, the association is more pronounced during time frames of high volatility and tends to disappear when the market becomes fully efficient.
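
For reference, the standard harmonizable definition of the mBm and the efficiency reading of its exponent can be stated compactly; the notation below is the textbook form (up to a normalizing constant) and may differ in detail from the paper's:

```latex
% Multifractional Brownian motion, harmonizable representation:
B_{H(t)}(t) = C\bigl(H(t)\bigr) \int_{\mathbb{R}} \frac{e^{it\xi} - 1}{|\xi|^{H(t) + 1/2}} \, dW(\xi),
\qquad H : [0, \infty) \to (0, 1).

% Near t_0 the process behaves like a fractional Brownian motion of
% parameter H(t_0); the martingale (efficient) case is the Brownian one:
H(t_0) = \tfrac{1}{2} \;\Rightarrow\; \text{local efficiency at } t_0,
\qquad
H(t_0) \neq \tfrac{1}{2} \;\Rightarrow\; \text{local departure from efficiency}.
```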

Keywords: volume–volatility relationship, efficient market hypothesis, martingale model, Hurst–Hölder exponent

Procedia PDF Downloads 78
24498 Novel Spoke-Type BLDC Motor Design for Cost Effective and High Power Density

Authors: Suyong Kim

Abstract:

Recently, because of the rise in the price of rare-earth magnets, interest in non-rare-earth or less-rare-earth motors is growing. In particular, to achieve high power density, spoke-type brushless DC (BLDC) motors with ferrite permanent magnets are in the spotlight. However, the spoke-type ferrite BLDC motor suffers considerable magnetic flux leakage in the direction of the rotor shaft. There are two conventional ways to solve this problem, but they either increase product cost or decrease power density. This paper therefore proposes a new spoke-type BLDC rotor shape that combines the advantages of both conventional methods. The new shape consists of a one-piece core in which the inside and the outside of the rotor are open alternately, yielding reduced production cost and high power density.

Keywords: motor, BLDC, spoke, ferrite

Procedia PDF Downloads 573
24497 Towards Addressing the Cultural Snapshot Phenomenon in Cultural Mapping Libraries

Authors: Mousouris Spiridon, Kavakli Evangelia

Abstract:

This paper focuses on Digital Libraries (DLs) that contain and geovisualise cultural data, highlighting the need to define them as a separate category, termed Cultural Mapping Libraries, based on their inherent connection of culture with geographic location and their design requirements in support of the visual representation of cultural data on the map. An exploratory analysis of DLs that conform to the above definition brought forward the observation that existing Cultural Mapping Libraries fail to geovisualise the entirety of cultural data per point of interest, thus resulting in a Cultural Snapshot phenomenon. The existence of this phenomenon was reinforced by the results of a systematic bibliographic survey. In order to address the Cultural Snapshot, this paper proposes the use of Semantic Web principles to efficiently interconnect spatial cultural data through time, per geographic location. In this way, points of interest are transformed into scenery where culture evolves over time. This evolution is expressed as occurrences taking place chronologically, in an event-oriented approach, a conceptualization also endorsed by the CIDOC Conceptual Reference Model (CIDOC CRM). In particular, we posit the use of CIDOC CRM as the baseline for defining the logic of Cultural Mapping Libraries as part of the Culture Domain, in accordance with the Digital Library Reference Model, in order to define the rules of cultural data management by the system. Our future goal is to transform this conceptual definition into inference rules that resolve the Cultural Snapshot and lead to a more complete geovisualisation of cultural data.

Keywords: digital libraries, semantic web, geovisualization, CIDOC-CRM

Procedia PDF Downloads 109
24496 Sweden’s SARS-CoV-2 Mitigation Failure as a Science and Solutions Principle Case Study

Authors: Dany I. Doughan, Nizam S. Najd

Abstract:

Governments in today’s global pandemic are approaching the challenging and complex issue of mitigating the spread of the SARS-CoV-2 virus differently, while simultaneously considering their national economic and operational bottom lines. One of the most notable successes has been Taiwan's multifaceted virus containment approach, which resulted in a substantially lower incidence rate compared to Sweden's chief mitigation tactic of herd immunity. From a classic Swiss Cheese Model perspective, integrating more fail-safe layers of defense against the virus meant that the government in Taiwan, unlike Sweden's, did not have to resort to extreme measures like the national lockdown Sweden is currently contemplating. From the standpoint of developing an optimized virus-spread mitigation solution using the Solutions Principle, the Taiwanese and Swedish solutions were both desirable: economically for businesses that remained open, and non-economically, or socially, for individuals who enjoyed fewer disruptions from what they considered normal before the pandemic. Of the two, the Taiwanese approach was more feasible long-term from the workforce management and quality control perspective of healthcare facilities and their professionals, who were able to provide better, longer, and more attentive care to the fewer new positive COVID-19 cases. Furthermore, the Taiwanese approach was more applicable as an overall model to emulate, thanks in part to its short-term and long-term multilayered structure, which allows the kind of flexibility other governments need to fully or partially adapt or adopt said model. The Swedish approach, on the other hand, ignored the biochemical nature of the virus and relied heavily on short-term personal behavioral adjustments and conduct modifications, which are not as reliable as establishing required societal norms and awareness programs. The available international data on COVID-19 cases and the published governmental approaches to controlling the spread of the coronavirus support Taiwan's Swiss Cheese Model success story as a better fit to the Solutions Principle than Sweden's approach.

Keywords: coronavirus containment and mitigation, solutions principle, Swiss Cheese Model, viral mutation

Procedia PDF Downloads 135
24495 The Context of Teaching and Learning Primary Science to Gifted Students: An Analysis of Australian Curriculum and New South Wales Science Syllabus

Authors: Rashedul Islam

Abstract:

A firmly validated aim of teaching science is to support student enthusiasm for science learning, with a broad interest in scientific issues in later life. This is in keeping with recent developments in Gifted and Talented Education statements, which indicate that gifted students have a keen interest in and natural aptitude for science. Yet the practice of science teaching leaves many students with the feeling that science is difficult, and, compared to other school subjects, students' interest in science declines in the final years of primary school. A curriculum guides the teaching-learning activities in school, and significant consequences may result from the context of the curricula and syllabi, which are a major feature of certain educational jurisdictions such as NSW, Australia. The purpose of this study was to explore how the curriculum sets the context for the practice of science education in primary schools in Sydney, Australia. This phenomenon was explored through a review of two publicly available documents: the NSW Science Syllabus K-6 and the Australian Curriculum: Foundation - 10 Science. To analyse the data, this qualitative study applied thematic content analysis at three levels, i.e., first-cycle coding, second-cycle (pattern) coding, and thematic analysis. Preliminary analysis revealed teaching-learning practices drawn from eight themes under three phenomena, aligned with teachers' practices and gifted students' learning characteristics based on Gagné's Differentiated Model of Giftedness and Talent (DMGT). From the results, it appears that, overall, the two documents are relatively well placed in terms of identifying the context of teaching and learning primary science for gifted students. However, educators need to be aware of the ways in which the curriculum should be adapted to meet gifted students' learning needs in science. The study explores the important phenomena of the teaching-learning context for providing gifted students with optimal educational practices, including inquiry-based learning, problem-solving, open-ended tasks, creativity in science, higher-order thinking, integration, and challenge. The significance of such a study lies in its potential benefit to schools and to further research in the field of gifted education.

Keywords: teaching primary science, gifted student learning, curriculum context, science syllabi, Australia

Procedia PDF Downloads 421
24494 Optimize Data Evaluation Metrics for Fraud Detection Using Machine Learning

Authors: Jennifer Leach, Umashanger Thayasivam

Abstract:

The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, as society's knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate people, leading to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help counter this advancement. This research explores how various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent data; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which testing split and technique would lead to the optimal results.
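
The evaluation loop described above, one model scored on several train/test splits with four metrics, can be sketched directly in scikit-learn. The data set, model choice, and split sizes below are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an imbalanced fraud data set (5% positive class).
X, y = make_classification(n_samples=2000, weights=[0.95], random_state=0)

# Score the same technique across five different train/test splits.
for test_size in (0.1, 0.2, 0.3, 0.4, 0.5):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=test_size, stratify=y, random_state=0)
    pred = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr).predict(X_te)
    print(f"test={test_size:.0%}  acc={accuracy_score(y_te, pred):.3f}  "
          f"prec={precision_score(y_te, pred):.3f}  "
          f"rec={recall_score(y_te, pred):.3f}  f1={f1_score(y_te, pred):.3f}")
```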

Keywords: data science, fraud detection, machine learning, supervised learning

Procedia PDF Downloads 196
24493 Suitability of Satellite-Based Data for Groundwater Modelling in Southwest Nigeria

Authors: O. O. Aiyelokun, O. A. Agbede

Abstract:

Numerical modelling of groundwater flow can be susceptible to calibration errors due to the lack of adequate ground-based hydro-meteorological stations in river basins. Groundwater resources management in Southwest Nigeria is currently challenged by overexploitation, lack of planning and monitoring, urbanization and climate change; hence, for models to be adopted as decision support tools for sustainable management of groundwater, they must be adequately calibrated. Since river basins in Southwest Nigeria are characterized by missing data and a lack of adequate ground-based hydro-meteorological stations, adopting satellite-based data for constructing distributed models is crucial. This study seeks to evaluate the suitability of satellite-based data as a substitute for ground-based data in computing boundary conditions, by determining whether ground- and satellite-based meteorological data fit well in the Ogun and Oshun River basins. The Climate Forecast System Reanalysis (CFSR) global meteorological dataset was first obtained in daily form and converted to monthly form for a period of 432 months (January 1979 to June 2014). Ground-based meteorological data for Ikeja (1981-2010), Abeokuta (1983-2010), and Oshogbo (1981-2010) were then compared with the CFSR data using goodness-of-fit (GOF) statistics. The study revealed that, based on mean absolute error (MAE), coefficient of correlation (r) and coefficient of determination (R²), all meteorological variables except wind speed fit well. It was further revealed that maximum and minimum temperature, relative humidity and rainfall had high values of the index of agreement (d) and the ratio of standard deviations (rSD), implying that the CFSR dataset could be used to compute boundary conditions such as groundwater recharge and potential evapotranspiration. The study concluded that satellite-based data such as the CFSR should be used as input when constructing groundwater flow models in river basins in Southwest Nigeria, where the majority of river basins are partially gauged and characterized by long gaps in hydro-meteorological records.
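
The goodness-of-fit battery used here is easy to compute directly. A minimal NumPy sketch with simulated station and reanalysis series (the formula for d is Willmott's index of agreement; the data are placeholders, not the Ogun-Oshun records):

```python
import numpy as np

def goodness_of_fit(obs: np.ndarray, sat: np.ndarray) -> dict:
    """GOF statistics for a satellite/reanalysis series against ground data."""
    mae = np.mean(np.abs(sat - obs))
    r = np.corrcoef(obs, sat)[0, 1]
    d = 1 - np.sum((obs - sat) ** 2) / np.sum(
        (np.abs(sat - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"MAE": mae, "r": r, "R2": r ** 2,  # R2 = squared correlation
            "d": d,                            # Willmott's index of agreement
            "rSD": sat.std() / obs.std()}      # ratio of standard deviations

# Hypothetical monthly rainfall: ground station vs. CFSR-style estimate.
rng = np.random.default_rng(7)
ground = rng.gamma(2.0, 60.0, size=432)  # 432 months, as in the study period
cfsr = ground + rng.normal(0, 15, size=432)
print({k: round(v, 3) for k, v in goodness_of_fit(ground, cfsr).items()})
```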

Keywords: boundary condition, goodness of fit, groundwater, satellite-based data

Procedia PDF Downloads 130