Search results for: R data science
26033 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition
Authors: Aref Ghafouri, Mohammad javad Mollakazemi, Farhad Asadi
Abstract:
In this paper, a model order reduction method is used to approximate linear and nonlinear behaviour in experimental data. The method yields an offline reduced-order model that approximates the experimental data, reproduces the data and the order of the system, and can be matched to the experimental data at selected frequency ratios. In this study, the method is compared across several experimental data sets, and the influence of the chosen reduction order on obtaining the best, sufficient matching condition is investigated for the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order on nonlinear experimental data is examined in detail.
Keywords: frequency response, order of model reduction, frequency matching condition, nonlinear experimental data
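The paper's own reduction procedure is not reproduced in the abstract; as a rough, hedged illustration of how the chosen order affects frequency-response matching, the sketch below performs a simple modal truncation (keeping the r slowest-decaying eigenmodes of a state-space model) and evaluates the full and reduced frequency responses. All function names and matrices are illustrative, not taken from the study; real distinct eigenvalues are assumed so the reduced model stays real.

```python
import numpy as np

def modal_truncation(A, B, C, r):
    """Reduce a state-space model (A, B, C) to order r by keeping the
    r eigenmodes whose eigenvalues have the largest real part
    (i.e. the slowest-decaying, dominant modes)."""
    eigvals, V = np.linalg.eig(A)
    keep = np.argsort(eigvals.real)[::-1][:r]   # dominant modes first
    V_r = V[:, keep]
    W_r = np.linalg.pinv(V_r)                   # left projection
    return W_r @ A @ V_r, W_r @ B, C @ V_r

def freq_response(A, B, C, omegas):
    """Evaluate H(jw) = C (jwI - A)^{-1} B on a grid of frequencies."""
    n = A.shape[0]
    return np.array([(C @ np.linalg.solve(1j * w * np.eye(n) - A, B)).item()
                     for w in omegas])
```

Comparing the real and imaginary parts of the full and truncated responses over a frequency grid then shows directly how increasing r tightens the matching condition.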
Procedia PDF Downloads 402
26032 An Empirical Study of the Impacts of Big Data on Firm Performance
Authors: Thuan Nguyen
Abstract:
In the present time, data is to a data-driven knowledge-based economy what oil was to the industrial age hundreds of years ago. Data is everywhere in vast volumes! Big data analytics is expected to help firms not only improve performance efficiently but also completely transform how they run their business. However, employing the emergent technology successfully is not easy, and assessing the role of big data in improving firm performance is even harder. Few studies have examined the impacts of big data analytics on organizational performance; this study aimed to fill that gap. It suggested using firms’ intellectual capital as a proxy for big data in evaluating its impact on organizational performance. The study employed the Value Added Intellectual Coefficient method to measure firm intellectual capital via its three main components: human capital efficiency, structural capital efficiency, and capital employed efficiency, and then used the structural equation modeling technique to model the data and test the models. The financial fundamental and market data of 100 randomly selected publicly listed firms were collected. The results showed that only human capital efficiency had a significant positive impact on firm profitability, which highlights the prominent human role in the impact of big data technology.
Keywords: big data, big data analytics, intellectual capital, organizational performance, value added intellectual coefficient
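The Value Added Intellectual Coefficient the study relies on is straightforward to compute from financial statements. A minimal sketch following the standard VAIC formulation (component definitions are the usual ones, not figures from the study):

```python
def vaic(value_added, human_capital, capital_employed):
    """Value Added Intellectual Coefficient (Pulic) and its components.
    value_added:      VA = outputs minus inputs (labour cost excluded)
    human_capital:    HC = total salary and wage expense
    capital_employed: CE = book value of net assets"""
    hce = value_added / human_capital                   # human capital efficiency
    sce = (value_added - human_capital) / value_added   # structural capital efficiency
    cee = value_added / capital_employed                # capital employed efficiency
    return {"HCE": hce, "SCE": sce, "CEE": cee, "VAIC": hce + sce + cee}
```

The three component scores would then feed the structural equation model as observed indicators.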
Procedia PDF Downloads 245
26031 Automated Test Data Generation for Some Types of Algorithm
Authors: Hitesh Tahbildar
Abstract:
The cost of test data generation for a program is computationally very high. In the general case, no algorithm has been found that generates test data for all types of algorithms, and the cost of generating test data differs across algorithm types. To date, research has emphasised generating test data for different types of programming constructs rather than for different types of algorithms. In this work, test data generation methods were implemented to find heuristics for different algorithm types. Algorithms based on divide and conquer, backtracking, the greedy approach, and dynamic programming were tested to find the minimum cost of test data generation. Our experimental results indicate that the algorithm type can serve as a necessary condition for selecting heuristics, while programming constructs are a sufficient condition for selecting our heuristics. Finally, we recommend which test data generation heuristics should be selected for different types of algorithms.
Keywords: longest path, saturation point, lmax, kL, kS
Procedia PDF Downloads 405
26030 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory
Authors: Chiung-Hui Chen
Abstract:
The Internet of Things (IoT) was designed for widespread convenience. With the smart tag and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through the internal communication and interaction, meaningful objects provide real-time services for users. Therefore, the service with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed the environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives: First, define user behaviors and predict the user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to get acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between the probability of objects’ being used and the objects. The indicator can describe the possible limitations of the mechanism. As the process is recorded in the information of the system created in this study, these data can be reused to adjust the procedure of intelligent design services.
Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object
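The hierarchical model itself is too large for a short sketch, but its building block, the flat HMM forward recursion used to score an observed behaviour sequence, fits in a few lines of numpy. The matrices in the usage check are toy values, not parameters from the study:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: P(observation sequence) under an HMM.
    pi:  initial state distribution, shape (S,)
    A:   transition matrix, A[i, j] = P(next state j | state i)
    B:   emission matrix, B[i, k] = P(symbol k | state i)
    obs: list of observed symbol indices"""
    alpha = pi * B[:, obs[0]]          # joint prob. of first symbol and state
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then absorb next symbol
    return alpha.sum()
```

In the hierarchical case each "symbol" would itself be generated by a child HMM, but the recursion has the same shape.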
Procedia PDF Downloads 233
26029 Effectiveness of GeoGebra Training Activities through Teams for Junior High School Teachers
Authors: Idha Novianti, Suci Nurhayati, Puryati, Elang Krisnadi
Abstract:
Community service activities are activities of the academic community in practicing and cultivating science, knowledge, and technology to advance the general welfare and educate the nation's life, as described in the Higher Education Law. Training activities on the use of GeoGebra software are an option because GeoGebra is easy to operate and complete in its presentation of graphic design. The training activity was held for 3 hours online via Microsoft Teams and 3 hours offline, involving 15 junior high school mathematics teachers from around South Tangerang. As a result, all teachers were satisfied with the activity, and they gained new knowledge and skills for teaching mathematics in the topics of geometry and algebra. This new knowledge increased the participants' confidence in developing mathematical science for students at school.
Keywords: GeoGebra, MS Teams, junior high school teacher, mathematics
Procedia PDF Downloads 115
26028 Astronomy in the Education Area: A Narrative Review
Authors: Isabella Lima Leite de Freitas
Abstract:
The importance of astronomy for humanity is unquestionable. Despite being a robust science, capable of bringing new discoveries every day and quickly increasing researchers' ability to understand the universe more deeply, scientific research in this area can also help in various applications outside the domain of astronomy. The objective of this study was to review and conduct a descriptive analysis of published studies that presented the importance of astronomy in the area of education. A narrative review of the literature was performed, considering articles published in the last five years. As astronomy involves the study of physics, chemistry, biology, mathematics, and technology, one of the studies evaluated presented astronomy as the gateway to science, demonstrating the presence of astronomy in 52 school curricula in 37 countries, with celestial movement the dominant content area. Another intervention study, evaluating individuals aged 4-5 years, demonstrated that attributing personal characteristics to cosmic bodies, together with the use of comprehensive astronomy concepts, favored the learning of science in preschool-age children through practical accompaniment activities and free drawing. Aiming to measure scientific literacy, another study, developed in Turkey, motivated the authorities of that country to change the teaching materials and curriculum of secondary schools after the term “astronomy” appeared as one of the most attractive subjects for young people aged 15 to 24. There are also reports in the literature of pedagogical tools such as a representation of the Solar System on a human scale, where students can walk along the orbits of the planets while studying the laws of dynamics. This tool supported the teaching of the relationship between distance, duration, and speed over the orbital period of the planets, in addition to improving the motivation and well-being of students aged 14-16.
An important impact of astronomy on education was demonstrated in a study that evaluated the participation of high school students in national Astronomical Olympiads and the International Astronomy Olympiad. The study concluded that these Olympiads have considerable influence on students who later pursue a career in teaching or research, many of them in astronomy itself. In addition, the literature indicates that teaching astronomy in the digital age has made data more readily available to researchers and to the general population alike. This can further increase the curiosity that astronomy has always instilled in people and promote the dissemination of knowledge on an expanded scale. Currently, astronomy is considered an important ally in strengthening the school curricula of children, adolescents, and young adults: it supplies teaching tools and is extremely useful for scientific literacy, and it is increasingly used in the area of education.
Keywords: astronomy, education area, teaching, review
Procedia PDF Downloads 103
26027 The Perspective on Data Collection Instruments for Younger Learners
Authors: Hatice Kübra Koç
Abstract:
For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; yet for young learners, and especially very young ones, such reliable and valid data collection tools cannot be easily designed or applied. In this study, common data collection tools are first examined for the ‘very young’ and ‘young learners’ participant groups, since the quality and efficiency of an academic study rests mainly on valid and correct data collection and data analysis procedures. Second, two data collection instruments for very young and young learners are presented and their efficacy discussed. Finally, a suggested data collection tool, a performance-based questionnaire, developed specifically for the ‘very young’ and ‘young learners’ participant groups in the field of teaching English to young learners as a foreign language, is presented. The design procedure and suggested items/factors for this tool are revealed at the end of the study to help researchers who work with young and very young learners.
Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners
Procedia PDF Downloads 92
26026 Scattered Places in Stories: Singularity and Pattern in Geographic Information
Abstract:
Increased knowledge about the nature of place and the conditions under which space becomes place is a key factor for better urban planning and place-making. Although there is broad consensus on the relevance of this knowledge, difficulties remain in relating the theoretical framework about place to urban management, and issues of representing places are among the greatest obstacles to overcoming this gap. In this critical discussion, based on a literature review, we explore, within a common framework for geographical analysis, the potential of stories to spell out place meanings, bringing together qualitative text analysis and text mining in order to capture and represent both the singularity contained in each person's life history and the patterns of social processes that shape places. The development of this reasoning is based on the extensive geographical thought about place and on theoretical advances in the field of Geographic Information Science (GISc).
Keywords: discourse analysis, geographic information science, place, place-making, stories
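As a crude stdlib sketch of the singularity-versus-pattern idea behind the text mining, the helper below treats terms appearing in every story as stand-ins for shared patterns and terms appearing in only one story as stand-ins for singular meanings. A real pipeline would of course add lemmatisation, stop-word removal, and TF-IDF weighting; the function and threshold choices here are assumptions of this illustration:

```python
from collections import Counter

def shared_and_singular_terms(stories):
    """Split terms into 'pattern' terms (present in every story) and
    'singular' terms (present in exactly one story)."""
    doc_freq = Counter()
    for text in stories:
        doc_freq.update(set(text.lower().split()))  # count each term once per story
    n = len(stories)
    pattern = {t for t, df in doc_freq.items() if df == n}
    singular = {t for t, df in doc_freq.items() if df == 1}
    return pattern, singular
```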
Procedia PDF Downloads 196
26025 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), a generative model tailored to time-series data, for generating synthetic time series data based on Swarm satellite data, which will be used for detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning in generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
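An LSTM generator would normally be built with a deep-learning framework; as a self-contained illustration of the recurrence such a generator learns over each time step of a satellite series, here is a single LSTM cell step in plain numpy. The stacked weight layout is one common convention and an assumption of this sketch, not the study's architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step. W (4H x D), U (4H x H), b (4H,) stack the
    input (i), forget (f), candidate (g) and output (o) gates."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
    g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
    c_new = f * c + i * g        # gated memory update
    h_new = o * np.tanh(c_new)   # hidden state passed to the next step
    return h_new, c_new
```

Unrolling this step over time, with a linear read-out of h, is the mechanism by which synthetic continuations of a series are produced.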
Procedia PDF Downloads 32
26024 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education is preparing students to join the global workforce and make beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses for 450 students. We perform correlation analysis to study the relationship between student scores and other parameters such as gender and mode of learning, and we use natural language processing and machine learning to analyze student feedback and performance data. We assess the learning outcomes of the two teaching pedagogies for undergraduate and graduate courses to showcase pedagogical adoption and learning outcome as determinants of academic achievement. Early findings suggest that, under the specified pedagogies, students become experts on their topics and show enhanced engagement with peers.
Keywords: bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, natural language processing, online learning, sentiment analysis, teaching pedagogy
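The paper's NLP pipeline is not spelled out in the abstract; the simplest bag-of-words sentiment scoring its keywords allude to can be sketched as follows, with toy lexicons that are purely illustrative:

```python
def sentiment_score(feedback, positive, negative):
    """Crude bag-of-words sentiment: +1 per positive word, -1 per
    negative word, normalised by the number of tokens."""
    tokens = feedback.lower().split()
    score = sum(t in positive for t in tokens) - sum(t in negative for t in tokens)
    return score / max(len(tokens), 1)
```

Scores like this, aggregated per course, are the kind of feature one would then correlate with grades and attendance.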
Procedia PDF Downloads 77
26023 Teachers’ Perception of Implementing a Norm Critical Pedagogical Perspective – A Case Study of a Swedish Behavioural Science Programme
Authors: Sophia Yakhlef
Abstract:
Norm-critical pedagogy is an approach originating from intersectional gender pedagogy, feminist pedagogy, queer pedagogy, and critical pedagogy. In the Swedish context, the norm critical approach is rising in popularity, and norms that are highlighted or challenged are, for example, various dimensions of power such as ’whiteness norm’, discourses of ’Swedishness’, ’middle class norm’, heteronormativity, and body functionality. Instead of seeing students as a homogenous group, intersectional pedagogy focuses on the consequences of differences and on critically paying attention to differences. The perspective encourages teachers to assess their teaching methods, material, and the course literature provided in their education. The classical sociological literature that most students encounter when studying behaviour science or sociology has, in recent years, been referred to as the sociological canon. The sociological perspectives of the classical scholars included in the canon have, in many ways, shaped how we perceive the history of sociology and theories of the modern world in general. The sociological canon has, in recent decades, been challenged by, amongst others, feminist, post-colonial, and queer theorists. This urges us to further investigate the implications that this might have on sociological and behavioural science education, as well as on pedagogical considerations and teaching methods. This qualitative case study focuses on the experiences of implementing a norm critical pedagogical perspective in an online behavioural science programme at Kristianstad University in Sweden. Interviews and informal conversations were conducted in 2022 with teachers regarding their experiences of teaching online, of implementing a student-centred learning approach, and their experiences of implementing a norm critical perspective in sociology and criminology courses. 
The study demonstrates the inclusion aspect of online education, the benefits of adopting a norm critical perspective, the challenges that arise when updating course literature, and the urgent need for guidance and education for teachers regarding inclusion and paying attention to power asymmetry.
Keywords: norm critical pedagogy, online education, sociological canon, Sweden
Procedia PDF Downloads 77
26022 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from a major problem of small sample size, which is mostly attributed to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices of similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, so on-line implementation for monitoring and diagnosis purposes is straightforward.
Keywords: data analysis, diagnosis, monitoring, process data, quality control
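The paper's exact similarity and importance indices are not given in the abstract; a common, hedged stand-in for this kind of data augmentation is SMOTE-style interpolation, where each quasi-sample is a convex combination of an existing measurement and its most similar neighbour. The sketch below illustrates only that general idea:

```python
import numpy as np

def quasi_samples(X, n_new, rng=None):
    """Generate quasi-measurement rows by interpolating each picked row
    toward its nearest (most similar) existing row. Convex combination
    keeps every generated value inside the observed range."""
    rng = np.random.default_rng(rng)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # exclude the row itself
        j = int(np.argmin(d))              # most similar existing sample
        lam = rng.random()                 # interpolation weight in [0, 1)
        out.append(X[i] + lam * (X[j] - X[i]))
    return np.array(out)
```

An importance index could be added by biasing the choice of i toward rows in sparsely sampled operating regions.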
Procedia PDF Downloads 481
26021 Attracting European Youths to STEM Education and Careers: A Pedagogical Approach to a Hybrid Learning Environment
Authors: M. Assaad, J. Mäkiö, T. Mäkelä, M. Kankaanranta, N. Fachantidis, V. Dagdilelis, A. Reid, C. R. del Rio, E. V. Pavlysh, S. V. Piashkun
Abstract:
To bring science and society together in Europe, thus increasing the continent’s international competitiveness, STEM (science, technology, engineering and mathematics) education must be more relatable to European youths in their everyday life. The STIMEY (Science, Technology, Innovation, Mathematics, Engineering for the Young) project researches and develops a hybrid educational environment with multi-level components, designed on the basis of a well-researched pedagogical framework, aiming to make STEM education more attractive to young people aged 10 to 18 in this digital era. This environment combines social media components, robotic artefacts, and radio to educate, engage and increase students’ interest in STEM education and careers from a young age. Additionally, it offers educators the necessary modern tools to deliver STEM education in an attractive and engaging manner in or out of class. Moreover, it enables parents to keep track of their children’s education and collaborate with their teachers on their development. Finally, the open platform allows businesses to invest in the growth of the youths’ talents and skills in line with economic and labour market needs through entrepreneurial tools. Thus, universities, schools, teachers, students, parents, and businesses come together to complete a circle in which STEM becomes part of the daily life of youths through a hybrid educational environment that also prepares them for future careers.
Keywords: e-learning, entrepreneurship, pedagogy, robotics, serious gaming, social media, STEM education
Procedia PDF Downloads 372
26020 Experimental Monitoring of the Parameters of the Ionosphere in the Local Area Using the Results of Multifrequency GNSS-Measurements
Authors: Andrey Kupriyanov
Abstract:
In recent years, much attention has been paid to ionospheric disturbances and their influence on the signals of global navigation satellite systems (GNSS) around the world. This is due to the increase in solar activity, the expansion of the scope of GNSS, the emergence of new satellite systems, the introduction of new frequencies, and many other factors. The influence of the Earth's ionosphere on the propagation of radio signals is an important factor in many applied fields of science and technology. The paper considers the application of transionospheric sounding, using measurements of Global Navigation Satellite System signals, to determine the TEC distribution and the scintillations of the ionospheric layers. To calculate these parameters, the International Reference Ionosphere (IRI) model, refined in the local area, is used. The organization of operational monitoring of ionospheric parameters is analyzed using several NovAtel GPStation6 base stations, which allow primary processing of GNSS measurement data, calculating TEC and detecting scintillation events, modeling the ionosphere using the obtained data, storing data, and performing ionospheric correction of measurements. The study shows that the transionospheric sounding method can reconstruct the altitude distribution of electron concentration over different altitude ranges and provide operational information about the ionosphere that is necessary for solving a number of practical problems. The use of multi-frequency, multi-system GNSS equipment and dedicated software makes it possible to achieve the specified accuracy and volume of measurements.
Keywords: global navigation satellite systems (GNSS), GPStation6, international reference ionosphere (IRI), ionosphere, scintillations, total electron content (TEC)
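The core of dual-frequency TEC estimation is the standard first-order ionospheric relation, which converts the pseudorange difference between two carrier frequencies into an electron column density. The sketch below uses the GPS L1/L2 frequencies as defaults; receiver and satellite biases, which a real pipeline must calibrate out, are deliberately omitted:

```python
def slant_tec(p1, p2, f1=1575.42e6, f2=1227.60e6):
    """Slant TEC (electrons/m^2) from dual-frequency pseudoranges (metres)
    via the first-order ionospheric delay relation.
    Defaults f1, f2 are the GPS L1 and L2 carrier frequencies (Hz)."""
    k = 40.3  # ionospheric constant, m^3/s^2
    return (f1**2 * f2**2) / (k * (f1**2 - f2**2)) * (p2 - p1)
```

A 1 m L1/L2 range difference corresponds to roughly 9.5 TECU (1 TECU = 1e16 electrons/m^2).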
Procedia PDF Downloads 181
26019 A Critical Reflection of Ableist Methodologies: Approaching Interviews and Go-Along Interviews
Authors: Hana Porkertová, Pavel Doboš
Abstract:
Based on a research project studying the experience of visually disabled people with urban space in the Czech Republic, the conference contribution discusses the limits of social-science methodologies used in sociology and human geography. It draws on actor-network theory, assuming that science does not describe reality but produces it. Methodology connects theory, research questions, ways to answer them (methods), and results. A research design utilizing ableist methodologies can produce ableist realities. Therefore, it was necessary to adjust the methods so that they could mediate blind experience to the scientific community without reproducing ableism. The researchers faced multiple challenges, ranging from questionable validity to how to research experience that differs from that of the researchers who are able-bodied. Finding a suitable theory that could be used as an analytical tool that would demonstrate space and blind experience as multiple, dynamic, and mutually constructed was the first step that could offer a range of potentially productive methods and research questions, as well as bring critically reflected results. Poststructural theory, mainly Deleuze-Guattarian philosophy, was chosen, and two methods were used: interviews and go-along interviews that had to be adjusted to be able to explore blind experience. In spite of a thorough preparation of these methods, new difficulties kept emerging, which exposed the ableist character of scientific knowledge. From the beginning of data collecting, there was an agreement to work in teams with slightly different roles of each of the researchers, which was significant especially during go-along interviews. In some cases, the anticipations of the researchers and participants differed, which led to unexpected and potentially dangerous situations. These were not caused only by the differences between scientific and lay communities but also between able-bodied and disabled people. 
Researchers were sometimes assigned to the assistants’ roles, and this new position – doing research together – required further negotiations, which also opened various ethical questions.
Keywords: ableist methodology, blind experience, go-along interviews, research ethics, scientific knowledge
Procedia PDF Downloads 165
26018 Emerging Technology for Business Intelligence Applications
Authors: Hsien-Tsen Wang
Abstract:
Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connecting in ways that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.
Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing
Procedia PDF Downloads 94
26017 The Effects of Virtual Reality Technology in Maternity Delivery: A Systematic Review and Meta-Analysis
Authors: Nuo Xu, Sijing Chen
Abstract:
Background: Childbirth is considered a critical, potentially traumatic event in our lives, impacting the mother's physiology and psychology, and even the whole family, positively or negatively. Adverse birth experiences such as labor pain, anxiety, and fear can negatively affect the mother. Studies have shown that the immersive nature of VR can distract attention from pain and increase focus on pain-relief interventions. However, existing studies applying VR to maternal delivery are still in their infancy and show disparate results, and their small sample sizes are not representative, so this review analyzed the effects of VR in labor, such as on maternal pain and anxiety, with a view to providing a basis for future applications. Search strategy: We searched PubMed, Embase, Web of Science, the Cochrane Library, CINAHL, China National Knowledge Infrastructure, and the Wan-Fang database from inception to November 17, 2021. Selection criteria: Randomized controlled trials (RCTs) in which VR technology was used with pregnant women aged 18-35 years, with gestation > 34 weeks and no complications, were included in this review. Data collection and analysis: Two researchers completed study selection, data extraction, and assessment of study quality. For continuous data we used MD or SMD, and RR (risk ratio) for dichotomous data. A random-effects model and 95% confidence intervals (95% CI) were used. Main results: 12 studies were included. Using VR relieved pain during labor (MD = -1.81, 95% CI (-2.04, -1.57), P < 0.00001) and the active phase (SMD = -0.41, 95% CI (-0.68, -0.14), P = 0.003), reduced anxiety (SMD = -1.39, 95% CI (-1.99, -0.78), P < 0.00001), and improved satisfaction (RR = 1.32, 95% CI (1.10, 1.59), P = 0.003), but the effect on the duration of the first (SMD = -1.12, 95% CI (-2.38, 0.13), P = 0.08) and second (SMD = -0.22, 95% CI (-0.67, 0.24), P = 0.35) stages of labor was not statistically significant.
Conclusions: Compared with conventional care, VR technology can relieve labor pain and anxiety and improve satisfaction. However, extensive experimental validation is still needed.
Keywords: virtual reality, delivery, labor pain, anxiety, meta-analysis, systematic review
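The pooled estimates above come from inverse-variance weighting of the per-study effects. A minimal fixed-effect sketch of that pooling step (the review itself used a random-effects model, which would additionally estimate a between-study variance; the numbers in the check below are illustrative, not the review's data):

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes.
    Returns the pooled estimate and its 95% confidence interval."""
    weights = [1.0 / v for v in variances]          # precision weights
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # standard error of the pool
    return est, (est - 1.96 * se, est + 1.96 * se)
```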
Procedia PDF Downloads 92
26016 Chaotic Analysis of Acid Rains with Time Series of pH Degree, Nitrate and Sulphate Concentration on Wet Samples
Authors: Aysegul Sener, Gonca Tuncel Memis, Mirac Kamislioglu
Abstract:
Chaos theory has been one of the new paradigms of science since the last century. After Edward Lorenz identified chaos in weather systems, the popularity of the theory increased. Chaos is observed in many natural systems, and studies continue to detect chaos in other natural systems. Acid rain is one of the environmental problems with negative effects on the environment, and acid rain values are monitored continuously. In this study, we aim to analyze the chaotic behavior of acid rains in Turkey with chaos detection approaches. The pH degree of rain waters and the sulfate and nitrate concentrations of wet rain water samples from the rain collecting stations located in different regions of Turkey are provided by the Turkish State Meteorology Service. Lyapunov exponents, reconstruction of the phase space, and power spectra are used in this study to determine and predict the chaotic behavior of acid rains. The analysis finds that acid rain time series have positive Lyapunov exponents and wide power spectra, so chaotic behavior is observed in the acid rain time series.
Keywords: acid rains, chaos, chaotic analysis, Lyapunov exponents
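A positive largest Lyapunov exponent is the key chaos criterion the study applies. For measured series this is estimated from a reconstructed phase space, but the idea is easiest to see on a map with a known answer: for the logistic map at r = 4 the exponent is ln 2. The sketch below averages ln|f'(x)| along an orbit (parameters are illustrative, not from the study):

```python
import math

def logistic_lyapunov(r, x0=0.3, n=10000, burn=100):
    """Estimate the largest Lyapunov exponent of the logistic map
    x -> r x (1 - x) by averaging ln|f'(x)| = ln|r (1 - 2x)| along an orbit."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

A positive result (as at r = 4) signals sensitive dependence on initial conditions; a negative one (as at r = 3.2, a periodic regime) signals regular dynamics.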
Procedia PDF Downloads 146
26015 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions
Authors: John Q. Todd
Abstract:
Given that modern equipment can provide comprehensive health, status, and error condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation will show what these data payloads might look like and how they can be filtered, visualized, calculated into metrics, used for machine learning, and used to generate alerts for further action.
Keywords: condition based maintenance, equipment data, metrics, alerts
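A minimal sketch of the alerting step over such telemetry payloads: scan each record and flag every metric outside its configured limits. The field names and limit values are invented for illustration, not taken from the presentation:

```python
def telemetry_alerts(readings, limits):
    """Emit (timestamp, metric, value) alerts for every telemetry value
    outside its configured (low, high) limits."""
    alerts = []
    for rec in readings:
        for metric, (low, high) in limits.items():
            value = rec.get(metric)
            if value is not None and not (low <= value <= high):
                alerts.append((rec["timestamp"], metric, value))
    return alerts
```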
Procedia PDF Downloads 188
26014 Web-Based Decision Support Systems and Intelligent Decision-Making: A Systematic Analysis
Authors: Serhat Tüzün, Tufan Demirel
Abstract:
Decision Support Systems (DSS) have been investigated by researchers and technologists for more than 35 years. This paper analyses the developments in the architecture and software of these systems, provides a systematic analysis of different Web-based DSS approaches and Intelligent Decision-making Technologies (IDT), and offers suggestions for future studies. Decision Support Systems literature begins with building model-oriented DSS in the late 1960s, theory developments in the 1970s, and the implementation of financial planning systems and Group DSS in the early and mid-80s. Then it documents the origins of Executive Information Systems, online analytic processing (OLAP) and Business Intelligence. The implementation of Web-based DSS occurred in the mid-1990s. Since the beginning of the new millennium, intelligence has been the main focus of DSS studies. Web-based technologies are having a major impact on design, development and implementation processes for all types of DSS. Web technologies are being utilized for the development of DSS tools by leading developers of decision support technologies. Major companies are encouraging their customers to port their DSS applications, such as data mining, customer relationship management (CRM) and OLAP systems, to a web-based environment. Similarly, real-time data fed from manufacturing plants are now helping floor managers make decisions regarding production adjustment to ensure that high-quality products are produced and delivered. Web-based DSS are being employed by organizations as decision aids for employees as well as customers. A common usage of Web-based DSS has been to assist customers in configuring products and services according to their needs. These systems allow individual customers to design their own products by choosing from a menu of attributes, components, prices and delivery options.
The Intelligent Decision-making Technologies (IDT) domain is a fast-growing area of research that integrates various aspects of computer science and information systems. This includes intelligent systems, intelligent technology, intelligent agents, artificial intelligence, fuzzy logic, neural networks, machine learning, knowledge discovery, computational intelligence, data science, big data analytics, inference engines, recommender systems or engines, and a variety of related disciplines. Innovative applications that emerge using IDT often have a significant impact on decision-making processes in government, industry, business, and academia in general. This is particularly pronounced in finance, accounting, healthcare, computer networks, real-time safety monitoring and crisis response systems. Similarly, IDT is commonly used in military decision-making systems, security, marketing, stock market prediction, and robotics. Even though many research studies have been conducted on Decision Support Systems, a systematic analysis of the subject is still missing. Because of this necessity, this paper has been prepared to survey recent articles on DSS. The literature has been reviewed in depth and, by classifying previous studies according to their preferences, a taxonomy for DSS has been prepared. With the aid of the taxonomic review and the recent developments in the subject, this study aims to analyze the future trends in decision support systems.Keywords: decision support systems, intelligent decision-making, systematic analysis, taxonomic review
Procedia PDF Downloads 27926013 Status of Participative Governance Practices in Higher Education: Implications for Stakeholders' Transformative Role-Assumption
Authors: Endalew Fufa Kufi
Abstract:
The research investigated the role of stakeholders such as students, teachers and administrators in the practice of good governance in higher education by looking into the special contributions of top officials, teachers and students in ensuring workable ties and productive interchanges at Adama Science and Technology University. Attention was given to participation, fairness and exemplariness as key indicators of good governance. The target university was chosen because of the researcher's familiarity with it, which afforded dependable data, access to respondents and manageable processing of the data. A descriptive survey design was used for the purpose of describing the roles of the stakeholders in university governance in order to reflect on the nature of participation in these practices. The centres of the research were the administration, where supportive groups such as central administrators and underlying service-givers took part, and academia, where teachers and students were the targets. In total, 60 teachers, 40 students and 15 administrative officers served as respondents. Data were collected in the form of self-reports through open-ended questionnaires. The findings indicated that, while vertical interchanges in terms of academic and administrative routines flowed normally on a top-down basis, planned participation of stakeholders in decision-making, and the reasonable communication of roles and changes in decisions by top officials, were not efficiently practiced. Moreover, practices of good modelling were not witnessed to exist to the fullest extent. Rather, a very wide gap was witnessed between the academic and administrative staff, as was also the case between teachers and students. The implication is that, owing to the shortage of a participative atmosphere and the waning of fairness in governance, routine practices have persisted as a vicious circle of governance.Keywords: governance, participative, stakeholders, transformative, role-assumption
Procedia PDF Downloads 39826012 Ethics Can Enable Open Source Data Research
Authors: Dragana Calic
Abstract:
The openness, availability and sheer volume of big data have provided what some regard as an invaluable and rich dataset. Researchers, businesses, advertising agencies and medical institutions, to name only a few, collect, share, and analyze this data to enable their processes and decision making. However, there are important ethical considerations associated with the use of big data. The rapidly evolving nature of online technologies has overtaken the many legislative, privacy, and ethical frameworks and principles that exist. For example, should we obtain consent to use people’s online data, and under what circumstances can privacy considerations be overridden? Current guidance on how to appropriately and ethically handle big data is inconsistent. Consequently, this paper focuses on two quite distinct but related ethical considerations that are at the core of the use of big data for research purposes: empowering the producers of data and empowering the researchers who want to study big data. The first consideration focuses on informed consent, which is at the core of empowering producers of data. In this paper, we discuss some of the complexities associated with informed consent and consider studies of producers’ perceptions to inform research ethics guidelines and practice. The second consideration focuses on the researcher. Similarly, we explore studies that focus on researchers’ perceptions and experiences.Keywords: big data, ethics, producers’ perceptions, researchers’ perceptions
Procedia PDF Downloads 28426011 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning
Authors: Walid Cherif
Abstract:
Data mining has, over recent years, seen big advances because of the spread of the internet, which generates a tremendous volume of data every day, and also because of the immense advances in technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines the group to which each data instance belongs within a given dataset. They are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine-learning based. Each type of technique has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques are encountering many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded those of most common classification techniques, with an F-measure exceeding 97% on the Iris dataset.Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification
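The abstract does not spell out the paper's measure functions, but the general idea of similarity-based supervised classification can be sketched as follows: score an unseen instance against labelled training instances with a similarity function and let the most similar ones vote. The inverse-distance similarity, the toy Iris-like data, and the function names below are illustrative assumptions, not the paper's actual method:

```python
import math

def similarity(a, b):
    # Inverse-distance similarity: 1 when identical, tending to 0 as points diverge.
    return 1.0 / (1.0 + math.dist(a, b))

def classify(instance, train, k=3):
    # Rank training instances by similarity to the query, then majority-vote among the top k.
    top_k = sorted(train, key=lambda t: similarity(instance, t[0]), reverse=True)[:k]
    votes = {}
    for _, label in top_k:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Toy 2-D points loosely mimicking two Iris-like clusters (sepal length, petal length).
train = [((5.0, 1.4), "setosa"), ((4.8, 1.5), "setosa"), ((5.2, 1.3), "setosa"),
         ((6.5, 4.6), "versicolor"), ((6.1, 4.2), "versicolor"), ((6.7, 4.4), "versicolor")]

print(classify((5.1, 1.4), train))  # near the first cluster: setosa
print(classify((6.4, 4.5), train))  # near the second cluster: versicolor
```

The paper's contribution lies in how several such measures are combined with reliability computations; this sketch only shows the voting skeleton they would plug into.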
Procedia PDF Downloads 46426010 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation
Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das
Abstract:
Seismic data scaling affects the dynamic range of the data, and with present-day low costs of storage and high reliability of hard-disk data, scaling is generally not recommended. However, when dealing with data of different vintages, which perhaps were processed in 16 bits or even 8 bits and now need to be processed together with available 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in deeper regions that would otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations without using the default preset parameters available in most software suites. Differences and distributions of amplitude values at different depths for seismic data are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken care of while loading data are explained. Finally, the exercise interprets the uncertainties that might arise when correlating scaled and unscaled versions of seismic data with synthetics. As a seismic well tie correlates the seismic reflection events with well markers, it is used in our study to identify regions that are enhanced and/or affected by the scaling parameter(s).Keywords: clipping, compression, resolution, seismic scaling
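The dynamic-range problem described above (a weak deep event lost when a strong shallow event saturates a narrow amplitude scale) can be illustrated with a minimal sketch; the `quantize` helper and the two-sample trace are hypothetical, chosen only to show how bit depth changes what survives:

```python
def quantize(samples, bits):
    # Scale floats into the signed-integer range of the given bit depth,
    # normalising by the peak amplitude (as a naive loader might).
    peak = max(abs(s) for s in samples)
    full_scale = 2 ** (bits - 1) - 1
    return [round(s / peak * full_scale) for s in samples]

# A strong shallow event (1.0) next to a weak deep event (0.003).
trace = [1.0, 0.003]
print(quantize(trace, 16))  # [32767, 98]  -- the weak event survives
print(quantize(trace, 8))   # [127, 0]     -- the weak event is lost entirely
```

This is why the abstract stresses checking amplitude distributions at depth before accepting default loading parameters: a scale chosen for the shallow section silently destroys deeper low-amplitude reflections.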
Procedia PDF Downloads 46926009 Association of Social Data as a Tool to Support Government Decision Making
Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias
Abstract:
Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties) were evaluated. This work aims to detect factors that are deterministic for the practice of child labor and their relationships with financial, educational, regional and social indicators, generating information that is not explicit in the government database, thus enabling better monitoring and updating of policies for this purpose.Keywords: social data, government decision making, association of social data, data mining
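The abstract does not name the specific algorithm, but association mining of the kind described usually rests on support and confidence computed over transaction-like records. A minimal sketch with invented attribute tags (not the actual Tocantins data) might look like:

```python
def support(itemset, transactions):
    # Fraction of transactions that contain every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    # Of the transactions containing the antecedent, the fraction that also
    # contain the consequent: an estimate of P(consequent | antecedent).
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# Hypothetical per-county records coded as sets of attribute tags.
transactions = [
    {"low_income", "low_school_attendance", "child_labor"},
    {"low_income", "child_labor"},
    {"low_income", "low_school_attendance", "child_labor"},
    {"high_income", "high_school_attendance"},
]

print(support({"low_income", "child_labor"}, transactions))      # 0.75
print(confidence({"low_income"}, {"child_labor"}, transactions)) # 1.0
```

Rules whose support and confidence clear chosen thresholds surface exactly the kind of non-explicit relationships between social indicators and child labor that the abstract describes.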
Procedia PDF Downloads 36926008 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a model- and data-driven hybrid method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the discrete event system’s non-linearity and non-Gaussianity, the traditional Kalman filter, based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become a technical approach for discrete event simulation data assimilation. Hence, we proposed a particle filter-based discrete event simulation data assimilation method and took an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept to conduct simulation experiments. The experimental results showed that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
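The paper's UAV model is not reproduced here, but a generic bootstrap particle filter step (predict with the process model, weight by observation likelihood, resample) can be sketched as follows; the random-walk process model, noise levels and particle count are illustrative assumptions rather than the authors' setup:

```python
import math
import random

def particle_filter_step(particles, observation, process_noise, obs_noise, rng):
    # Predict: propagate each particle through a (here: random-walk) process model.
    particles = [p + rng.gauss(0, process_noise) for p in particles]
    # Update: weight each particle by the Gaussian likelihood of the observation.
    weights = [math.exp(-(observation - p) ** 2 / (2 * obs_noise ** 2)) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new population with probability proportional to weight.
    return rng.choices(particles, weights=weights, k=len(particles))

rng = random.Random(42)
particles = [rng.uniform(-10.0, 10.0) for _ in range(500)]  # diffuse initial belief
true_state = 3.0
for _ in range(20):
    observation = true_state + rng.gauss(0, 0.5)  # noisy sensor reading
    particles = particle_filter_step(particles, observation, 0.1, 0.5, rng)

estimate = sum(particles) / len(particles)
print(round(estimate, 1))  # the particle cloud concentrates near the true state, 3.0
```

Because the weighting step uses an arbitrary likelihood and no linearity is assumed anywhere, the same loop applies to non-linear, non-Gaussian discrete event models where a Kalman filter would not.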
Procedia PDF Downloads 1326007 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process. Thus, the need for data imputation has become an essential matter. This work makes use of the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find some solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximal Overlap Discrete Wavelet Transform (MODWT) methods will be used to detect and impute the outlier values.Keywords: outlier values, imputation, stock market data, detecting, estimation
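The Tukey method mentioned here flags values outside the fences Q1 - 1.5*IQR and Q3 + 1.5*IQR; a minimal sketch on invented closing prices (not the actual ASE data) might look like:

```python
import statistics

def tukey_outliers(data, k=1.5):
    # Tukey's fences: values outside [Q1 - k*IQR, Q3 + k*IQR] are flagged.
    q1, _, q3 = statistics.quantiles(data, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [x for x in data if x < low or x > high]

# Hypothetical daily closing prices with one recording error.
closes = [10.2, 10.4, 10.1, 10.3, 10.5, 10.2, 55.0, 10.4]
print(tukey_outliers(closes))  # [55.0]
```

A flagged value would then be imputed, for example with the series median or a wavelet-based estimate as in the MODWT step of the paper; the choice of k = 1.5 is the conventional default, not a value taken from the abstract.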
Procedia PDF Downloads 8126006 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage
Authors: P. Jayashree, S. Rajkumar
Abstract:
With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios than lossless methods, many applications require data accuracy and miniature details to be preserved. A variety of data compression algorithms exists in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational number storage coding technique to cater to the storage issues of increasing data volumes as a cost-effective solution, which also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding
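PEINS itself is not publicly specified in this abstract, so purely as an illustration of the paper's evaluation metric, compression ratio (original size over compressed size) can be measured with a stand-in codec such as zlib; the payloads below are invented:

```python
import os
import zlib

def compression_ratio(data: bytes, level: int = 9) -> float:
    # Ratio of original size to compressed size; values > 1 mean the data shrank.
    return len(data) / len(zlib.compress(data, level))

redundant = b"sensor=42;" * 1000   # highly repetitive, log-style payload
incompressible = os.urandom(1024)  # random bytes: essentially no structure to exploit

print(round(compression_ratio(redundant), 1))       # large ratio, well above 10
print(round(compression_ratio(incompressible), 2))  # close to (or just under) 1.0
```

Any statistics-based scheme, PEINS included, sits between these extremes on real data, which is why the abstract reports both compression ratio and compression time rather than a single figure.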
Procedia PDF Downloads 29426005 Iot Device Cost Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway, and presents the design of a framework for the data privacy model together with a data analytics framework for real-time analysis using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, as well as the overall system operations. The results obtained from this study on the data privacy model show that when two or more data privacy models are combined, we tend to have stronger privacy for our data. The fog storage gateway has several advantages over traditional cloud storage: our results show that fog has reduced latency/delay, lower bandwidth consumption, and lower energy usage when compared with cloud storage; therefore, fog storage will help to lessen excessive cost. This paper dwells mostly on the system descriptions; the researchers focused on the research design and framework design for the data privacy model, data storage, and real-time analytics. This paper also shows the major system components and their framework specifications. Lastly, the overall research system architecture is shown, along with its structure and interrelationships.Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 9926004 To Study the New Invocation of Biometric Authentication Technique
Authors: Aparna Gulhane
Abstract:
Biometrics is the science and technology of measuring and analyzing biological data; it forms the basis of research in biological measuring techniques for the purpose of people identification and recognition. In information technology, biometrics refers to technologies that measure and analyze human body characteristics, such as DNA, fingerprints, eye retinas and irises, voice patterns, facial patterns and hand measurements. Biometric systems are used to authenticate a person's identity. The idea is to use the special characteristics of a person to identify him or her. This paper presents biometric authentication techniques and their potential for actual deployment through the overall invocation of biometric recognition, with independent testing of various biometric authentication products and technologies.Keywords: types of biometrics, importance of biometric, review for biometrics and getting a new implementation, biometric authentication technique
Procedia PDF Downloads 321