Search results for: computing methodologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2001

1581 e-Learning Security: A Distributed Incident Response Generator

Authors: Bel G Raggad

Abstract:

An e-Learning setting is a distributed computing environment where information resources can be connected to any public network. Public networks are very insecure, which can compromise the reliability of an e-Learning environment. This study is concerned only with the intrusion detection aspect of e-Learning security and how incident responses are planned. The literature reports great advances in intrusion detection systems (IDS) but has neglected an important IDS weakness: suspected events are detected, but an intrusion cannot be determined because it is not defined in the IDS database. We propose a distributed incident response generator (DIRG) that produces incident responses when the working IDS suspects an event that does not correspond to a known intrusion. Data involved in intrusion detection under ample uncertainty is often not amenable to formal statistical models, including Bayesian ones. We instead adopt Dempster-Shafer theory to process intrusion data for the unknown event. The DIRG engine transforms data into a belief structure using incident scenarios deduced by the security administrator. Belief values associated with the various incident scenarios are then derived and evaluated to choose the most appropriate scenario, for which an automatic incident response is generated. This article provides a numerical example demonstrating the working of the DIRG system.
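
As a rough illustration of the Dempster-Shafer step the abstract describes, the sketch below applies Dempster's rule of combination to two mass functions over a set of incident scenarios; the scenario names and mass values are invented placeholders, not the paper's numerical example.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset hypotheses to belief mass in [0, 1].
    """
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass that falls on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Two hypothetical evidence sources over incident scenarios {s1, s2, s3}.
theta = frozenset({"s1", "s2", "s3"})
m_ids = {frozenset({"s1"}): 0.6, theta: 0.4}          # IDS evidence
m_admin = {frozenset({"s1", "s2"}): 0.7, theta: 0.3}  # admin's scenario belief
print(combine(m_ids, m_admin))  # belief structure over scenarios
```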

Keywords: decision support system, distributed computing, e-Learning security, incident response, intrusion detection, security risk, stateful inspection

Procedia PDF Downloads 419
1580 Learning in the Virtual Laboratory via Design of Automation Process for Wooden Hammers Marking

Authors: A. Javorova, J. Oravcova, K. Velisek

Abstract:

The article summarizes experience with teaching methodologies for technical subjects that use a number of software products to solve the specific assigned tasks described in this paper. The task concerns the problems of automation and mechanization in industry; specifically, it focuses on introducing automation in the wood industry. The article describes the design of an automation process for marking wooden hammers. Similar problems are solved by students in the CA laboratory.

Keywords: CA system, education, simulation, subject

Procedia PDF Downloads 283
1579 Innovation in PhD Training in the Interdisciplinary Research Institute

Authors: B. Shaw, K. Doherty

Abstract:

The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute including art, design, media production, communication studies, computing and engineering. Across these disciplines, there can seem to be enormous differences in research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within often unacknowledged histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding research practice across disciplines difficult. To explore this, a one-day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multi-disciplinary context. Instead of presenting results at a conference, research students were tasked with articulating their method of inquiry. A working party of students from across the disciplines had to design a conference call, a visual identity, and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings or described method only briefly. It took several weeks of supported intervention for research students to get 'inside' their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts, the conference committee generated key methodological categories for the conference sessions, including sampling, capturing 'experience', 'making models', researcher identities, and 'constructing data'. Each session involved presentations by visual artists, communications students, and computing researchers, with inter-disciplinary dialogue facilitated by alumni chairs. The apparently simple focus on method illuminated the research process as a site of creativity, innovation, and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research, and demonstrated that honest reportage of research, faults and all, is good, valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by the communications and art and design students, and the art and design students gained understanding from the greater 'distance' and emphasis on application that the computing students applied to their subjects. Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.

Keywords: interdisciplinary, method, research student, training

Procedia PDF Downloads 189
1578 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms.
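
As a rough illustration of the k-mer representation the abstract builds on, the sketch below counts overlapping k-mers of a chosen size in a DNA sequence; the sequence and the value of k are placeholders, not data from the study.

```python
from collections import Counter

def kmer_counts(seq: str, k: int) -> Counter:
    """Count all overlapping k-mers in a DNA sequence."""
    seq = seq.upper()
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy example; real inputs would be whole-genome sequences from FASTA files.
counts = kmer_counts("ATGCGATGCATG", k=3)
print(counts.most_common(3))  # such counts form feature vectors for a classifier
```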

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 150
1577 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 138
1576 Dual Biometrics Fusion Based Recognition System

Authors: Prakash, Vikash Kumar, Vinay Bansal, L. N. Das

Abstract:

Dual biometrics is a subpart of multimodal biometrics, which refers to the use of a variety of modalities to identify and authenticate persons rather than just one. By mixing several modalities, we limit the risk of mistakes, and hackers have only a tiny possibility of collecting the information. Our goal is to collect the precise characteristics of the iris and palmprint, produce a fusion of both methodologies, and ensure that authentication is only successful when the biometrics match a particular user. After combining the different modalities, we created an effective strategy with a mean DI and EER of 2.41 and 5.21, respectively, and a biometric system based on this fusion is proposed.
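
A minimal sketch of one common way to fuse two modalities (score-level fusion with min-max normalization and a weighted sum); the score ranges, weights, and threshold below are illustrative assumptions, not the paper's reported configuration.

```python
def minmax(score, lo, hi):
    """Normalize a raw matcher score to [0, 1]."""
    return (score - lo) / (hi - lo)

def fuse(iris_score, palm_score, w_iris=0.6):
    # Assumed raw score ranges for each matcher (placeholders).
    s_iris = minmax(iris_score, lo=0.0, hi=100.0)
    s_palm = minmax(palm_score, lo=0.0, hi=50.0)
    return w_iris * s_iris + (1.0 - w_iris) * s_palm  # weighted-sum fusion

accepted = fuse(iris_score=82.0, palm_score=41.0) > 0.7  # assumed threshold
print(accepted)
```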

Keywords: multimodal, fusion, palmprint, iris, EER, DI

Procedia PDF Downloads 138
1575 Hybrid Genetic Approach for Solving Economic Dispatch Problems with Valve-Point Effect

Authors: Mohamed I. Mahrous, Mohamed G. Ashmawy

Abstract:

A hybrid genetic algorithm (HGA) is proposed in this paper to determine the economic scheduling of electric power generation over a fixed time period under various system and operational constraints. The proposed technique can outperform conventional genetic algorithms (CGAs) in the sense that the HGA makes it possible both to improve the quality of the solution and to reduce computing expenses. In contrast, any carefully designed GA is only able to balance the exploration and the exploitation of the search effort, which means that an increase in the accuracy of a solution can only occur at the sacrifice of convergence speed, and vice versa. It is unlikely that both of them can be improved simultaneously. The proposed hybrid scheme is developed in such a way that a simple GA acts as a base-level search, which makes a quick decision to direct the search towards the optimal region, and a local search method (a pattern search technique) is then employed to do the fine-tuning. The aim of the strategy is to achieve the cost reduction within a reasonable computing time. The effectiveness of the proposed hybrid technique is verified on two real public electricity supply systems with 13 and 40 generator units, respectively. The simulation results obtained with the HGA for the two real systems are very encouraging with regard to the computational expenses and the cost reduction of power generation.
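
The abstract names the two stages (a GA base-level search followed by pattern-search fine-tuning); below is a compact, self-contained sketch of that hybrid on a two-unit toy problem using the standard valve-point cost term. All coefficients, demand, and GA settings are invented for illustration and are not the paper's test systems.

```python
import math
import random

UNITS = [  # (a, b, c, e, f, Pmin, Pmax) -- illustrative coefficients only
    (0.002, 10.0, 100.0, 200.0, 0.042, 100.0, 500.0),
    (0.003, 12.0, 120.0, 150.0, 0.063, 50.0, 300.0),
]
DEMAND = 600.0  # MW, placeholder

def total_cost(P):
    # Valve-point fuel cost: a*P^2 + b*P + c + |e*sin(f*(Pmin - P))| per unit,
    # plus a penalty enforcing the demand balance constraint.
    cost = sum(a * p * p + b * p + c + abs(e * math.sin(f * (pmin - p)))
               for p, (a, b, c, e, f, pmin, pmax) in zip(P, UNITS))
    return cost + 1e4 * abs(sum(P) - DEMAND)

def ga(pop_size=40, gens=200, mut=0.1):
    """Base-level GA: quickly steer the search toward the optimal region."""
    pop = [[random.uniform(u[5], u[6]) for u in UNITS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_cost)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]  # arithmetic crossover
            if random.random() < mut:
                i = random.randrange(len(child))
                child[i] = random.uniform(UNITS[i][5], UNITS[i][6])
            children.append(child)
        pop = elite + children
    return min(pop, key=total_cost)

def pattern_search(P, step=10.0, tol=1e-3):
    """Compass-style pattern search for local fine-tuning of the GA's best."""
    P = P[:]
    while step > tol:
        improved = False
        for i in range(len(P)):
            for d in (+step, -step):
                trial = P[:]
                trial[i] = min(max(trial[i] + d, UNITS[i][5]), UNITS[i][6])
                if total_cost(trial) < total_cost(P):
                    P, improved = trial, True
        if not improved:
            step /= 2.0
    return P

best = pattern_search(ga())
print([round(p, 1) for p in best], round(total_cost(best), 2))
```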

Keywords: genetic algorithms, economic dispatch, pattern search

Procedia PDF Downloads 424
1574 An Analysis of Innovative Cloud Model as Bridging the Gap between Physical and Virtualized Business Environments: The Customer Perspective

Authors: Asim Majeed, Rehan Bhana, Mak Sharma, Rebecca Goode, Nizam Bolia, Mike Lloyd-Williams

Abstract:

This study aims to investigate and explore the underlying causes of the security concerns of customers that emerged when WHSmith transformed its physical system into a virtualized business model through NetSuite. NetSuite is essentially fully integrated software that helps transform a physical system into a virtualized business model. Modern organisations are moving away from traditional business models to cloud-based models, and consequently a better, more secure, and innovative environment for customers is expected. A vital issue in this modern race is security when transforming to virtualized, cloud-based models; designers of interactive systems often misunderstand privacy and often even ignore it, thus causing concerns for users. A content analysis approach was used to collect qualitative data from 120 online bloggers, including Trustpilot. The results and findings provide useful new insights into the nature and form of the security concerns of online users after they have used the WHSmith services offered online through its website. The findings have theoretical as well as practical implications for the successful adoption of the cloud computing Business-to-Business model and similar systems.

Keywords: innovation, virtualization, cloud computing, organizational flexibility

Procedia PDF Downloads 370
1573 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires the specification of appropriate earthquake ground motions to be used in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in the performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match the target spectra is a commonly used technique for the estimation of nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among the developed methodologies to select and scale ground motions with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and the corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on the minimization of the error between the scaled median and the target spectrum, while the dispersion of the earthquake shaking is preserved along the period interval. The impact of the spectral variability on the nonlinear response distribution is investigated at the level of inelastic single-degree-of-freedom systems. In order to see the effect of different selection and scaling methodologies on fragility curve estimations, the results are compared with those obtained by a CMS-based scaling methodology. The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
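
A minimal sketch of the scaling idea described above: a common amplitude scale factor is chosen in log space so that the median of the scaled set tracks the target spectrum over the period interval, and, because amplitude scaling only shifts the log-median, the record-to-record dispersion is preserved exactly. The spectra here are random placeholders, not real records.

```python
import numpy as np

rng = np.random.default_rng(0)
periods = np.linspace(0.1, 3.0, 30)                            # period interval
records = rng.lognormal(mean=-1.0, sigma=0.5, size=(20, 30))   # fake spectra
target_median = np.exp(-0.5 * periods)                         # fake target

# Least-squares fit in log space between the set's median and the target.
log_resid = np.log(target_median) - np.log(np.median(records, axis=0))
s = np.exp(np.mean(log_resid))      # scale factor that moves the median
scaled = records * s

# Scaling shifts the log-median but leaves the log-std (dispersion) intact.
print(np.allclose(np.std(np.log(scaled), axis=0),
                  np.std(np.log(records), axis=0)))
```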

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 571
1572 Play-Based Early Education and Teachers’ Professional Development: Impact on Vulnerable Children

Authors: Chirine Dannaoui, Maya Antoun

Abstract:

This paper explores the intricate dynamics of play-based early childhood education (ECE) and the impact of professional development on teachers implementing play-based pedagogy, particularly in the context of vulnerable Syrian refugee children in Lebanon. By utilizing qualitative methodologies, including classroom observations and in-depth interviews with five early childhood educators and a field manager, this study delves into the challenges and transformations experienced by teachers in adopting play-based learning strategies. The research unveils the critical role of continuous and context-specific professional development in empowering teachers to implement play-based pedagogies effectively. It emphasizes how such educational approaches, when appropriately supported, significantly enhance children's cognitive, social, and emotional development in crisis-affected environments. Key findings indicate that despite diverse educational backgrounds, teachers show considerable growth in their pedagogical skills through targeted professional development. This growth is vital for fostering a learning environment where vulnerable children can thrive, particularly in humanitarian settings. The paper also addresses the challenges educators face, including adapting to play-based methodologies, resource limitations, and balancing curricular requirements with the need for holistic child development. This study contributes to the discourse on early childhood education in crisis contexts, emphasizing the need for sustainable, well-structured professional development programs. It underscores the potential of play-based learning to bridge educational gaps and contribute to the healing process of children facing calamity. The study highlights significant implications for policymakers, educators, schools, and not-for-profit organizations engaged in early childhood education in humanitarian contexts, stressing the importance of investing in teacher capacity and curriculum reform to enhance the quality of education for children in general and vulnerable ones in particular.

Keywords: play-based learning, professional development, vulnerable children, early childhood education

Procedia PDF Downloads 43
1571 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful, but this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol: (1) in-flight compression and (2) dynamic load distribution. In-flight compression: the main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG results in only a minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the WebGL implementation limits the precision of floating-point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which will add up eventually. This can be solved using an improved inter-frame compression algorithm: it detects changes between frames and reuses unchanged pixels from the previous frame, eliminating the need for floating-point subtraction and thereby cutting down on noise. Change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference, and the kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. Dynamic load distribution: conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can cause a hit on bandwidth and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally. This is achieved by isolating client connections into different processes.
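
A rough sketch of the change-detection step described above (a weighted neighbourhood difference between consecutive frames, with unchanged pixels reused from the previous frame); the kernel and threshold are illustrative assumptions, not the protocol's tuned values.

```python
import numpy as np
from scipy.ndimage import convolve

# Assumed 3x3 kernel; in practice the weights would be tuned per content type.
KERNEL = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
KERNEL /= KERNEL.sum()
THRESHOLD = 8.0  # assumed per-pixel change threshold on a 0-255 scale

def delta_frame(prev: np.ndarray, curr: np.ndarray):
    """Return the reconstructed frame plus a mask of changed pixels."""
    diff = convolve(np.abs(curr.astype(float) - prev.astype(float)), KERNEL)
    changed = diff > THRESHOLD       # weighted average difference test
    out = prev.copy()
    out[changed] = curr[changed]     # reuse unchanged pixels from prev
    return out, changed              # only `changed` pixels need transmitting

prev = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 50:80] = 255             # simulate a moving bright region
frame, mask = delta_frame(prev, curr)
print(mask.mean())                   # fraction of pixels that must be sent
```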

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 60
1570 Soft Computing Approach for Diagnosis of Lassa Fever

Authors: Roseline Oghogho Osaseri, Osaseri E. I.

Abstract:

Lassa fever is an epidemic hemorrhagic fever caused by the Lassa virus, an extremely virulent arenavirus. This highly fatal disorder kills 10% to 50% of its victims, but those who survive its early stages usually recover and acquire immunity to secondary attacks. One of the major challenges in giving proper treatment is the lack of fast and accurate diagnosis of the disease, due to the multiplicity of symptoms associated with it, which can resemble other clinical conditions and make early diagnosis difficult. This paper proposes an Adaptive Neuro-Fuzzy Inference System (ANFIS) for the prediction of Lassa fever. In the design of the diagnostic system, four main attributes were considered as the input parameters and one output parameter for the system. The input parameters are Temperature on Admission (TA), White Blood Count (WBC), Proteinuria (P), and Abdominal Pain (AP). Sixty-one percent of the datasets were used in training the system, while fifty-nine were used in testing. Experimental results from this study gave a reliable and accurate prediction of Lassa fever when compared with clinically confirmed cases. In this study, we have proposed a Lassa fever diagnostic system to aid surgeons and medical healthcare practitioners in health care facilities who do not have ready access to Polymerase Chain Reaction (PCR) diagnosis to predict possible Lassa fever infection.
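
The sketch below gives a minimal flavour of fuzzy inference over the four inputs named above; the membership breakpoints and the two rules are invented for illustration and are not the paper's trained ANFIS or clinical thresholds.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def lassa_risk(ta, wbc, proteinuria, abdominal_pain):
    # Assumed fuzzy sets (breakpoints are placeholders, not clinical values).
    fever_high = tri(ta, 37.5, 39.5, 42.0)
    wbc_low = tri(wbc, 0.0, 2.5, 4.5)          # x10^9/L
    protein_high = tri(proteinuria, 1.0, 3.0, 5.0)
    pain = min(max(abdominal_pain, 0.0), 1.0)  # already graded 0..1
    # Two illustrative Mamdani-style rules, combined by max.
    r1 = min(fever_high, wbc_low)              # high fever AND low WBC
    r2 = min(protein_high, pain)               # proteinuria AND abdominal pain
    return max(r1, r2)

print(round(lassa_risk(ta=39.0, wbc=2.8, proteinuria=3.2, abdominal_pain=0.8), 2))
```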

Keywords: ANFIS, Lassa fever, medical diagnosis, soft computing

Procedia PDF Downloads 245
1569 Mathematics Bridging Theory and Applications for a Data-Driven World

Authors: Zahid Ullah, Atlas Khan

Abstract:

In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.

Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models

Procedia PDF Downloads 57
1568 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing

Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah

Abstract:

The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is designed to be implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents following the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for performing big data image segmentation. In this work, we focus on the application of this algorithm to processing a big data MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated on the mobile agent team leader in order to be split into (n x m) pixels, one per AVPE. Each AVPE performs and exchanges the segmentation results and maintains asynchronous communication with its team leader until the convergence of the algorithm. Some interesting experimental results are obtained in terms of accuracy and efficiency analysis of the proposed implementation, thanks to the several interesting skills of mobile agents introduced in this distributed computational model.
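
A loose sketch of the SPMD idea (split the image among workers, process each part with the same program, reassemble); here plain processes and a simple threshold stand in for the mobile agents and the type-2 fuzzy segmentation step, which the paper implements quite differently.

```python
import numpy as np
from multiprocessing import Pool

def segment_chunk(chunk: np.ndarray) -> np.ndarray:
    # Placeholder for the per-AVPE type-2 fuzzy segmentation step.
    return (chunk > chunk.mean()).astype(np.uint8)

if __name__ == "__main__":
    image = np.random.randint(0, 256, (512, 512), dtype=np.uint8)  # fake MRI
    chunks = np.array_split(image, 8, axis=0)    # one strip per worker
    with Pool(processes=8) as pool:              # workers ~ mobile agents
        parts = pool.map(segment_chunk, chunks)  # single program, multiple data
    segmented = np.vstack(parts)                 # the team leader reassembles
    print(segmented.shape, segmented.mean())
```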

Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing

Procedia PDF Downloads 410
1567 Integrated Teaching of Hardware Courses for the Undergraduates of Computer Science and Engineering to Attain Focused Outcomes

Authors: Namrata D. Hiremath, Mahalaxmi Bhille, P. G. Sunitha Hiremath

Abstract:

Computer systems play an integral role in all facets of the engineering profession. This calls for an understanding of the processor-level components of computer systems, their design and operation, and their impact on the overall performance of the systems. System users are always in need of faster, more powerful, yet cheaper computer systems. The focus of Computer Science and Engineering graduates is inclined towards software, and to be an efficient programmer there is a need to understand the role of the hardware architecture underneath. It is essential for students of Computer Science and Engineering to know the basic building blocks of any computing device and how digital principles can be used to build them. Hence, two courses were introduced at the sophomore level: Digital Electronics (3 credits, with an associated 1.5-credit lab) and Computer Organization (5 credits). An activity was introduced with the objective of teaching hardware concepts to the students of Computer Science and Engineering through a structured lab. The students were asked to design and implement a component of a computing device using the MultiSim simulation tool and build the same using hardware components. The experience of the activity helped the students to understand the real-time applications of SSI and MSI components. The impact of the activity was evaluated and the performance was measured. The paper explains the achievement of ABET outcomes a, c, and k.

Keywords: digital, computer organization, ABET, structured enquiry, course activity

Procedia PDF Downloads 481
1566 Design of an Ensemble Learning Behavior Anomaly Detection Framework

Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia

Abstract:

Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts. They are mainly individuals with legitimate access to companies' information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to counter the threat of malicious user activities effectively. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context and present the results of tests of some detection methods on a representative access control dataset. The use of some of the explored classifiers yields results of up to 99% accuracy.
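
A minimal sketch of the ensemble-learning idea on synthetic data; the abstract does not specify the constituent models, so the three classifiers below (and the synthetic features standing in for access-control events) are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for access-control event features (not the paper's data);
# the anomalous class is deliberately rare.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.95],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=100)),
                ("knn", KNeighborsClassifier())],
    voting="soft")  # average predicted probabilities across the models
ensemble.fit(X_tr, y_tr)
print("accuracy:", ensemble.score(X_te, y_te))
```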

Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing

Procedia PDF Downloads 109
1565 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets

Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe

Abstract:

Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with the access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for the research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high performance computing resources and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution based upon a novel microservice architecture and Big Data technologies. The system serves to demonstrate the applicability of microservice architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e., importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating the non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.
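
As an illustration of the NoSQL approach mentioned above, the sketch below stores SNP genotype records as documents in MongoDB via pymongo; the database, collection, and field names are invented for the example and are not SPARK's actual schema.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
snps = client["ark_demo"]["snp_genotypes"]         # hypothetical names

# Document model: one record per subject per SNP. No normalized join tables,
# so importing a GWAS dataset reduces to a single batched write.
snps.insert_many([
    {"subject": "S001", "rsid": "rs123", "chrom": "1", "genotype": "AA"},
    {"subject": "S001", "rsid": "rs456", "chrom": "2", "genotype": "AG"},
    {"subject": "S002", "rsid": "rs123", "chrom": "1", "genotype": "AG"},
])

# A typical genomics-style query: all subjects carrying a given allele.
for doc in snps.find({"rsid": "rs123", "genotype": {"$regex": "G"}}):
    print(doc["subject"], doc["genotype"])
```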

Keywords: biomedical research, genomics, information systems, software

Procedia PDF Downloads 252
1564 Creation and Management of Knowledge for Organization Sustainability and Learning

Authors: Deepa Kapoor, Rajshree Singh

Abstract:

This paper appreciates the emergence and growing importance of knowledge as a new production factor, which makes the development of technologies, methodologies, and strategies for its measurement, creation, and diffusion one of the main priorities of organizations in the knowledge society. There are many models for the creation and management of knowledge, and diverse and varied perspectives for their study, analysis, and understanding. In this article, we conduct a theoretical review of the types of models for the creation and management of knowledge; we discuss some of them and examine some of the difficulties and the key factors that determine the success of the processes for the creation and management of knowledge.

Keywords: knowledge creation, knowledge management, organizational development, organization learning

Procedia PDF Downloads 322
1563 An Integrated Fuzzy Inference System and Technique for Order of Preference by Similarity to Ideal Solution Approach for Evaluation of Lean Healthcare Systems

Authors: Aydin M. Torkabadi, Ehsan Pourjavad

Abstract:

A decade after the introduction of Lean in Saskatchewan's public healthcare system, its effectiveness remains a controversial subject among health researchers, workers, managers, and politicians. Therefore, developing a framework to quantitatively assess the Lean achievements is significant. This study investigates the success of initiatives across Saskatchewan health regions by recognizing the Lean healthcare criteria, measuring the success levels, comparing the regions, and identifying the areas for improvement. This study proposes an integrated intelligent computing approach by applying a Fuzzy Inference System (FIS) and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). FIS is used as an efficient approach to assess the Lean healthcare criteria, and TOPSIS is applied to rank the values with regard to the level of leanness. Due to the innate uncertainty in decision makers' judgments on criteria, principles of fuzzy theory are applied. Finally, FIS-TOPSIS is established as an efficient technique for determining the Lean merit in healthcare systems.
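
A minimal numpy sketch of the TOPSIS ranking step; the decision matrix, weights, and criterion directions below are placeholders, not the study's Lean healthcare data.

```python
import numpy as np

# Rows: health regions; columns: Lean criteria scores (placeholder values).
X = np.array([[7.0, 5.0, 8.0],
              [6.0, 9.0, 4.0],
              [8.0, 6.0, 7.0]])
w = np.array([0.5, 0.3, 0.2])            # assumed criterion weights
benefit = np.array([True, True, True])   # all criteria treated as benefits here

V = w * X / np.linalg.norm(X, axis=0)    # vector-normalized, weighted matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_best = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - anti, axis=1)
closeness = d_worst / (d_best + d_worst)  # 1 = closest to the ideal solution
print(np.argsort(-closeness))             # regions ranked by leanness
```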

Keywords: lean healthcare, intelligent computing, fuzzy inference system, healthcare evaluation, technique for order of preference by similarity to ideal solution, multi-criteria decision making, MCDM

Procedia PDF Downloads 144
1562 Education of Purchasing Professionals in Austria: Competence Based View

Authors: Volker Koch

Abstract:

This paper deals with the education of purchasing professionals in Austria. Equivalent and measurable criteria of this education are collected in order to create a comparison, and the comparison exposes the problem. To make the aforementioned comparison possible, methodologies such as the KODE-Competence Atlas or presentations in matrix form are used. The result shows the content taught and whether there are any similarities or interesting differences among the current Austrian purchasers' formations. The competencies purchasing professionals learn are also illustrated in the study result.

Keywords: competencies, education, purchasing professional, technological-oriented

Procedia PDF Downloads 282
1561 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides: computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated with similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, start times, and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
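
A minimal sketch of the first, GPS-only approach: compute the speed between consecutive fixes and mark a wave ride wherever the speed stays above a threshold. The threshold, minimum ride length, and sample track are placeholders, not the study's calibrated values.

```python
import math

def haversine_m(p, q):
    """Distance in metres between two (lat, lon) fixes."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def wave_rides(fixes, v_start=3.0, min_len=2):
    """fixes: list of (t_seconds, lat, lon). Returns (start_t, duration) pairs."""
    rides, start = [], None
    for (t0, *p0), (t1, *p1) in zip(fixes, fixes[1:]):
        speed = haversine_m(p0, p1) / (t1 - t0)   # m/s between fixes
        if speed >= v_start and start is None:
            start = t0                             # ride begins
        elif speed < v_start and start is not None:
            if t0 - start >= min_len:
                rides.append((start, t0 - start))  # (start time, duration)
            start = None
    return rides

track = [(0, 38.70, -9.42), (1, 38.70005, -9.42), (2, 38.70012, -9.42),
         (3, 38.70019, -9.42), (4, 38.700195, -9.42)]
print(wave_rides(track))
```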

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 388
1560 Scenarios for the Energy Transition in Residential Buildings for the European Regions

Authors: Domenico Carmelo Mongelli, Laura Carnieletto, Michele De Carli, Filippo Busato

Abstract:

Starting from the current context, in which the Russian invasion of Ukraine has highlighted Europe's dependence on natural gas imports for heating buildings, this study proposes solutions to resolve this dependency and evaluates the related scenarios in the near future. In the first part of this work, the methodologies and results of the economic impact analysis are presented, simulating a massive replacement of boilers powered by fossil fuels with electrically powered high-temperature air-water heat pumps for heating residential buildings in different European climates, without changing the current energy mix. For each individual European region, the costs for the purchase and installation of heat pumps for all residential buildings have been determined. Again for each individual European region, the economic savings during the operation phase that would be obtained in this future scenario of an energy transition from fossil fuels to the electrification of domestic heating were calculated. For the European regions in which the economic savings were found to be positive, the payback times of the economic investments were analysed. In the second part of the work, hypothesizing different scenarios for a possibly greater use of renewable energy sources, and therefore different possible future scenarios of the energy mix, the methodologies and results of the simulations on the economic analysis and on the environmental analysis are reported, which allowed us to evaluate the future effects of the energy transition from boilers to heat pumps for each European region. In the third part, assuming a rapid short-term diffusion of cooling for European residential buildings, the penetration shares in the cooling market and future projections of energy needs for cooling have been identified for each European region. A database was created reporting the results of this research for 38 European nations divided into 179 regions. Previous works on these topics were limited to analyzing individual European nations, without going into detail on the individual regions within each nation; the original contribution of the present work lies in the fact that the results achieved allow a specific, detailed numerical analysis for every single European region.
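
As a small illustration of the payback-time analysis mentioned above, the sketch below computes a simple (undiscounted) payback period from capital cost and annual operating savings; the figures are placeholders, not the study's regional results.

```python
def payback_years(capex: float, annual_savings: float) -> float:
    """Simple payback: years until cumulative savings cover the investment."""
    if annual_savings <= 0:
        return float("inf")  # regions with negative savings never pay back
    return capex / annual_savings

# Placeholder figures for one hypothetical region (EUR per dwelling).
print(payback_years(capex=8000.0, annual_savings=650.0))  # ~12.3 years
```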

Keywords: buildings, energy, Europe, future

Procedia PDF Downloads 74
1559 Methodologies, Findings, Discussion, and Limitations in Global, Multi-Lingual Research: We Are All Alone - Chinese Internet Drama

Authors: Patricia Portugal Marques de Carvalho Lourenco

Abstract:

A three-phase methodological multi-lingual path was designed, constructed, and carried out using the 2020 Chinese internet drama series We Are All Alone as a case study. Phase one, the backbone of the research, comprised secondary data analysis, providing the structure on which the next two phases would be built. Phase one incorporated a Google Scholar and a Baidu Index analysis, the Star Network Influence Index, and the top two drama reviews on Mydramalist.com, along with an article written about the drama and scrutiny of China-related blogs and websites. Phase two was field research carried out across Latin Europe, and phase three was social media focused, taking into account that perceptions will be memory-conditioned, based on the recall of past ideas. Overall, the research has shown the poor cultural expression of Chinese entertainment in Latin Europe and demonstrated the absence of Chinese content in French, Italian, Portuguese, and Spanish business-to-consumer retailers; a reflection of their low significance in Latin European markets and of the short life cycle of entertainment products in general: bubble-gum, disposable goods without a mid- to long-term effect on consumers' lives. The process of conducting comprehensive international research was complex and time-consuming: data were not always available in Mandarin, and the researcher's linguistic deficiency, limited knowledge of Chinese culture, and issues of cultural equivalence all weighed on the work. Despite the steps taken to minimize them, theoretical limitations concerning Latin Europe and China still occurred. Data accuracy was disputable; sampling and data collection/analysis methods were heterogeneous; and ascertaining the data requirements and the method of analysis needed to achieve construct equivalence was challenging and laborious to operationalize. Secondary data were also not often readily available in Mandarin; yet, in spite of the array of limitations, the research was done, and results were produced.

Keywords: research methodologies, international research, primary data, secondary data, research limitations, online dramas, China, Latin Europe

Procedia PDF Downloads 59
1558 A Convergent Interacting Particle Method for Computing Kpp Front Speeds in Random Flows

Authors: Tan Zhang, Zhongjian Wang, Jack Xin, Zhiwen Zhang

Abstract:

We aim to efficiently compute the spreading speeds of reaction-diffusion-advection (RDA) fronts in divergence-free random flows under the Kolmogorov-Petrovsky-Piskunov (KPP) nonlinearity. We study a stochastic interacting particle method (IPM) for the reduced principal eigenvalue (Lyapunov exponent) problem of an associated linear advection-diffusion operator with spatially random coefficients. The Fourier representation of the random advection field and the Feynman-Kac (FK) formula of the principal eigenvalue (Lyapunov exponent) form the foundation of our method, implemented as a genetic evolution algorithm. The particles undergo advection-diffusion and mutation/selection through a fitness function originating in the FK semigroup. We analyze the convergence of the algorithm based on operator splitting and present numerical results on representative flows such as the 2D cellular flow and the 3D Arnold-Beltrami-Childress (ABC) flow under random perturbations. The 2D examples serve as a consistency check with semi-Lagrangian computation. The 3D results demonstrate that IPM, being mesh-free and self-adaptive, is simple to implement and efficient for computing front spreading speeds in the advection-dominated regime for high-dimensional random flows on unbounded domains where no truncation is needed.
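
For orientation, the classical KPP variational principle linking the front speed to the principal eigenvalue (Lyapunov exponent) that the abstract refers to can be stated, in generic notation rather than the authors' exact setting, as:

```latex
% KPP front speed via the principal eigenvalue \mu(\lambda) of the
% linearized advection-diffusion operator (generic notation):
c^*(e) = \inf_{\lambda > 0} \frac{\mu(\lambda)}{\lambda},
\qquad
\mu(\lambda) = \lim_{t \to \infty} \frac{1}{t}
  \log \mathbb{E}\!\left[ \exp\!\left( \int_0^t c_\lambda(X_s)\, ds \right) \right],
```

where X_s is the diffusion process associated with the linearized operator and c_lambda a lambda-dependent potential; the second identity is the Feynman-Kac characterization that the interacting particles estimate through their advection-diffusion and mutation/selection dynamics.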

Keywords: KPP front speeds, random flows, Feynman-Kac semigroups, interacting particle method, convergence analysis

Procedia PDF Downloads 27
1557 Measuring E-Learning Effectiveness Using a Three-Way Comparison

Authors: Matthew Montebello

Abstract:

E-learning effectiveness has notoriously been measured within academic settings by comparing the e-learning medium to the traditional face-to-face teaching methodology. In this paper, a simple yet innovative comparison methodology is introduced, whereby the effectiveness of next-generation e-learning systems is assessed in contrast not only to the face-to-face mode but also to the classical e-learning modality. Ethical and logistical issues are also discussed, as this three-way approach to comparing teaching methodologies was applied and documented in a real empirical study within a higher education institution.

Keywords: e-learning effectiveness, higher education, teaching modality comparison

Procedia PDF Downloads 369
1556 Need for Shariah Screening of Companies in Nigeria: Lessons from Other Jurisdictions

Authors: Aishat Abdul-Qadir Zubair

Abstract:

Background: The absence of a Shari'ah screening methodology for companies in Nigeria has deepened the uncertainty surrounding the acceptability of investing in certain companies by people professing the religion of Islam, due to the nature of the activities carried out by these companies. There are existing Shari'ah screening indices in other jurisdictions whose criteria can be used to check whether a company or business is Shari'ah-compliant, such as FTSE, DJIM, and Standard & Poor's, to mention just a few. What these indices have tried to do is ensure that there are benchmarks to check against before investing in companies that carry out mixed activities in their business, wherein some are halal and others may be haram. Purpose: There have been numerous studies on the need to adopt certain screening methodologies, as well as calls for new methods of screening companies for Shari'ah compliance, in order to suit the investment needs of Muslims in other jurisdictions. It is, however, unclear how suitable these methodologies would be for Nigeria. This paper, therefore, seeks to address this gap and to consider an appropriate screening methodology to be employed in Nigeria, drawing from the experience of other jurisdictions. Methods: This study employs a triangulation of quantitative and qualitative methods to analyze the need for Shari'ah screening of companies in Nigeria. The qualitative method is used by way of ijtihad, and this study tries to apply some Islamic principles of Maqasid al-Shari'ah as well as Qawaid al-Fiqhiyyah to analyze the activities of companies in order to ensure that they are indeed Shari'ah-compliant. In addition, using the quantitative data gathered from the interview survey, the perspective of investors with regard to the need for Shari'ah screening of companies in Nigeria is further analyzed. Results: The result of the study shows that there is a lack of awareness among the teeming Muslim population in Nigeria of the need for Shari'ah screening of companies in Nigeria. The result further shows that there is a need to take cognizance of the peculiar nature of company activities in Nigeria before any particular Shari'ah screening methodology is adopted and the necessary benchmarks are set. Conclusion and Implications: The study concludes that there is a need to ensure that conscious Muslims in Nigeria screen companies for Shari'ah compliance so that they can easily identify the companies to invest in. The paper, therefore, recommends that the Nigerian government come up with a screening methodology that suits the peculiar nature of companies in Nigeria. The study thus has direct implications for investment regulatory bodies in Nigeria, such as the Securities and Exchange Commission (SEC) and the Central Bank of Nigeria (CBN), as well as for Muslim investors.
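
For context, existing indices screen companies with financial-ratio tests of roughly this shape; the sketch below uses thresholds commonly cited for DJIM-style screens (one-third ratio caps and a 5% impermissible-income cap) purely as an illustration, not as the methodology the paper proposes for Nigeria.

```python
def shariah_screen(debt, cash_and_interest_securities, receivables,
                   market_cap, impermissible_income, total_revenue):
    """Ratio-based screen in the style of DJIM/FTSE indices (illustrative)."""
    checks = {
        "debt_ratio": debt / market_cap < 1 / 3,
        "cash_ratio": cash_and_interest_securities / market_cap < 1 / 3,
        "receivables_ratio": receivables / market_cap < 1 / 3,
        "income_purity": impermissible_income / total_revenue < 0.05,
    }
    return all(checks.values()), checks

ok, detail = shariah_screen(debt=20e6, cash_and_interest_securities=10e6,
                            receivables=15e6, market_cap=100e6,
                            impermissible_income=2e6, total_revenue=80e6)
print(ok, detail)
```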

Keywords: Shari'ah screening, Muslims, investors, companies

Procedia PDF Downloads 152
1555 Effective Nutrition Label Use on Smartphones

Authors: Vladimir Kulyukin, Tanwir Zaman, Sarat Kiran Andhavarapu

Abstract:

Research on nutrition label use identifies four factors that impede the comprehension and retention of nutrition information by consumers: the label's location on the package, the presentation of information within the label, the label's surface size, and surrounding visual clutter. In this paper, a system is presented that makes nutrition label use more effective for nutrition information comprehension and retention. The system's front end is a smartphone application. The system's back end is a four-node Linux cluster for image recognition and data storage. Image frames captured on the smartphone are sent to the back end for skewed or aligned barcode recognition. When barcodes are recognized, corresponding nutrition labels are retrieved from a cloud database and presented to the user on the smartphone's touchscreen. Each displayed nutrition label is positioned centrally on the touchscreen with no surrounding visual clutter. Wikipedia links to important nutrition terms are embedded to improve comprehension and retention of nutrition information. Standard touch gestures (e.g., zoom in/out) available on mainstream smartphones are used to manipulate the label's surface size. The nutrition label database currently includes 200,000 nutrition labels compiled from public websites by a custom crawler. Stress test experiments with the node cluster are presented. Implications for proactive nutrition management and food policy are discussed.
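
A rough sketch of the client-side flow the abstract describes (send a recognized barcode to the back end, receive the matching label as JSON); the endpoint URL and response fields are hypothetical, not the system's published API.

```python
import requests

BACKEND = "http://example-cluster.local/api/labels"  # hypothetical endpoint

def fetch_label(barcode: str) -> dict:
    """Look up a nutrition label by barcode on the back-end cluster."""
    resp = requests.get(BACKEND, params={"barcode": barcode}, timeout=5)
    resp.raise_for_status()
    return resp.json()  # assumed fields: name, serving_size, nutrients, ...

label = fetch_label("012345678905")
print(label.get("name"), label.get("nutrients"))
```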

Keywords: mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning

Procedia PDF Downloads 352
1554 Artificially Intelligent Context Aware Personal Computer Assistant (ACPCA)

Authors: Abdul Mannan Akhtar

Abstract:

In this paper, a novel concept of a self-learning, smart, personalized computer assistant (ACPCA) is established, which is a context-aware system. Based on user habits, moods, and other routine/situational reactions, the system will manage various services and suggestions at appropriate times, including what schedule to follow, what to watch, what software to use, what should be deleted, etc. This system will utilize a hybrid fuzzy-neural model to predict what the user will do next and support his actions. This will be done by establishing fuzzy sets of user activities, choices, preferences, etc., and utilizing their combinations to predict his moods and immediate preferences. Various applications of context-aware systems exist separately, e.g., on certain websites for music or multimedia suggestions, but a personalized autonomous system that can adapt to the user's personality does not exist at present. Due to the novelty and massiveness of this concept, this paper primarily focuses on the problem establishment, product features, and functionality; however, a small mini case is also implemented in MATLAB to demonstrate some of the aspects of ACPCA. The mini case involves the prediction of user mood, activity, routine, and food preference using a hybrid fuzzy-neural soft computing technique.

Keywords: context aware systems, ACPCA, soft computing techniques, artificial intelligence, fuzzy logic, neural network, mood detection, face detection, activity detection

Procedia PDF Downloads 449
1553 Context-Aware Alert Method in Hajj Pilgrim Location-Based Tracking System

Authors: Syarif Hidayat

Abstract:

As millions of people with different backgrounds perform Hajj every year in Saudi Arabia, several problems arise. Missing persons are among the many crucial problems that need to be countered. Some people might have insufficient knowledge of how to use tracking system equipment. Others might become victims of an accident, lose consciousness, or even die, prohibiting them from performing certain activities. For those reasons, such people cannot send a proper SOS message. The major contribution of this paper is the application of a diverse alert method in a pilgrim tracking system. It offers a simple yet robust solution for pilgrims to send an SOS message during Hajj. Knowledge of context-aware computing is assumed herein. This study presents four methods that could be utilized by pilgrims to send an SOS. The first method is a simple mobile application containing only a button. The second method is based on behavior analysis of GPS location movement anomalies. The third method introduces a pressing pattern on the smartwatch's physical button as a panic button. The fourth method identifies certain accelerometer patterns as signs of emergency situations. The methods presented in this paper would be an important part of a pilgrim tracking system. The discussion provided here includes an easy-to-use design that maintains the tracking accuracy, privacy, and security of its users.
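
A minimal sketch of the third method (treating a rapid press pattern on a physical button as a panic signal); the press count and time window are illustrative assumptions, not the paper's chosen pattern.

```python
from collections import deque

class PanicButton:
    """Flag an SOS when N presses arrive within a short time window."""

    def __init__(self, presses=3, window_s=2.0):
        self.presses, self.window_s = presses, window_s
        self.times = deque(maxlen=presses)

    def on_press(self, t: float) -> bool:
        self.times.append(t)
        return (len(self.times) == self.presses
                and t - self.times[0] <= self.window_s)

btn = PanicButton()
for t in (0.0, 0.6, 1.1):             # three presses within ~1.1 s
    if btn.on_press(t):
        print("SOS triggered at", t)  # would dispatch location and alert
```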

Keywords: context aware computing, emergency alert system, GPS, hajj pilgrim tracking, location-based services

Procedia PDF Downloads 200
1552 Educational Knowledge Transfer in Indigenous Mexican Areas Using Cloud Computing

Authors: L. R. Valencia Pérez, J. M. Peña Aguilar, A. Lamadrid Álvarez, A. Pastrana Palma, H. F. Valencia Pérez, M. Vivanco Vargas

Abstract:

This work proposes a Cooperative-Competitive (Coopetitive) approach that allows coordinated work among the Secretary of Public Education (SEP), the Autonomous University of Querétaro (UAQ), and government funds from the National Council for Science and Technology (CONACYT) or other international organizations, to work on an overall knowledge transfer strategy with e-learning over the cloud, where experts in junior high and high school education, working in multidisciplinary teams, perform analysis, evaluation, design, production, validation, and knowledge transfer at large scale using a cloud computing platform. This allows teachers and students to have all the information required to ensure nationally homologated knowledge of topics such as mathematics, statistics, chemistry, history, ethics, civics, etc. The work will start with a pilot test in Spanish and initially in two regional dialects, Otomí and Náhuatl. Otomí has more than 285,000 indigenous speakers in Querétaro and Mexico's central region. Náhuatl is the most widely spoken indigenous dialect in Mexico, with more than 1,550,000 speakers. Phase one of the project takes into account negotiations with indigenous tribes from different regions and the information and communication technologies needed to deliver the knowledge to the indigenous schools in their native dialect. The methodology includes the following main milestones: identification of the indigenous areas where Otomí and Náhuatl are the spoken dialects, research with the SEP on the location of actual indigenous schools, analysis and inventory of current school conditions, negotiation with tribe chiefs, analysis of the technological communication requirements to reach the indigenous communities, identification and inventory of local teachers' technology knowledge, selection of a pilot topic, analysis of actual student competence with the traditional education system, identification of local translators, design of the e-learning platform, design of the multimedia resources and a storage strategy for cloud computing, translation of the topic into both dialects, indigenous teacher training, a pilot test, course release, project follow-up, analysis of student requirements for the new technological platform, and definition of a new and improved proposal with greater reach in topics and regions. The importance of phase one of the project is manifold: it includes the proposal of a working technological scheme and focuses on the cultural impact in Mexico, so that indigenous tribes can improve their knowledge about new forms of crop improvement, home storage technologies, proven home remedies for common diseases, and ways of preparing foods containing major nutrients, disclosing the strengths and weaknesses of each region, communicating through cloud computing platforms offering regional products, and opening communication spaces for inter-indigenous cultural exchange.

Keywords: Mexican indigenous tribes, education, knowledge transfer, cloud computing, Otomí, Náhuatl, language

Procedia PDF Downloads 388