Search results for: pen and paper or computer
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25729

25459 Video Based Ambient Smoke Detection By Detecting Directional Contrast Decrease

Authors: Omair Ghori, Anton Stadler, Stefan Wilk, Wolfgang Effelsberg

Abstract:

Fire-related incidents account for extensive loss of life and material damage, so quick and reliable detection of fires has significant real-world implications. Whereas a major research focus lies on the detection of outdoor fires, indoor camera-based fire detection is still an open issue. Cameras combined with computer vision help detect flames and smoke more quickly than conventional fire detectors. In this work, we present a computer vision-based smoke detection algorithm based on contrast changes and a multi-step classification. This approach accelerates computer vision-based fire detection considerably in comparison with classical indoor fire detection.
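
As a rough illustration of the contrast-decrease cue described above, the following sketch (assuming grayscale frames as NumPy arrays, with an illustrative block size and drop threshold not taken from the paper) flags image blocks whose local contrast falls between consecutive frames:

```python
import numpy as np

def block_contrast(gray, block=16):
    """Standard deviation of pixel intensities per non-overlapping block."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block
    g = gray[:h, :w].reshape(h // block, block, w // block, block)
    return g.std(axis=(1, 3))

def smoke_candidates(prev_gray, curr_gray, block=16, drop_ratio=0.7):
    """Flag blocks whose local contrast has dropped markedly between frames.

    Smoke tends to blur edges, so a sustained decrease in local contrast
    is used as a first-stage cue before further classification.
    """
    c_prev = block_contrast(prev_gray, block)
    c_curr = block_contrast(curr_gray, block)
    return (c_curr < drop_ratio * c_prev) & (c_prev > 1e-3)

# Example with synthetic frames: the second frame is locally smoothed ("smoky").
rng = np.random.default_rng(0)
frame1 = rng.integers(0, 255, (128, 128)).astype(float)
frame2 = frame1.copy()
frame2[:64, :64] = frame2[:64, :64].mean()   # contrast collapses in one corner
print(smoke_candidates(frame1, frame2).sum(), "candidate blocks")
```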

Keywords: contrast analysis, early fire detection, video smoke detection, video surveillance

Procedia PDF Downloads 406
25458 The Application and Applicability of Computer System to Financial Management: A Case Study of College of Education, Oju, Benue State, Nigeria

Authors: Agih Ukuru Agih

Abstract:

This work is an appraisal of the application and applicability of a computer system to financial management for improving the speed, performance, accuracy, and efficiency of the College of Education, Oju. The computerization of financial management, a recent development that provides authentic and dedicated balancing of accounting records, would be of enormous benefit to the college. The core objective of this project is to recommend software that matches a computerized institution, making for improved service and reducing fraud and the mishandling of funds and financial records in the College of Education, Oju. Considering the major impact of globalization on computerized financial management, the study recommends, among other things, that the College of Education, Oju should adopt a positive attitude towards computerized financial management in the institution.

Keywords: computer system, balancing, accounting records, computerized financial management

Procedia PDF Downloads 347
25457 Development of a Real-Time Brain-Computer Interface for Interactive Robot Therapy: An Exploration of EEG and EMG Features during Hypnosis

Authors: Maryam Alimardani, Kazuo Hiraki

Abstract:

This study presents a framework for the development of a new generation of therapy robots that can interact with users by monitoring their physiological and mental states. Here, we focus on one of the more controversial methods of therapy, hypnotherapy. Hypnosis has been shown to be useful in the treatment of many clinical conditions, but even for healthy people it can be an effective technique for relaxation or for enhancement of memory and concentration. Our aim is to develop a robot that collects information about the user’s mental and physical states using electroencephalogram (EEG) and electromyography (EMG) signals and performs cost-effective hypnosis in the comfort of the user’s home. The presented framework consists of three main steps: (1) find the EEG correlates of mind state before, during, and after hypnosis and establish a cognitive model for state changes, (2) develop a system that can track the changes in EEG and EMG activities in real time and determine whether the user is ready for suggestion, and (3) implement the system in a humanoid robot that will talk to users and conduct hypnosis based on their mental states. This paper presents a pilot study addressing the first stage, detection of EEG and EMG features during hypnosis.
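
For the first stage, EEG feature extraction, a minimal sketch of the kind of band-power feature that could be tracked in real time is shown below; the sampling rate, frequency bands, and theta/beta ratio are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Average power of a single-channel EEG signal in a frequency band (Hz)."""
    f, pxx = welch(eeg, fs=fs, nperseg=fs * 2)
    lo, hi = band
    return pxx[(f >= lo) & (f <= hi)].mean()

def relaxation_index(eeg, fs=256):
    """Illustrative theta/beta ratio; higher values are read here as 'more relaxed'."""
    theta = band_power(eeg, fs, (4, 8))
    beta = band_power(eeg, fs, (13, 30))
    return theta / beta

# Synthetic 10 s signal: a dominant 6 Hz (theta) component plus noise.
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)
print(f"relaxation index: {relaxation_index(eeg, fs):.2f}")
```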

Keywords: hypnosis, EEG, robotherapy, brain-computer interface (BCI)

Procedia PDF Downloads 232
25456 Wait-Optimized Scheduler Algorithm for Efficient Process Scheduling in Computer Systems

Authors: Md Habibur Rahman, Jaeho Kim

Abstract:

Efficient process scheduling is a crucial factor in ensuring optimal system performance and resource utilization in computer systems. While various algorithms have been proposed over the years, there are still limitations to their effectiveness. This paper introduces a new Wait-Optimized Scheduler (WOS) algorithm that aims to minimize process waiting time by dividing processes into two layers and considering both process time and waiting time. The WOS algorithm is non-preemptive and prioritizes the processes with the shortest WOS metric. In the first layer, each process runs for a predetermined duration, and any unfinished process is subsequently moved to the second layer, resulting in a decrease in response time. Whenever the first layer is free or the number of processes in the second layer is twice that of the first layer, the algorithm sorts all the processes in the second layer by their remaining time minus waiting time and sends one process to the first layer to run. This ensures that all processes eventually run, optimizing waiting time. To evaluate the performance of the WOS algorithm, we conducted experiments comparing it with traditional scheduling algorithms such as First-Come-First-Serve (FCFS) and Shortest-Job-First (SJF). The results showed that the WOS algorithm outperformed the traditional algorithms in reducing the waiting time of processes, particularly in scenarios with a large number of short tasks and long wait times. Our study highlights the effectiveness of the WOS algorithm in improving process scheduling efficiency in computer systems. By reducing process waiting time, the WOS algorithm can improve system performance and resource utilization. The findings of this study provide valuable insights for researchers and practitioners developing and implementing efficient process scheduling algorithms.
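
A simplified, non-authoritative sketch of the two-layer idea described above is given below; the time quantum, the waiting-time bookkeeping, and the promotion details are assumptions rather than the authors' specification:

```python
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Proc:
    name: str
    burst: int                      # total CPU time required
    remaining: int = field(init=False)
    waiting: int = 0
    def __post_init__(self):
        self.remaining = self.burst

def wos_schedule(procs, quantum=4):
    """Simplified two-layer Wait-Optimized Scheduler sketch (non-preemptive runs)."""
    layer1 = deque(procs)           # each process first gets one fixed-duration run
    layer2 = []
    clock, order = 0, []
    while layer1 or layer2:
        # Promote from layer 2 when layer 1 is free or layer 2 is twice as large,
        # picking the process with the smallest (remaining time - waiting time).
        if layer2 and (not layer1 or len(layer2) >= 2 * len(layer1)):
            layer2.sort(key=lambda p: p.remaining - p.waiting)
            layer1.append(layer2.pop(0))
        p = layer1.popleft()
        run = min(quantum, p.remaining)
        for q in list(layer1) + layer2:   # everyone else accumulates waiting time
            q.waiting += run
        clock += run
        p.remaining -= run
        order.append((p.name, clock))
        if p.remaining > 0:
            layer2.append(p)              # unfinished work moves to the second layer
    return order

if __name__ == "__main__":
    jobs = [Proc("A", 6), Proc("B", 3), Proc("C", 10), Proc("D", 2)]
    for name, t in wos_schedule(jobs):
        print(f"{name} ran until t={t}")
```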

Keywords: process scheduling, wait-optimized scheduler, response time, non-preemptive, waiting time, traditional scheduling algorithms, first-come-first-serve, shortest-job-first, system performance, resource utilization

Procedia PDF Downloads 63
25455 Computer-Aided Diagnosis of Polycystic Kidney Disease Using ANN

Authors: G. Anjan Babu, G. Sumana, M. Rajasekhar

Abstract:

Many inherited diseases and non-hereditary disorders are common in the development of renal cystic diseases. Polycystic kidney disease (PKD) is a disorder in which clusters of cysts filled with a water-like fluid develop within the kidneys. PKD is responsible for 5-10% of the end-stage renal failure treated by dialysis or transplantation. New experimental models and the application of molecular biology techniques have provided new insights into the pathogenesis of PKD. Researchers are showing keen interest in developing automated systems that apply computer-aided techniques to the diagnosis of diseases. In this paper, a multi-layered feed-forward neural network with one hidden layer is constructed, trained, and tested by applying the back-propagation learning rule for the diagnosis of PKD based on physical symptoms and urinalysis test results collected from individual patients. The data collected from 50 patients are used to train and test the network: 75% of the data are used for training and the remaining 25% for testing. Furthermore, the trained network is applied to new samples, and its output indicates whether the patient's condition is normal or abnormal.
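
A minimal sketch of such a one-hidden-layer network with a 75/25 split, using scikit-learn and synthetic stand-in data (the real patient features are not available here), might look like this:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the symptom/urinalysis features of 50 patients;
# 8 features and a binary label (0 = normal, 1 = abnormal), purely illustrative.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))
y = (X[:, 0] + X[:, 3] > 0).astype(int)

# 75% training / 25% testing split, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# One hidden layer, trained with gradient-based back-propagation.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))

# Applying the trained network to a new sample.
new_patient = rng.normal(size=(1, 8))
print("prediction:", "abnormal" if clf.predict(new_patient)[0] else "normal")
```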

Keywords: dialysis, hereditary, transplantation, polycystic, pathogenesis

Procedia PDF Downloads 353
25454 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances towards greater computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the use of generative design with artificial intelligence to build better models that adapt operations such as selection, mutation, and crossover to generate results. The human mind tends to think of the simplest approach while designing an object, but the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and iterates over multiple candidate solutions to arrive at a sturdy design with the most suitable parameters, saving large amounts of time and resources. The new production techniques at our disposal, such as additive manufacturing and 3D printing, allow us to save resources and produce artistically engineered CAD models. The paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry of processes that have evolved over millions of years. The computer uses parametric models to generate newer models in an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with the topology optimization technology previously used to generate CAD models. Finally, the paper reports the performance of the algorithms and shows how they help in designing resource-efficient models.
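
A toy, single-parameter genetic algorithm illustrating the selection, crossover, and mutation loop mentioned above is sketched below; the objective function and operator choices are illustrative assumptions (real generative design works on full parametric CAD models, typically with non-domination sorting over multiple objectives):

```python
import random

def fitness(x):
    """Toy objective standing in for a CAD design score (to be maximized)."""
    return -(x - 3.7) ** 2

def evolve(pop_size=30, generations=60, mutation_rate=0.2):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # Crossover: average two random parents; mutation: small random jitter.
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2
            if random.random() < mutation_rate:
                child += random.gauss(0, 1)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print("best design parameter:", round(evolve(), 3))
```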

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 118
25453 Control and Automation of Sensors in Metering System of Fluid

Authors: Abdelkader Harrouz, Omar Harrouz, Ali Benatiallah

Abstract:

This paper presents the essential definitions, roles, and characteristics of the automation of a metering system. We discuss measurement, data acquisition, and metrological control of a signal sensor from a dynamic metering system. We then present the control of the instruments of a fluid metering system, with more detailed discussion of the reference standards.

Keywords: communication, metering, computer, sensor

Procedia PDF Downloads 519
25452 Chinese Vocabulary Acquisition and Mobile Assisted Language Learning

Authors: Yuqing Sun

Abstract:

Chinese has been regarded as one of the most difficult languages to learn due to its complex spelling structure, difficult pronunciation, and varying word forms. Since vocabulary acquisition is the basic process in acquiring a language, expressing oneself, composing a sentence, and conducting a conversation, learning vocabulary is of great importance. However, vocabulary involves pronunciation, spelling, recognition, and application, which can amount to an enormous workload. This poses a question for language teachers in China who teach Chinese to foreign students: how can it be taught effectively? Traditionally, teachers had no choice but to teach it all by themselves; with the development of technology, they could then use the computer as a tool to help them (Computer Assisted Language Learning, or CALL). Now they are moving to the Mobile Assisted Language Learning (MALL) method to guide their teaching, and the appraisal of this move is convincing. MALL diversifies the learning material and the modes of output, which can activate learners’ curiosity and accelerate their understanding. This paper focuses on case studies from universities in China that teach Chinese to foreign students, and analyzes the use of the WeChat channel as an example of the MALL model to explore the active role of MALL in enhancing the effectiveness of Chinese vocabulary acquisition.

Keywords: Chinese, vocabulary acquisition, MALL, case

Procedia PDF Downloads 382
25451 Efficacy of Isometric Neck Exercises and Stretching with Ergonomics for Neck Pain in Computer Professionals

Authors: Esther Liyanage, Indrajith Liyanage, Masih Khan

Abstract:

Neck pain has become a common epidemiological problem. One of the reasons for this is a sedentary way of life connected with using a personal computer during most daily activities, with workplaces and work durations that have not been properly adapted to the physical condition of these employees. During the 1990s, the importance of workstation design and work methods, or ergonomics, for health was brought to the forefront of public attention. Ergonomics is the application of scientific information concerning humans to the design of objects. Ergonomic intervention results in improved working posture and a decrease in the prevalence of musculoskeletal symptoms. Stretching and resistance exercises for the neck are easy to do and, when performed one to two times daily, reduce discomfort and ease neck stiffness. This study is aimed at finding whether ergonomics combined with neck exercises proves beneficial in reducing neck pain in computer professionals. The outcome measures used were the Oswestry neck disability index and the VAS score for pain. 100 subjects satisfying the inclusion criteria were included in the study. Results: ergonomic intervention along with isometric neck exercises and stretching proved to reduce neck pain and disability among computer professionals.

Keywords: ergonomics, neck pain, neck exercises, physiotherapy for neck pain

Procedia PDF Downloads 290
25450 Reliability Analysis of Computer Centre at Yobe State University Using LRU Algorithm

Authors: V. V. Singh, Yusuf Ibrahim Gwanda, Rajesh Prasad

Abstract:

In this paper, we focus on the reliability and performance analysis of the Computer Centre (CC) at Yobe State University, Damaturu, Nigeria. The CC consists of three servers: one database mail server, one redundant server, and one shared with the client computers in the CC (called a local server). Observing the different possibilities for the functioning of the CC, the analysis evaluates the most popular measures of reliability, such as availability, reliability, mean time to failure (MTTF), and the profit resulting from the operation of the system. The system can ultimately fail due to failure of the router, failure of the redundant server before the mail server is repaired, or switch failure. The system can also partially fail when a local server fails. Failed devices are restored according to the Least Recently Used (LRU) technique. The system can also fail entirely due to a cooling failure of the server, an electricity failure, or a natural calamity such as an earthquake, fire, or tsunami. All failure rates are assumed to be constant and follow an exponential time distribution, while repair follows two types of distributions: general and the Gumbel-Hougaard family copula distribution.
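
For the restoration order, a generic sketch of Least Recently Used bookkeeping is shown below; it illustrates the LRU policy only and is not the authors' reliability model (which couples LRU restoration with general and copula-based repair distributions):

```python
from collections import OrderedDict

class LRUTracker:
    """Track components by recency of use; the least recently used one is
    selected first (here, as the next candidate for restoration)."""

    def __init__(self):
        self._order = OrderedDict()

    def touch(self, component):
        # Re-inserting moves the component to the most-recently-used end.
        self._order.pop(component, None)
        self._order[component] = True

    def least_recently_used(self):
        return next(iter(self._order)) if self._order else None

lru = LRUTracker()
for event in ["mail_server", "local_server", "router", "mail_server"]:
    lru.touch(event)
print("restore first:", lru.least_recently_used())   # -> "local_server"
```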

Keywords: reliability, availability, Gumbel-Hougaard family copula, MTTF, internet data centre

Procedia PDF Downloads 501
25449 Prevention of Road Accidents by Computerized Drowsiness Detection System

Authors: Ujjal Chattaraj, P. C. Dasbebartta, S. Bhuyan

Abstract:

This paper proposes a method to detect the state of the driver’s eyes using the concept of face detection. Three major contributing methods can rapidly process the facial image framework and hence produce results that can drive pre-programmed vehicle reactions for traffic safety. This paper compares and analyses the methods on the basis of their reaction time and their ability to deal with fluctuating images of the driver. The program used in this study is simple and efficient, built using the AdaBoost learning algorithm. Through this program, the system is able to discard background regions and focus on face-like regions. The results are analyzed on an ordinary computer, which makes the approach feasible for end users. The application domain of this work is quite wide, including detection of drowsiness or the influence of alcohol in drivers, as well as identification.
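
As an illustration of AdaBoost-based detection, the sketch below uses OpenCV's Haar cascades (which are trained with AdaBoost) to check whether a detected face currently shows no eyes; the frame-count threshold is an assumed, illustrative value and this is not the authors' program:

```python
import cv2

# Haar cascades shipped with OpenCV are trained with the AdaBoost algorithm.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def eyes_closed(frame_bgr):
    """Return True when a face is found but no eyes are detected in it."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        return len(eyes) == 0
    return False

cap = cv2.VideoCapture(0)            # driver-facing camera
closed_frames, DROWSY_LIMIT = 0, 15  # ~0.5 s at 30 fps (illustrative threshold)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    closed_frames = closed_frames + 1 if eyes_closed(frame) else 0
    if closed_frames >= DROWSY_LIMIT:
        print("Drowsiness suspected - trigger alert")
        closed_frames = 0
cap.release()
```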

Keywords: AdaBoost learning algorithm, face detection, framework, traffic safety

Procedia PDF Downloads 133
25448 A Real-World Roadmap and Exploration of Quantum Computers Capacity to Trivialise Internet Security

Authors: James Andrew Fitzjohn

Abstract:

This paper intends to discuss and explore the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been shown and well described both in academic papers and headline-grabbing news articles, but with all theory and hyperbole, we must be careful to assess the practicalities of these claims. Therefore, we will use real-world devices and proof of concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process as many advances in quantum computing hardware/software have recently been made. This paper will set expectations regarding the useful lifespan of RSA and cipher lengths and propose alternative encryption technologies. We will set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. The cost will also be factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites who can take appropriate actions to improve the security of their provisions.
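
As a reminder of what is at stake, the toy example below uses deliberately tiny primes to show that knowing the factorization of the RSA modulus, which is exactly what Shor's algorithm would deliver at scale, is enough to recover the private key; this is illustrative arithmetic, not the paper's proof of concept code:

```python
# Toy RSA with deliberately tiny primes. Breaking real RSA requires recovering
# p and q from n; Shor's algorithm would do exactly that in polynomial time.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                                  # public exponent
d = pow(e, -1, phi)                     # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)         # encrypt with the public key (n, e)
print("decrypted:", pow(ciphertext, d, n))

# An attacker who factors n (trivial here, astronomically hard classically for
# a 2048-bit n) can rebuild d and read the message.
for candidate in range(2, n):
    if n % candidate == 0:
        p_found, q_found = candidate, n // candidate
        break
d_attack = pow(e, -1, (p_found - 1) * (q_found - 1))
print("recovered by factoring:", pow(ciphertext, d_attack, n))
```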

Keywords: quantum computing, encryption, RSA, roadmap, real world

Procedia PDF Downloads 96
25447 Design of Mobile Teaching for Students' Collaborative Learning in Distance Higher Education

Authors: Lisbeth Amhag

Abstract:

The aim of the study is to describe and analyze the design of mobile teaching for students' collaborative learning in distance higher education, with a focus on mobile technologies such as online webinars (web-based seminars or conferencing) accessed with laptops, smartphones, or tablets. These multimedia tools can provide face-to-face interaction, recorded flipped-classroom videos, and parallel chat communication. The data collection consists of interviews with 22 students and observations of online face-to-face webinars, as well as two surveys. Theoretically, the study joins the research tradition of Computer Supported Collaborative Learning (CSCL), as well as Computer Self-Efficacy (CSE), concerned with individuals’ media and information literacy. Important conclusions from the study demonstrate that mobile interaction increased student-centered learning. As the students appreciated the working methods, they became more engaged and motivated. The use of mobile technology among students also contributes to increased flexibility between space and place, as well as to media and information literacy.

Keywords: computer self-efficacy, computer supported collaborative learning, distance and open learning, educational design and technologies, media and information literacy, mobile learning

Procedia PDF Downloads 333
25446 Moving from Computer Assisted Language Learning to Mobile Assisted Language Learning Edutainment: A Trend for Teaching and Learning

Authors: Ahmad Almohana

Abstract:

Technology has led to rapid changes in the world, and most importantly to education, particularly in the 21st century. Technology has enhanced teachers’ potential and has resulted in greater interaction and choice for learners. In addition, technology is helping to improve individuals’ learning experiences and build their capacity to read, listen, speak, search, analyse, memorise and encode languages, as well as bringing learners together and creating a sense of greater involvement. This paper is organised in the following way: the first section provides a review of the literature related to the implementation of CALL (computer assisted language learning), explains CALL and its phases, and highlights and analyses Warschauer’s article. The second section describes the move from CALL to mobilised systems of edutainment, which challenge existing forms of teaching and learning; it also addresses the role of the teacher and the curriculum content, and how these are affected by the computerisation of learning that is taking place. Finally, an empirical study was conducted to collect data from teachers in Saudi Arabia using quantitative and qualitative tools. Connections are made between the area of study and the personal experience of the researcher carrying out the study, with a methodological reflection on the challenges faced by teachers working within the same system. The major finding was that, despite the circumstances in which students and lecturers currently work, the participants revealed themselves to be highly intelligent and articulate individuals who were constrained from revealing this criticality and creativity by the system of learning and teaching operating in most schools.

Keywords: CALL, computer assisted language learning, EFL, English as a foreign language, ELT, English language teaching, ETL, enhanced technology learning, MALL, mobile assisted language learning

Procedia PDF Downloads 134
25445 Presenting Internals of Networks Using Bare Machine Technology

Authors: Joel Weymouth, Ramesh K. Karne, Alexander L. Wijesinha

Abstract:

The Bare Machine Internet is part of the Bare Machine Computing (BMC) paradigm. It is used to program applications that run directly on a device: software that runs directly against the hardware using the CPU, memory, and I/O, without an operating system or resident mass storage. The Bare Machine Internet utilizes an application development model in which software interfaces directly with the hardware on a network server and file server. Because it is “bare,” it is a powerful teaching and research tool that can readily display the internals of the network protocols, software, and hardware of the applications running on the bare server. It was also demonstrated that the bare server was accessible from a laptop and from a smartphone/Android device. The purpose was to show the further practicality of the Bare Internet in computer engineering and computer science education and research, and to show that an undergraduate student could take advantage of a bare server with any device and any browser, at any release version, connected to the Internet. This paper presents the Bare Web Server as an educational tool and discusses possible applications of this paradigm.

Keywords: bare machine computing, online research, network technology, visualizing network internals

Procedia PDF Downloads 139
25444 A Brain Controlled Robotic Gait Trainer for Neurorehabilitation

Authors: Qazi Umer Jamil, Abubakr Siddique, Mubeen Ur Rehman, Nida Aziz, Mohsin I. Tiwana

Abstract:

This paper discusses a brain-controlled robotic gait trainer for the neurorehabilitation of Spinal Cord Injury (SCI) patients. Patients suffering from spinal cord injuries become unable to control the motion of their lower extremities due to the degeneration of spinal cord neurons. The presented approach can help SCI patients in neurorehabilitation training by directly translating patient motor imagery into walker motion commands, thus bypassing spinal cord neurons completely. A non-invasive EEG-based brain-computer interface is used for capturing patient neural activity. For signal processing and classification, the open-source software OpenViBE is used. Classifiers categorize the patient motor imagery (MI) into a specific set of commands that are further translated into walker motion commands. The robotic walker also employs fall detection to ensure the safety of the patient during gait training and can act as a support for SCI patients. The gait trainer was tested with subjects, and satisfactory results were achieved.

Keywords: brain computer interface (BCI), gait trainer, spinal cord injury (SCI), neurorehabilitation

Procedia PDF Downloads 131
25443 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of hydrocarbon deposits on land in the Kaliningrad region is leading to active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Due to the heterogeneity of reservoir rocks in the various open fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are carried out. The key element is the use of an effective technique of computer reserve modeling at the first stage of processing the received data. The following step uses this information for cluster analysis, which makes it possible to optimize the field development approaches. The article analyzes the effectiveness of various methods for calculating reserves and for computer modelling of offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for mining the deposits. A relationship is observed between the accuracy of the calculation of recoverable reserves and the need to modernize the existing mining infrastructure, as well as to optimize the scheme for opening and developing the oil deposits.

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 139
25442 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics

Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood

Abstract:

We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
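
A minimal sketch of the general pattern of distributing per-file evidence work with Spark is shown below; the file paths are placeholders and this is not the DFORC2 code itself:

```python
import hashlib
from pyspark.sql import SparkSession

def sha256_of(path):
    """Hash one evidence file; each task handles one file independently."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

spark = SparkSession.builder.appName("evidence-hashing").getOrCreate()
evidence_files = ["/evidence/img001.dd", "/evidence/img002.dd"]  # placeholder paths

# Distribute the per-file work across the cluster's executors.
results = spark.sparkContext.parallelize(evidence_files).map(sha256_of).collect()
for path, digest in results:
    print(path, digest)
spark.stop()
```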

Keywords: digital forensics, cloud computing, cyber security, Spark, Kubernetes, Kafka

Procedia PDF Downloads 367
25441 Multi-Spectral Deep Learning Models for Forest Fire Detection

Authors: Smitha Haridasan, Zelalem Demissie, Atri Dutta, Ajita Rattani

Abstract:

Aided by the wind, all it takes is one ember and a few minutes to create a wildfire. Wildfires are growing in frequency and size due to climate change, and wildfires and their consequences are among the major environmental concerns. Every year, millions of hectares of forest are destroyed around the world, causing mass destruction and human casualties. Early detection of wildfires is therefore a critical component of mitigating this threat. Many computer vision-based techniques have been proposed for the early detection of forest fires using video surveillance, and several methods predict and detect forest fires in various spectrums, namely RGB, HSV, and YCbCr. The aim of this paper is to propose a multi-spectral deep learning model that combines information from different spectrums at intermediate layers for accurate fire detection. A heterogeneous dataset assembled from publicly available datasets is used for model training and evaluation in this study. The experimental results show that multi-spectral deep learning models can obtain an improvement of about 4.68% over those based on a single spectrum for fire detection.
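
A minimal sketch of intermediate-layer fusion of two spectra (here RGB and HSV branches concatenated before a shared head) is shown below; the architecture and sizes are illustrative assumptions, not the model evaluated in the paper:

```python
import torch
import torch.nn as nn

class MultiSpectralFireNet(nn.Module):
    """Two small CNN branches (e.g., RGB and HSV inputs) fused at an
    intermediate layer before a shared classification head."""

    def __init__(self):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
        self.rgb_branch = branch()
        self.hsv_branch = branch()
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(64, 2))  # fire / no fire

    def forward(self, rgb, hsv):
        fused = torch.cat([self.rgb_branch(rgb), self.hsv_branch(hsv)], dim=1)
        return self.head(fused)

model = MultiSpectralFireNet()
rgb = torch.randn(4, 3, 64, 64)   # batch of RGB frames
hsv = torch.randn(4, 3, 64, 64)   # same frames converted to HSV
print(model(rgb, hsv).shape)      # torch.Size([4, 2])
```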

Keywords: deep learning, forest fire detection, multi-spectral learning, natural hazard detection

Procedia PDF Downloads 203
25440 Simulation with Uncertainties of Active Controlled Vibration Isolation System for Astronaut’s Exercise Platform

Authors: Shield B. Lin, Ziraguen O. Williams

Abstract:

In a task to assist NASA in analyzing the dynamic forces caused by operational countermeasures of an astronaut’s exercise platform impacting the spacecraft, an active proportional-integral-derivative controller commanding a linear actuator is proposed in a vibration isolation system to regulate the movement of the exercise platform. Computer simulation shows promising results: most exciter forces can be reduced or even eliminated. This paper emphasizes parameter uncertainties, parameter variations, and exciter force variations. Drift and variations of system parameters in the vibration isolation system for the astronaut’s exercise platform are analyzed. An active control scheme is applied with the goals of reducing the platform displacement and minimizing the force transmitted to the spacecraft structure. The controller must be robust enough to accommodate wide variations in system parameters and exciter forces. Computer simulation of the vibration isolation system was performed via MATLAB/Simulink and Trick. The simulation results demonstrate that force reduction is achieved with small platform displacement under wide ranges of variations in system parameters.
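
A rough, self-contained sketch of the idea, a PID-commanded actuator force regulating a single-degree-of-freedom platform under a periodic exciter force, is given below; the mass, stiffness, damping, gains, and excitation are placeholder assumptions, not the NASA/Trick model:

```python
import numpy as np

# Illustrative 1-DOF platform: mass on a spring-damper, with an exciter force
# f_ext and a PID-commanded actuator force trying to keep displacement x near zero.
m, k, c = 150.0, 2.0e4, 400.0            # assumed mass, stiffness, damping
kp, ki, kd = 5.0e4, 1.0e4, 2.0e3         # assumed PID gains
dt, t_end = 1e-3, 5.0

x, v, integral, prev_err = 0.0, 0.0, 0.0, 0.0
max_disp = 0.0
for step in range(int(t_end / dt)):
    t = step * dt
    f_ext = 300.0 * np.sin(2 * np.pi * 2.0 * t)   # exercise-induced excitation
    err = 0.0 - x                                  # regulate displacement to zero
    integral += err * dt
    derivative = (err - prev_err) / dt
    f_act = kp * err + ki * integral + kd * derivative
    prev_err = err
    a = (f_ext + f_act - c * v - k * x) / m        # Newton's second law
    v += a * dt                                    # semi-implicit Euler integration
    x += v * dt
    max_disp = max(max_disp, abs(x))

print(f"peak platform displacement: {max_disp * 1000:.2f} mm")
```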

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 113
25439 F-VarNet: Fast Variational Network for MRI Reconstruction

Authors: Omer Cahana, Maya Herman, Ofer Levi

Abstract:

Magnetic resonance imaging (MRI) is a medical scan with a long acquisition time, which is mainly due to the traditional sampling theorem defining a lower bound on sampling. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. In order to achieve that, two properties must hold: i) the signal must be sparse under a known transform domain, and ii) the sampling must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in the deep learning (DL) field, which has demonstrated tremendous success in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of a state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet by using dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplify the sensitivity map estimation (SME), which holds many layers unnecessary for this task. These improvements yield significant decreases in computational cost as well as higher accuracy.
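
A minimal sketch of the multi-scale dilated-convolution idea is shown below; the channel counts and dilation rates are illustrative assumptions, not the F-VarNet configuration:

```python
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    """Parallel dilated convolutions enlarge the receptive field without
    enlarging the kernels; the branches are then merged into one feature map."""

    def __init__(self, channels=32):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in (1, 2, 4)
        ])
        self.merge = nn.Conv2d(3 * channels, channels, kernel_size=1)

    def forward(self, x):
        return self.merge(torch.cat([b(x) for b in self.branches], dim=1))

x = torch.randn(1, 32, 64, 64)            # stand-in for intermediate image features
print(MultiScaleDilatedBlock()(x).shape)  # torch.Size([1, 32, 64, 64])
```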

Keywords: MRI, deep learning, variational network, computer vision, compressed sensing

Procedia PDF Downloads 114
25438 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of several example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect performance in order to identify possible changes that would improve it. The results are discussed in relation to the factors contributing to performance degradation.
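
For reference, the serial form of the benchmark's kernel, a level-synchronous breadth-first search, is sketched below; parallel versions partition each frontier across OpenMP threads or MPI ranks, and this sketch is illustrative rather than the benchmark code:

```python
from collections import deque

def bfs_levels(adjacency, source):
    """Level-synchronous BFS, the Graph500 kernel: visit all vertices at
    distance d before any vertex at distance d + 1."""
    parent = {source: source}
    frontier = deque([source])
    while frontier:
        next_frontier = deque()
        for u in frontier:
            for v in adjacency.get(u, []):
                if v not in parent:        # irregular, data-dependent access
                    parent[v] = u
                    next_frontier.append(v)
        frontier = next_frontier
    return parent

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs_levels(graph, 0))   # parent of each reached vertex
```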

Keywords: graph computation, graph500 benchmark, parallel architectures, parallel programming, workload characterization

Procedia PDF Downloads 111
25437 Multimodal Characterization of Emotion within Multimedia Space

Authors: Dayo Samuel Banjo, Connice Trimmingham, Niloofar Yousefi, Nitin Agarwal

Abstract:

Technological advancement and its omnipresent connectivity have pushed humans past the boundaries and limitations of a computer screen, physical state, or geographical location. It has provided a depth of avenues that facilitate human-computer interaction that was once inconceivable, such as audio and body language detection. Given the complex modalities of emotions, it becomes vital to study human-computer interaction, as it is the commencement of a thorough understanding of the emotional state of users and, in the context of social networks, of the producers of multimodal information. This study first acknowledges the higher classification accuracy found in multimodal emotion detection systems compared to unimodal solutions. Second, it explores the characterization of multimedia content produced based on its emotions and the coherence of emotion across different modalities, utilizing deep learning models to classify emotion across those modalities.

Keywords: affective computing, deep learning, emotion recognition, multimodal

Procedia PDF Downloads 115
25436 Study of the Late Phase of Core Degradation during Reflooding by Safety Injection System for VVER1000 with ASTECv2 Computer Code

Authors: Antoaneta Stefanova, Rositsa Gencheva, Pavlin Groudev

Abstract:

This paper presents the modeling approach for an SBO sequence in VVER-1000 reactors, describes the reactor core behavior in the late in-vessel phase in case of late reflooding by the HPIS, and gives preliminary results for the ASTECv2 validation. The work is focused on the investigation of plant behavior during a total loss of power and the associated operator actions. The main goal of these analyses is to assess the phenomena arising during a station blackout (SBO) followed by primary-side high pressure injection system (HPIS) reflooding of an already damaged reactor core at a very late in-vessel phase. The purpose of the analysis is to determine how later switching on of the HPIS can delay the time of vessel failure or possibly avoid vessel failure altogether. For this purpose, an SBO scenario has been simulated with injection of cold water by a high pressure pump (HPP) into the cold leg at different stages of core degradation. The times for HPP injection were chosen based on previously performed investigations.

Keywords: VVER, operator action validation, reflooding of overheated reactor core, ASTEC computer code

Procedia PDF Downloads 391
25435 Effect of Filter Paper Technique in Measuring Hydraulic Capacity of Unsaturated Expansive Soil

Authors: Kenechi Kurtis Onochie

Abstract:

This paper presents the use of the filter paper technique in the measurement of matric suction of unsaturated expansive soil around the Haspolat region of Lefkosa, North Cyprus, in order to establish the soil water characteristic curve (SWCC), also called the soil water retention curve (SWRC). The dry filter paper approach, standardized as ASTM D 5298-03 (2003), in which the filter paper is initially dry, was adopted, using Whatman No. 42 filter paper for the matric suction measurement. The maximum dry density of the soil was obtained as 2.66 kg/cm³ and the optimum moisture content as 21%. The soil was found to have a high air entry value of 1847.46 kPa, indicating finer particles, and a hydraulic capacity of 25% using the filter paper technique. The filter paper technique proved to be very useful for measuring the hydraulic capacity of unsaturated expansive soil.

Keywords: SWCC, matric suction, filter paper, expansive soil

Procedia PDF Downloads 140
25434 Analysis of Secondary School Students' Perceptions about Information Technologies through a Word Association Test

Authors: Fetah Eren, Ismail Sahin, Ismail Celik, Ahmet Oguz Akturk

Abstract:

The aim of this study is to discover secondary school students’ perceptions of information technologies and the connections between concepts in their cognitive structures. A word association test consisting of six concepts related to information technologies is used to collect data from 244 secondary school students. Concept maps that present the students’ cognitive structures are drawn with the help of frequency data, and the data are analyzed and interpreted according to the connections obtained from the concept maps. Of the given concepts, the students associate most with computer, Internet, and communication, and least with computer-assisted education and information technologies. These results show that the concepts Internet, communication, and computer are an important part of the students’ cognitive structures. In addition, the students mostly give computer, phone, game, Internet, and Facebook as the key concepts. These answers show that the students regard information technologies as a means of entertainment and a free-time activity, not as a means of education.

Keywords: word association test, cognitive structure, information technology, secondary school

Procedia PDF Downloads 382
25433 Deployment of Information and Communication Technology (ICT) to Reduce Occurrences of Terrorism in Nigeria

Authors: Okike Benjamin

Abstract:

Terrorism is the use of violence and threats to intimidate or coerce a person, group, society, or even a government, especially for political purposes. Terrorism may be a way of resisting government by a group that feels marginalized, or a way of expressing displeasure over the activities of government. On 26th December 2009, the US listed Nigeria as a terrorist nation. Recently, the occurrences of terrorism in Nigeria have increased considerably. In Jos, Plateau State, Nigeria, a bomb blast claimed many lives on the eve of Christmas 2010. Similarly, there was another bomb blast in the Mogadishu (Sani Abacha) Barracks Mammy market on the eve of the 2011 New Year. For some time now, it has no longer been news that bombs explode in some northern parts of Nigeria. About 25 years ago, stopping terrorism in America relied on old-fashioned tools such as strict physical security at vulnerable places, intelligence gathering by government agents or individuals, vigilance on the part of all citizens, and a sense of community in which citizens do what they can to protect each other. Just as technology has been used to improve the way many other things are done, this powerful new weapon called computer technology can be used to detect and prevent terrorism, not only in Nigeria but all over the world. This paper examines the possible causes and effects of bomb blasts, which are acts of terrorism, and suggests ways in which Explosive Detection Devices (EDDs) and computer software technology could be deployed to reduce the occurrences of terrorism in Nigeria. This became necessary with the abduction of over 200 schoolgirls in Chibok, Borno State, from their hostel by members of the Boko Haram sect on 14th April 2014. Presently, Barack Obama and other world leaders have sent some of their military personnel to help rescue those innocent schoolgirls, whose offence is simply seeking to acquire a western education, which the sect strongly believes is forbidden.

Keywords: terrorism, bomb blast, computer technology, explosive detection devices, Nigeria

Procedia PDF Downloads 239
25432 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance

Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan

Abstract:

A computational model of affect that can distinguish between spontaneous and posed smiles with no errors on a large, popular data set using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of recurrent neural network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small amounts of training videos (70 instances). Important features were analyzed to further understand the success of the computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments of a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, one which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
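
A minimal sketch of an LSTM classifier over per-frame facial features is shown below; the feature dimension (68 landmark points as x, y pairs), hidden size, and clip length are assumptions, not the configuration used in the paper:

```python
import torch
import torch.nn as nn

class SmileLSTM(nn.Module):
    """Classify a sequence of per-frame facial-landmark features as a
    spontaneous (1) or posed (0) smile using the last LSTM hidden state."""

    def __init__(self, feature_dim=136, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, x):                 # x: (batch, frames, feature_dim)
        _, (h_n, _) = self.lstm(x)
        return self.classifier(h_n[-1])   # logits for the two classes

model = SmileLSTM()
clip = torch.randn(8, 90, 136)            # 8 clips, 90 frames, 68 landmarks as (x, y)
print(model(clip).shape)                  # torch.Size([8, 2])
```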

Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection

Procedia PDF Downloads 101
25431 Effects of Computer Aided Instructional Package on Performance and Retention of Genetic Concepts amongst Secondary School Students in Niger State, Nigeria

Authors: Muhammad R. Bello, Mamman A. Wasagu, Yahya M. Kamar

Abstract:

The study investigated the effects of a computer-aided instructional package (CAIP) on the performance and retention of genetic concepts among secondary school students in Niger State. A quasi-experimental research design, i.e., pre-test/post-test with experimental and control groups, was adopted for the study. The population of the study was all senior secondary school three (SS3) students offering biology. A sample of 223 students was randomly drawn from six purposively selected secondary schools. The researchers' computer-aided instructional package (CAIP) on genetic concepts was used as the treatment instrument for the experimental group, while the control group was exposed to the conventional lecture method (CLM). The instrument for data collection was a Genetic Performance Test (GEPET) comprising 50 multiple-choice questions validated by science educators. A reliability coefficient of 0.92 was obtained for GEPET using the Pearson Product Moment Correlation (PPMC). The data collected were analyzed using the IBM SPSS Version 20 package for computation of means, standard deviations, t-tests, and analysis of covariance (ANCOVA). The analysis (Fcal (220) = 27.147, P < 0.05) shows that students who received instruction with the CAIP outperformed the students who received instruction with the CLM and also had higher retention. The findings also revealed no significant difference in performance and retention between male and female students (tcal (103) = -1.429, P > 0.05). It was recommended, among other things, that teachers should use computer-aided instructional packages in teaching genetic concepts in order to improve students' performance and retention in the biology subject.

Keywords: computer aided instructional package, performance, retention, genetic concepts, senior secondary school students

Procedia PDF Downloads 334
25430 ISME: Integrated Style Motion Editor for 3D Humanoid Character

Authors: Ismahafezi Ismail, Mohd Shahrizal Sunar

Abstract:

The motion of a realistic 3D humanoid character is very important, especially for industries developing computer animations and games. However, this type of motion involves very complex, high-dimensional data covering body position, orientation, and joint rotation. The Integrated Style Motion Editor (ISME) is a method used to alter the 3D humanoid motion capture data utilised in computer animation and games development. This study was therefore carried out with the purpose of demonstrating a method that is able to manipulate and deform different motion styles by integrating the Key Pose Deformation Technique and the Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike previous methods, our method produces a realistic humanoid motion style in real time.

Keywords: computer animation, humanoid motion, motion capture, motion editing

Procedia PDF Downloads 356