Search results for: applied computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9018

8808 Optimization of Robot Motion Planning Using Biogeography Based Optimization (Bbo)

Authors: Jaber Nikpouri, Arsalan Amralizadeh

Abstract:

For robotic manipulators, the trajectory should be optimal so that the torque of the robot can be minimized in order to save power. This paper presents an optimal path planning scheme for a robotic manipulator. Recently, techniques based on the metaheuristics of natural computing, mainly evolutionary algorithms (EA), have been successfully applied to a large number of robotic applications. In this paper, an improved BBO algorithm is used to minimize the objective function in the presence of different obstacles. The simulation results show that the proposed optimal path planning method has satisfactory performance.
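
The abstract does not include the authors' implementation; as a rough illustration only, the following is a minimal sketch of the standard BBO migration-and-mutation loop applied to a generic objective function. The objective, bounds, population size, and rates below are placeholder assumptions, not the authors' settings.

import numpy as np

def bbo_minimize(objective, bounds, n_habitats=30, n_iter=200,
                 mutation_prob=0.02, seed=0):
    """Minimal biogeography-based optimization (BBO) sketch.

    objective: callable mapping a parameter vector to a scalar cost
    bounds: sequence of (low, high) pairs, one per decision variable
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_habitats, dim))

    for _ in range(n_iter):
        cost = np.array([objective(h) for h in pop])
        pop = pop[np.argsort(cost)]                    # best habitat first
        ranks = np.arange(n_habitats)
        emigration = 1.0 - ranks / (n_habitats - 1)    # good habitats share features
        immigration = 1.0 - emigration                 # poor habitats accept features

        new_pop = pop.copy()
        for i in range(n_habitats):
            for d in range(dim):
                if rng.random() < immigration[i]:
                    # pick a source habitat with probability proportional to emigration
                    src = rng.choice(n_habitats, p=emigration / emigration.sum())
                    new_pop[i, d] = pop[src, d]
                if rng.random() < mutation_prob:
                    new_pop[i, d] = rng.uniform(bounds[d, 0], bounds[d, 1])
        new_pop[0] = pop[0]                            # elitism: keep the best habitat
        pop = np.clip(new_pop, bounds[:, 0], bounds[:, 1])

    cost = np.array([objective(h) for h in pop])
    return pop[np.argmin(cost)], cost.min()

# Example: minimize a placeholder "torque" surrogate over three joint parameters.
best, best_cost = bbo_minimize(lambda q: np.sum(q**2), bounds=[(-3.14, 3.14)] * 3)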

Keywords: biogeography-based optimization, path planning, obstacle detection, robotic manipulator

Procedia PDF Downloads 267
8807 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes

Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar

Abstract:

Although most existing skyline query algorithms have focused on querying static points in static databases, the growing number of sensors, wireless communications, and mobile applications has increased the demand for continuous skyline queries. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, since skyline computation is based on checking the dominance of skyline points over all dimensions, both static and dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points that cannot be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to update the skyline result efficiently (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance, and efficiency of our algorithm over other existing approaches.
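
For readers unfamiliar with skyline semantics, the snippet below is a minimal sketch of the dominance test and a naive (non-continuous) skyline computation; it is not the authors' pruning-based algorithm, only the baseline such algorithms improve on, and it assumes smaller values are preferred in every dimension.

def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive O(n^2) skyline over the full attribute vector (static and dynamic alike)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# Each tuple mixes a static attribute (e.g., price) and a dynamic one (e.g., current distance).
points = [(100, 2.5), (80, 4.0), (120, 1.0), (80, 2.5)]
print(skyline(points))   # -> [(120, 1.0), (80, 2.5)]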

Keywords: continuous query processing, dynamic database, moving object, skyline queries

Procedia PDF Downloads 191
8806 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

Brain-computer interfaces belong to a research line of computer science that involves the study of human-computer interaction (HCI) and seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. Affective computing research, in turn, applies human emotions to the HCI process, helping to reduce user frustration. This paper presents the results obtained during the hardware and software development of a brain-computer interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware covers the sensing stage and analog-to-digital conversion. The interface software comprises algorithms for pre-processing the signal in time- and frequency-domain analysis and for classifying the patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately using a publicly available database, and a comparison among classifiers was carried out to determine the best-performing one.
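
The specific classifiers compared are not listed in the abstract. The sketch below only illustrates how such a comparison is typically run, under the assumption of pre-extracted EEG feature vectors and a few standard scikit-learn classifiers; the data here are synthetic placeholders, not the public database used by the authors.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

# Placeholder data: rows are feature vectors extracted from EEG epochs
# (e.g., band powers), labels are discrete emotion classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.integers(0, 3, size=200)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, clf in classifiers.items():
    scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f} (std {scores.std():.2f})")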

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 357
8805 An Analysis of Uncoupled Designs in Chicken Egg

Authors: Pratap Sriram Sundar, Chandan Chowdhury, Sagar Kamarthi

Abstract:

Nature has perfected her designs over 3.5 billion years of evolution. Research fields such as biomimicry, biomimetics, bionics, bio-inspired computing, and nature-inspired design have explored nature-made artifacts and systems to understand nature’s mechanisms and intelligence. Learning from nature, researchers have generated sustainable designs and innovations in a variety of fields such as energy, architecture, agriculture, transportation, communication, and medicine. Axiomatic design offers a method to judge whether a design is good. This paper analyzes design aspects of one of nature’s amazing objects: the chicken egg. The functional requirements (FRs) of the egg’s components are tabulated and mapped onto nature-chosen design parameters (DPs). The ‘independence axiom’ of the axiomatic design methodology is applied to analyze couplings and to evaluate whether the egg’s design is good (i.e., uncoupled) or bad (i.e., coupled). The analysis revealed that the egg’s design is a good, i.e., uncoupled, design. This approach can be applied to any of nature’s artifacts to judge whether their design is good or bad, which makes it valuable for biomimicry studies. It can also be a very useful teaching aid for design considerations in biology and bio-inspired innovation.
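
In axiomatic design, the independence axiom is checked through the structure of the FR-DP design matrix: a diagonal matrix indicates an uncoupled design, a triangular matrix a decoupled design, and anything else a coupled design. A minimal sketch of that check follows; the matrix shown is purely illustrative and is not the paper's actual FR-DP mapping for the egg.

import numpy as np

def classify_design(A, tol=0.0):
    """Classify an axiomatic-design matrix as uncoupled, decoupled, or coupled."""
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.all(np.abs(off_diag) <= tol):
        return "uncoupled"
    if np.all(np.abs(np.triu(A, k=1)) <= tol) or np.all(np.abs(np.tril(A, k=-1)) <= tol):
        return "decoupled"
    return "coupled"

# Illustrative only: a nonzero entry marks that a design parameter affects a functional requirement.
A = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]
print(classify_design(A))   # -> "uncoupled"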

Keywords: uncoupled design, axiomatic design, nature design, design evaluation

Procedia PDF Downloads 143
8804 Semirings of Graphs: An Approach Towards the Algebra of Graphs

Authors: Gete Umbrey, Saifur Rahman

Abstract:

Graphs are found to be highly capable in computing, and their abstract structures have been applied in specific computations and algorithms such as phase encoding controllers, processor microcontrollers, and the synthesis of CMOS switching networks. Motivated by these works, we develop an independent approach to study semiring structures and various properties by defining binary operations which, in fact, seem analogous to existing definitions in some sense but follow a different approach. This work focuses specifically on the construction of semigroup and semiring structures on the set of undirected graphs, and their properties are investigated therein. It is expected that the investigation done here may have some interesting applications in theoretical computer science, networking, and decision making, as well as in the joining of two network systems.
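
The binary operations in question are the usual union and join of graphs. A minimal sketch of both on plain vertex/edge sets is given below for orientation; the semiring axioms themselves, and the paper's specific construction, are not reproduced here.

def union(g1, g2):
    """Union of two graphs given as (vertices, edges), with edges as frozensets."""
    v1, e1 = g1
    v2, e2 = g2
    return v1 | v2, e1 | e2

def join(g1, g2):
    """Join: the union plus an edge between every vertex of g1 and every vertex of g2."""
    v1, e1 = g1
    v2, e2 = g2
    cross = {frozenset((a, b)) for a in v1 for b in v2 if a != b}
    return v1 | v2, e1 | e2 | cross

G = ({1, 2}, {frozenset((1, 2))})
H = ({3}, set())
print(union(G, H))   # ({1, 2, 3}, {frozenset({1, 2})})
print(join(G, H))    # union plus the edges {1, 3} and {2, 3}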

Keywords: graphs, join and union of graphs, semiring, weighted graphs

Procedia PDF Downloads 114
8803 Robot Spatial Reasoning via 3D Models

Authors: John Allard, Alex Rich, Iris Aguilar, Zachary Dodds

Abstract:

With this paper we present several experiences deploying novel, low-cost resources for computing with 3D spatial models. Certainly, computing with 3D models undergirds some of our field’s most important contributions to the human experience. Most often, those are contrived artifacts. This work extends that tradition by focusing on novel resources that deliver uncontrived models of a system’s current surroundings. Atop this new capability, we present several projects investigating the student-accessibility of the computational tools for reasoning about the 3D space around us. We conclude that, with current scaffolding, real-world 3D models are now an accessible and viable foundation for creative computational work.

Keywords: 3D vision, matterport model, real-world 3D models, mathematical and computational methods

Procedia PDF Downloads 510
8802 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of these tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on R-trees and the domain-range entropy, is proposed to improve fault tolerance and load balancing by taking into account connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 457
8801 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build trustworthy and reliable online or application systems capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management, and public key encryption have been used to manage domains, provide an access control and authorization mechanism, and secure data exchange between entities for a reliable file-processing procedure.
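
The abstract does not specify the encryption scheme beyond "public key encryption". Purely as an illustration of that building block, the snippet below encrypts a small payload with RSA-OAEP using the Python cryptography package; in a real deployment, applicant files would typically be encrypted with a hybrid scheme, with only a symmetric key protected this way.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Illustrative scenario: the agency holds the private key; applicants encrypt with the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(b"applicant file key (illustrative)", oaep)
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"applicant file key (illustrative)"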

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 327
8800 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPU) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to develop efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, the characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
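
The computational core described here is the repeated solution of sparse linear systems arising from unstructured finite element meshes. The snippet below is only a minimal CPU-side illustration of that core with SciPy on a toy stiffness-like matrix; the GPU variants and the actual resin infusion model discussed in the paper are not reproduced.

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

# Toy 1D stiffness-like matrix assembled in compressed sparse row (CSR) form.
n = 5
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
rows = list(range(n)) + list(range(n - 1)) + list(range(1, n))
cols = list(range(n)) + list(range(1, n)) + list(range(n - 1))
vals = np.concatenate([main, off, off])
K = csr_matrix((vals, (rows, cols)), shape=(n, n))

f = np.ones(n)          # load vector
u = spsolve(K, f)       # nodal solution of K u = f
print(u)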

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 332
8799 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural phenomena. In this paper, we study how positional alterations of phrases affect and/or modify the impact of a natural language proposition. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations. We derive the classification and modification of the meaning of words due to positional alteration and present the results with reference to set-theoretic interpretations.

Keywords: appositive, computing with words, possibilistic relational universal fuzzy (PRUF), semantic sentiment analysis, set-theoretic interpretations

Procedia PDF Downloads 127
8798 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies that the world deals with, in different sectors with different purposes and capabilities. The cloud faces various challenges in securing data from unauthorized access or modification; consequently, security risks and levels have greatly increased. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure that data are kept secret and safe from any disclosure or exploit. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, and identity and access management (IAM). This paper reviews and explores various access controls implemented in a cloud environment that achieve different security purposes. The methodology followed in this survey was to conduct an assessment, evaluation, and comparison of those access control mechanisms and technologies based on different factors, such as the security goals they achieve, usability, and cost-effectiveness. This assessment showed that the technology used in an access control affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison can help decision-makers choose the access controls that properly meet their requirements.
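
Role-based access control is one of the mechanisms typically covered by such surveys. As a minimal illustration only, not tied to any particular CSP's API, a role-to-permission check can look like this.

# Minimal role-based access control (RBAC) sketch with illustrative roles and users.
ROLE_PERMISSIONS = {
    "tenant_admin": {"read", "write", "manage_keys"},
    "developer": {"read", "write"},
    "auditor": {"read"},
}
USER_ROLES = {"alice": {"developer"}, "bob": {"auditor"}}

def is_authorized(user, permission):
    """Grant access if any of the user's roles carries the requested permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_authorized("alice", "write"))        # True
print(is_authorized("bob", "manage_keys"))    # False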

Keywords: access controls, cloud computing, confidentiality, identity and access management

Procedia PDF Downloads 102
8797 Building a Hierarchical, Granular Knowledge Cube

Authors: Alexander Denzler, Marcel Wehrle, Andreas Meier

Abstract:

A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. By applying the concept of granular computing to a knowledge base, several advantages emerge. These can be harnessed by applications to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that manages to cope with different data types and detect knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of the three layers of the storing, representing, and structuring of knowledge, is described. Finally, benefits as well as challenges of deploying it are listed alongside application types that could profit from having such an enhanced knowledge base.

Keywords: granular computing, granular knowledge, hierarchical structuring, knowledge bases

Procedia PDF Downloads 466
8796 Proposed Anticipating Learning Classifier System for Cloud Intrusion Detection (ALCS-CID)

Authors: Wafa' Slaibi Alsharafat

Abstract:

Cloud computing is a modern approach to the network environment. With the increasing number of network users and online systems, there is a need to protect these systems from unauthorized resource access and to detect any attempt at privacy contravention. For that purpose, an intrusion detection system is an effective security mechanism for detecting attacks on cloud resources and their information. In this paper, a cloud intrusion detection system is proposed with the aim of reducing or eliminating such attacks. The model is concerned with achieving a high detection rate after conducting a set of experiments using the benchmark dataset KDD'99.
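
Detection rate is the main evaluation metric mentioned. A minimal sketch of how it (and the complementary false alarm rate) is computed from labelled predictions on a dataset such as KDD'99 follows; the labels and predictions below are placeholders, not the paper's results.

def detection_metrics(y_true, y_pred, attack_label=1):
    """Detection rate = detected attacks / all attacks;
    false alarm rate = normal records flagged as attacks / all normal records."""
    attacks = [(t, p) for t, p in zip(y_true, y_pred) if t == attack_label]
    normals = [(t, p) for t, p in zip(y_true, y_pred) if t != attack_label]
    dr = sum(p == attack_label for _, p in attacks) / max(len(attacks), 1)
    far = sum(p == attack_label for _, p in normals) / max(len(normals), 1)
    return dr, far

y_true = [1, 1, 0, 0, 1, 0]   # 1 = attack, 0 = normal (placeholder labels)
y_pred = [1, 0, 0, 1, 1, 0]
print(detection_metrics(y_true, y_pred))   # (0.666..., 0.333...)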

Keywords: IDS, cloud computing, anticipating classifier system, intrusion detection

Procedia PDF Downloads 446
8795 KBASE Technological Framework - Requirements

Authors: Ivan Stanev, Maria Koleva

Abstract:

Automated software development issues are addressed in this paper. Layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on service-oriented architecture, cloud computing, knowledge-based automated software engineering (KBASE), and the Method of automated programming. Tools of seven leading companies (AWS of Amazon, Azure of Microsoft, App Engine of Google, vCloud of VMware, Bluemix of IBM, Helion of HP, OCPaaS of Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.

Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture

Procedia PDF Downloads 268
8794 Geomechanical Technologies for Assessing Three-Dimensional Stability of Underground Excavations Utilizing Remote-Sensing, Finite Element Analysis, and Scientific Visualization

Authors: Kwang Chun, John Kemeny

Abstract:

Light detection and ranging (LiDAR) has been a prevalent remote-sensing technology applied in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for the generation of a three-dimensional numerical model that can be used in a geotechnical stability analysis such as FEM or DEM. To date, however, straightforward techniques for reconstructing the numerical model from the scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating all the various processes, from LiDAR scanning to finite element numerical analysis. The study focuses on converting LiDAR 3D point clouds of geologic structures containing complex surface geometries into a finite element model. This methodology has been applied to Kartchner Caverns in Arizona, where detailed underground and surface point clouds can be used for the analysis of underground stability. Numerical simulations were performed using the finite element code Abaqus and visualized with the 3D scientific visualization tool ParaView. The results are useful in studying the stability of all types of underground excavations, including underground mining and tunneling.

Keywords: finite element analysis, LiDAR, remote-sensing, scientific visualization, underground stability

Procedia PDF Downloads 136
8793 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum-electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
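
The abstract describes a signs-only variant of compressed row storage, where only the sign of each nonzero is kept alongside the usual row-pointer and column-index arrays. The sketch below illustrates that idea on a small matrix; it is not the authors' parallel implementation, and the example matrix is purely illustrative.

import numpy as np

def to_signs_only_csr(dense):
    """Store a sparse matrix as CSR structure plus one sign per nonzero.

    This is suitable when the nonzero magnitudes are implied by the model
    (e.g., hopping amplitudes of a Hubbard Hamiltonian) and only their signs
    need to be stored explicitly.
    """
    dense = np.asarray(dense)
    row_ptr, col_idx, signs = [0], [], []
    for row in dense:
        nz = np.nonzero(row)[0]
        col_idx.extend(nz.tolist())
        signs.extend(np.sign(row[nz]).astype(np.int8).tolist())
        row_ptr.append(len(col_idx))
    return np.array(row_ptr), np.array(col_idx), np.array(signs, dtype=np.int8)

H = [[0, -1, 0],
     [-1, 0, 1],
     [0, 1, 0]]
print(to_signs_only_csr(H))   # row_ptr [0 1 3 4], col_idx [1 0 2 1], signs [-1 -1 1 1]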

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 365
8792 Cybersecurity Awareness Among Applied Sciences Student Population

Authors: Sanja Bracun, Nikolina Kasunic

Abstract:

After graduation, the student population of applied sciences will become a population of employees in IT expert positions or 'just' business users of certain IT technologies, for whom the level of awareness of existing cybersecurity risks is extremely important. The results of this research define the current cybersecurity awareness level of students at Zagreb University of Applied Sciences (TVZ), which can be useful not only for teaching staff to form a curriculum related to cybersecurity more accurately but also for employers to know what to expect from their future employees regarding cybersecurity awareness.

Keywords: student population cybersecurity awareness, cybersecurity awareness, cybersecurity, applied sciences students

Procedia PDF Downloads 218
8791 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

Grids of computing nodes have emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purpose of computing and distributed storage. Since fault tolerance becomes complex due to the availability of resources in a decentralized grid environment, it can be combined with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the Omnet++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.

Keywords: data grids, fault tolerance, clustering, chandy-lamport

Procedia PDF Downloads 302
8790 Innovation in PhD Training in the Interdisciplinary Research Institute

Authors: B. Shaw, K. Doherty

Abstract:

The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute including art, design, media production, communication studies, computing and engineering. Across these disciplines it can seem like there are enormous differences in research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within often unacknowledged histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding of research practice across disciplines difficult. To explore this, a one-day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multi-disciplinary context. Instead of presenting results at a conference, research students were tasked with articulating their method of inquiry. A working party of students from across disciplines had to design a conference call, a visual identity, and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings or only described method briefly. It took several weeks of supported intervention for research students to get ‘inside’ their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts, the conference committee generated key methodological categories for conference sessions, including sampling, capturing ‘experience’, ‘making models’, researcher identities, and ‘constructing data’. Each session involved presentations by visual artists, communications students, and computing researchers, with interdisciplinary dialogue facilitated by alumni chairs. The apparently simple focus on method illuminated the research process as a site of creativity, innovation and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research and demonstrated that honest reportage of research, faults and all, is good, valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by communications and art and design students, and art and design students gained understanding from the greater ‘distance’ and emphasis on application that computing students applied to their subjects.
Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.

Keywords: interdisciplinary, method, research student, training

Procedia PDF Downloads 180
8789 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion

Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You

Abstract:

To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold form tapping of internal threads, on the basis of similarity in the process. A model is established based on an analysis of process planning. A case representation and a similarity computation method are given. A confidence degree is used to evaluate each case, and a rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application; the case study shows that the design results obtained with the proposed method are effective.
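
The abstract specifies a similarity-based retrieval step but not its formula. Below is a minimal sketch using a weighted nearest-neighbour similarity over process attributes; the attribute names, weights, scales, and cases are purely illustrative assumptions, not the authors' model.

def similarity(query, case, weights):
    """Weighted similarity in [0, 1]; numeric attributes use a scale-normalized distance."""
    total, score = 0.0, 0.0
    for attr, (weight, scale) in weights.items():
        total += weight
        diff = abs(query[attr] - case[attr]) / scale
        score += weight * max(0.0, 1.0 - diff)
    return score / total

def retrieve(query, case_base, weights):
    """Return the stored case most similar to the query."""
    return max(case_base, key=lambda c: similarity(query, c["features"], weights))

# Illustrative attributes for cold form tapping of internal threads.
weights = {"thread_diameter_mm": (0.5, 10.0), "pitch_mm": (0.3, 2.0), "hardness_hb": (0.2, 100.0)}
case_base = [
    {"features": {"thread_diameter_mm": 8, "pitch_mm": 1.25, "hardness_hb": 180}, "plan": "plan A"},
    {"features": {"thread_diameter_mm": 12, "pitch_mm": 1.75, "hardness_hb": 300}, "plan": "plan B"},
]
query = {"thread_diameter_mm": 10, "pitch_mm": 1.5, "hardness_hb": 200}
print(retrieve(query, case_base, weights)["plan"])   # -> "plan A"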

Keywords: case-based reasoning, internal thread, cold extrusion, process planning

Procedia PDF Downloads 479
8788 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameter retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises either model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the previous model’s speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real- or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.
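
The ARcGAN architecture itself is not given in the abstract. The snippet below is only a generic conditional-GAN skeleton in PyTorch, with a fully connected stand-in for the convolutional layers, conditioning a generator of atmospheric parameters on an observed spectrum; all dimensions, layer sizes, and the random stand-in data are arbitrary assumptions.

import torch
import torch.nn as nn

SPEC_DIM, PARAM_DIM, NOISE_DIM = 64, 8, 16   # arbitrary illustrative sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SPEC_DIM + NOISE_DIM, 128), nn.ReLU(),
            nn.Linear(128, PARAM_DIM),
        )
    def forward(self, spectrum, noise):
        return self.net(torch.cat([spectrum, noise], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SPEC_DIM + PARAM_DIM, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )
    def forward(self, spectrum, params):
        return self.net(torch.cat([spectrum, params], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

# One illustrative training step on random stand-in data.
spectrum = torch.randn(32, SPEC_DIM)
true_params = torch.randn(32, PARAM_DIM)
noise = torch.randn(32, NOISE_DIM)

fake_params = G(spectrum, noise)
d_loss = bce(D(spectrum, true_params), torch.ones(32, 1)) + \
         bce(D(spectrum, fake_params.detach()), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

g_loss = bce(D(spectrum, fake_params), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()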

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 145
8787 A Review on Applications of Experts Systems in Medical Sciences

Authors: D. K. Sreekantha, T. M. Girish, R. H. Fattepur

Abstract:

In this article, we have given an overview of medical expert systems, which can be used for the developed of physicians in making decisions such as appropriate, prognostic, and therapeutic decisions which help to organize, store, and gives appropriate medical knowledge needed by physicians and practitioners during medical operations or further treatment. If they support the studies by using these systems, advanced tools in medicine will be developed in the future. New trends in the methodology of development of medical expert systems have also been discussed in this paper. So Authors would like to develop an innovative IT based solution to help doctors in rural areas to gain expertise in Medical Science for treating patients. This paper aims to survey the Soft Computing techniques in treating patient’s problems used throughout the world.

Keywords: expert system, fuzzy logic, knowledge base, soft computing, epilepsy

Procedia PDF Downloads 234
8786 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area and its prominent features, and the significant problems and issues that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
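
As a concrete illustration of the neural-network side of such predictive models, and not of any specific system from the surveyed literature, a minimal windowed LSTM price predictor in Keras might look as follows; the price series here is a random placeholder.

import numpy as np
import tensorflow as tf

def make_windows(prices, window=30):
    """Turn a 1D price series into (samples, window, 1) inputs and next-step targets."""
    X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
    y = np.array(prices[window:])
    return X[..., None], y

prices = np.cumsum(np.random.randn(500)) + 100.0   # placeholder price series
X, y = make_windows(prices)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(30, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
next_price = model.predict(X[-1:], verbose=0)   # one-step-ahead forecast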

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 59
8785 Artificial Neurons Based on Memristors for Spiking Neural Networks

Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi

Abstract:

Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power consumption, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits and use it to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from an Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
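
For reference, the leaky integrate-and-fire behaviour the memristor emulates can be written down in a few lines. The sketch below uses arbitrary time constants and thresholds, not the measured TiO₂ device parameters.

import numpy as np

def lif_spikes(input_current, dt=1e-3, tau=20e-3, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: dv/dt = (-(v - v_rest) + I) / tau, spike and reset at threshold."""
    v, spikes, trace = v_rest, [], []
    for i in input_current:
        v += dt * (-(v - v_rest) + i) / tau
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
        trace.append(v)
    return np.array(spikes), np.array(trace)

spikes, trace = lif_spikes(np.full(200, 1.2))   # constant supra-threshold drive
print(spikes.sum(), "spikes in 200 steps")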

Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spiking-time-dependent-plasticity

Procedia PDF Downloads 96
8784 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed based on the four basic components of computational thinking: abstract thinking, logical thinking, modeling thinking, and constructive thinking. In this course, students are engaged in hands-on problem-solving activities using a new problem-solving model proposed in this paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking

Procedia PDF Downloads 417
8783 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only with strictly regulated rules, it cannot aid customers beyond those limitations. The application of machine learning (ML) techniques is therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning (ML) libraries, the models are created and evaluated using appropriate metrics. From the findings, random forest performs with the highest accuracy of 80.17% and was subsequently implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the top four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines the model development and the Django framework with deployment to the Alibaba cloud computing platform.
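
A minimal sketch of the model-comparison step described above, using the four scikit-learn estimators named in the abstract, is given below; the loan features and labels are random placeholders (the actual dataset is not specified), and the Django and Alibaba Cloud deployment steps are omitted.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Placeholder features (income, loan amount, credit history, ...) and repayment labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-nearest neighbour": KNeighborsClassifier(n_neighbors=7),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))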

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 100
8782 A New Protocol Ensuring Users' Privacy in Pervasive Environment

Authors: Mohammed Nadir Djedid, Abdallah Chouarfia

Abstract:

Transparency of the system and its integration into the natural environment of the user are some of the important features of pervasive computing. However, these characteristics, considered among the strongest points of pervasive systems, are also their weak points in terms of the user’s privacy. Privacy in pervasive systems involves more than the confidentiality of communications and concealing the identity of virtual users. The physical presence and behavior of the user in the pervasive space cannot be completely hidden and can reveal the secret of his/her identity and affect his/her privacy. This paper shows that the application of the major techniques for protecting the user’s privacy is still insufficient. A new solution named the Shadow Protocol is proposed, which allows users to authenticate and interact with the surrounding devices within a ubiquitous computing environment while preserving their privacy.

Keywords: pervasive systems, identification, authentication, privacy

Procedia PDF Downloads 447
8781 Investigating the Relationship between Bank and Cloud Provider

Authors: Hatim Elhag

Abstract:

Banking and financial service institutions are possibly the most advanced in terms of technology adoption and use it as a key differentiator. With high levels of business process automation, maturity in the functional portfolio, straight-through processing, and proven technology outsourcing benefits, the banking sector stands to benefit significantly from cloud computing capabilities. Additionally, with complex compliance and regulatory policies, combined with expansive product and geographic coverage, the business impact is even greater. While the benefits are considerable, there are also significant challenges in adopting this model, including legal, security, performance, reliability, transformation complexity, operating control and governance issues, and, most importantly, proof of the promised cost benefits. A new architecture should therefore be designed and implemented to align with this approach.

Keywords: security, cloud, banking sector, cloud computing

Procedia PDF Downloads 474
8780 Smart Structures for Cost Effective Cultural Heritage Preservation

Authors: Tamara Trček Pečak, Andrej Mohar, Denis Trček

Abstract:

This article investigates the latest technological means that deploy smart structures based on (advanced) wireless sensor technologies and ubiquitous computing in general in order to support decision making in cultural heritage preservation. Based on two years of in-field research experience, it analyses these technologies for such purposes and provides appropriate architectures and architectural solutions. Moreover, directions for future research are stated, because these technologies are currently the most promising ones for enabling cost-effective preservation of cultural heritage, not only in uncontrolled places but also in general.

Keywords: smart structures, wireless sensors, sensors networks, green computing, cultural heritage preservation, monitoring, cost effectiveness

Procedia PDF Downloads 417
8779 Molecular Dynamics Simulation on Nanoelectromechanical Graphene Nanoflake Shuttle Device

Authors: Eunae Lee, Oh-Kuen Kwon, Ki-Sub Kim, Jeong Won Kang

Abstract:

Using molecular dynamics simulations, we investigated the dynamic properties of a graphene-nanoribbon (GNR) memory encapsulating a graphene-nanoflake (GNF) shuttle, which has the potential to be applicable as a non-volatile random access memory. This work explicitly demonstrates that a GNR encapsulating a GNF shuttle can be applied to non-volatile memory. The potential well originated from the increase in the attractive van der Waals energy between the GNRs as the GNF approached their edges, so the bistable positions were located near the edges of the GNRs. Such a nanoelectromechanical non-volatile memory based on graphene is also applicable to the development of switches, sensors, and quantum computing.

Keywords: graphene nanoribbon, graphene nanoflake, shuttle memory, molecular dynamics

Procedia PDF Downloads 412