Search results for: computing methodologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1970

1730 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of these tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms designed for traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, do not work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing based on R-trees and domain-range entropy is proposed to improve fault tolerance and load balancing by taking into account connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 457
1729 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build trustworthy and reliable online or application systems capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management and public key encryption have been used to manage domains, provide an access control and authorization mechanism, and secure data exchange between entities for a reliable file-processing procedure.

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 325
1728 Architecture-Performance Relationship in GPU Computing: Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, whose characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 332
1727 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural phenomena. In this paper, we study the positional alteration of phrases, by which the impact of a natural language proposition is affected and/or modified. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations. We derive the classification and modification of the meaning of words due to positional alteration. We present the results with reference to set-theoretic interpretations.

Keywords: appositive, computing with words, possibilistic relational universal fuzzy (PRUF), semantic sentiment analysis, set-theoretic interpretations

Procedia PDF Downloads 127
1726 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies that the world deals with, in different sectors with different purposes and capabilities. The cloud faces various challenges in securing data from unauthorized access or modification. Consequently, security risks have greatly increased. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure that data are kept secret and safe from any disclosures or exploits. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, identity and access management (IAM), etc. This paper therefore reviews and explores various access controls implemented in a cloud environment that achieve different security purposes. The methodology followed in this survey was an assessment, evaluation, and comparison of these access control mechanisms and technologies based on different factors, such as the security goals achieved, usability, and cost-effectiveness. This assessment showed that the technology used in an access control affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison should help decision-makers properly choose the access controls that meet their requirements.

Keywords: access controls, cloud computing, confidentiality, identity and access management

Procedia PDF Downloads 102
1725 Building a Hierarchical, Granular Knowledge Cube

Authors: Alexander Denzler, Marcel Wehrle, Andreas Meier

Abstract:

A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. By applying the concept of granular computing to a knowledge base, several advantages emerge. These can be harnessed by applications to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that manages to cope with different data types and detect knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of the three layers of the storing, representing, and structuring of knowledge, is described. Finally, benefits as well as challenges of deploying it are listed alongside application types that could profit from having such an enhanced knowledge base.

Keywords: granular computing, granular knowledge, hierarchical structuring, knowledge bases

Procedia PDF Downloads 466
1724 Resilient Environments vs. Resilient Architects: Creativity, Practice and Education

Authors: Y. Perera, M. Pathiraja

Abstract:

Within the paradigm of 'Resilient Built-Environments,' in order for architecture to be resilient, 'Resilience' should be identified as an essential component of the architect's notion of creativity. In much simpler terms, a 'Resilient Built-Environment' should necessarily be a by-product of the 'Resilient Architect.' The inherent influence of individualistic notions of creativity upon practice will intensify the dichotomy between theory and practice unless the notion of 'Resilience' is identified as an integral component of the architect's notion of creativity. Analysing an architectural position is an ideal way of understanding an architect's notion of creativity; therefore, in exploring the notions of 'Resilience' and the 'Resilient Architect' within the Sri Lankan platform, the architectural positions of two renowned architects, Geoffrey Bawa and Valentine Gunasekara, were explored and analysed. The architectural positions of both architects asserted specific rules and methodologies adopted within the process of problem solving, which subsequently led to a traceable language/pattern within their architecture. The dominance of such rules within practice could be detrimental to the adoption of theories/notions such as 'Resilience' and to the formation of the 'Resilient Architect', unless the methodologies themselves are flexible and robust despite their rigidity, or the notion of 'Resilience' itself exists in the form of a methodological rule.

Keywords: architectural position, creativity, education, practice, resilience, theory

Procedia PDF Downloads 287
1723 Product Life Cycle Assessment of Generatively Designed Furniture for Interiors Using Robot Based Additive Manufacturing

Authors: Andrew Fox, Qingping Yang, Yuanhong Zhao, Tao Zhang

Abstract:

Furniture is a very significant subdivision of architecture and its inherent interior design activities. The furniture industry has developed from an artisan-driven craft industry, whose forerunners saw themselves manifested in their crafts and treasured a sense of pride in the creativity of their designs, into what is these days largely reduced to an anonymous, collective, mass-produced output. Although a very conservative industry, there is great potential for the implementation of collaborative digital technologies, allowing a reconfigured artisan experience to be reawakened in a new and exciting form. The furniture manufacturing industry, in general, has been slow to adopt new design methodologies using artificial intelligence and rule-based generative design. This tardiness has meant the loss of potential to enhance its capabilities in producing sustainable, flexible, and mass-customizable 'right first-time' designs. This paper aims to demonstrate a concept methodology for the creation of alternative and inspiring aesthetic structures for robot-based additive manufacturing (RBAM). These technologies can enable the economic creation of previously unachievable structures, which traditionally would not have been commercially economic to manufacture. The integration of these technologies with the computing power of generative design provides the tools for practitioners to create concepts well beyond the insight of even the most accomplished traditional design teams. The paper addresses the problem by introducing generative design methodologies employing the Autodesk Fusion 360 platform. Examination of alternative methods for its use has the potential to significantly reduce the estimated 80% contribution to environmental impact made at the initial design phase. Though predominantly a design methodology, generative design combined with RBAM has the potential to leverage many lean manufacturing and quality assurance benefits, enhancing the efficiency and agility of modern furniture manufacturing. Through a case study examination of a furniture artifact, the results will be compared to a traditionally designed and manufactured product employing the Ecochain Mobius product life cycle analysis (LCA) platform. This will highlight the benefits of both generative design and robot-based additive manufacturing from an environmental impact and manufacturing efficiency standpoint. These step changes in design methodology and environmental assessment have the potential to revolutionise the design-to-manufacturing workflow, giving momentum to the concept of a pre-industrial model of manufacturing, with the global demand for a circular economy and bespoke sustainable design at its heart.

Keywords: robot, manufacturing, generative design, sustainability, circular economy, product life cycle assessment, furniture

Procedia PDF Downloads 109
1722 Proposed Anticipating Learning Classifier System for Cloud Intrusion Detection (ALCS-CID)

Authors: Wafa' Slaibi Alsharafat

Abstract:

Cloud computing is a modern approach in network environments. Given the increasing number of network users and online systems, there is a need to keep these systems safe from unauthorized resource access and to detect any attempts at privacy contravention. For that purpose, an Intrusion Detection System (IDS) is an effective security mechanism for detecting attempted attacks on cloud resources and their information. In this paper, a Cloud Intrusion Detection System is proposed with the aim of reducing or eliminating such attacks. The model targets a high detection rate, evaluated through a set of experiments on the benchmark dataset KDD'99.
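
The detection rate mentioned above is conventionally the fraction of attack records correctly flagged, reported alongside the false-alarm rate in KDD'99-style experiments. A minimal Python sketch of these two metrics, assuming the label convention 1 = attack, 0 = normal (an assumption, not the paper's code):

```python
# Minimal sketch of the two figures usually reported for KDD'99 IDS experiments.
def ids_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    detection_rate = tp / (tp + fn)     # attacks correctly flagged
    false_alarm_rate = fp / (fp + tn)   # normal records wrongly flagged
    return detection_rate, false_alarm_rate

print(ids_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))  # (0.666..., 0.5)
```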

Keywords: IDS, cloud computing, anticipating classifier system, intrusion detection

Procedia PDF Downloads 446
1721 Challenges and Opportunities in Computing Logistics Cost in E-Commerce Supply Chain

Authors: Pramod Ghadge, Swadesh Srivastava

Abstract:

The revenue of a logistics company depends on how the logistics cost of a shipment is calculated. The logistics cost of a shipment is a function of the distance and speed of the shipment's travel in a particular network, its volumetric size, and its dead weight. Logistics billing is based mainly on the consumption of the scarce resource (the space or weight-carrying capacity of a carrier). A shipment's size or dead weight is a function of product and packaging weight, dimensions, and flexibility. Hence, to arrive at a standard methodology for computing an accurate cost to bill the customer, the interplay among the above-mentioned physical attributes, along with their measurement, plays a key role. This becomes even more complex for an e-commerce company, like Flipkart, which caters to shipments from both warehouses and marketplaces in an unorganized, non-standard market like India. In this paper, we explore various methodologies to define a standard way of billing non-standard shipments across a wide range of sizes, shapes, and dead weights. These include the use of historical volumetric/dead weight data to arrive at a factor that can be used to compute the logistics cost of a shipment, and the calculation of the real/contour volume of a shipment to address the problem of irregular shipment shapes, which cannot be solved by conventional bounding-box volume measurements. We also discuss certain key business practices and operational quality considerations needed to bring standardization and drive appropriate ownership in the ecosystem.
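
For context, the standard courier billing rule behind the volumetric/dead weight interplay described above can be sketched in a few lines of Python. The divisor of 5000 cm³/kg is an assumption (carriers typically use values between roughly 4000 and 6000), and the bounding-box volume used here is exactly what the paper's contour/real-volume methods aim to refine:

```python
# Chargeable weight is the larger of dead weight and volumetric weight.
# The divisor (5000 cm^3/kg) is an assumed, carrier-dependent constant.
def chargeable_weight(length_cm, width_cm, height_cm, dead_weight_kg,
                      divisor=5000):
    volumetric_kg = (length_cm * width_cm * height_cm) / divisor
    return max(dead_weight_kg, volumetric_kg)

# A light but bulky parcel is billed on volume, not on the scale reading:
print(chargeable_weight(60, 40, 40, dead_weight_kg=3.0))  # 19.2 kg
```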

Keywords: contour volume, logistics, real volume, volumetric weight

Procedia PDF Downloads 235
1720 The Revealed Preference Methods in Economic Valuation of Environmental Goods: A Review

Authors: Sara Sousa

Abstract:

Environmental goods and services have often been neglected in crucial decisions affecting the environment, mainly because of the difficulty in estimating their economic value, since we are dealing with non-market goods that have no associated price. Nevertheless, the absence of prices does not necessarily mean these goods have no value. The environment is a key element in today's society, which seeks to be as sustainable as possible, and environmental assets have both use and non-use values. To estimate the use value, researchers may apply the revealed preference methods. This paper provides a theoretical review of the main concepts and methodologies in the economic valuation of the environment, with particular emphasis on revealed preference techniques. Based on a detailed literature review, this study concludes that, despite some inherent limitations, the revealed preference methodologies – travel cost, hedonic price, and averting behaviour – represent essential tools for researchers who accept the challenge of estimating the use value of environmental goods and services based on actual individuals' behaviour. The main purpose of this study is to contribute to increased theoretical information on the economic valuation of environmental assets, allowing researchers and policymakers to improve future decisions regarding the environment.

Keywords: economic valuation, environmental goods, revealed preference methods, total economic value

Procedia PDF Downloads 101
1719 Quality Assurance in Cardiac Disorder Detection Images

Authors: Anam Naveed, Asma Andleeb, Mehreen Sirshar

Abstract:

In this article, image processing techniques applied to cardiac images for enhancing image quality are surveyed. Two types of methodologies are considered: invasive techniques and non-invasive techniques. Different image processing methods for improving cardiac image quality and reducing the amount of radiation exposure in invasive techniques are explored. Different image processing algorithms for enhancing non-invasive cardiac image quality are described. Besides these two methodologies, a third methodology applied to live streaming of the heart rate in an ECG window, for extracting necessary information, removing noise, and enhancing quality, is covered. Sensitivity analyses have been carried out to investigate the impact of cardiac image quality on the diagnosis of cardiac artery disease and how enhancement of the images will help the cardiologist to diagnose disease. The paper evaluates the strengths and weaknesses of the different techniques applied to improve image quality and draws a conclusion. Some specific limitations apply to the whole survey: the patient's heart rate must be 70-75 beats/minute during angiography, and patient weight and radiation exposure amounts are similarly constrained.
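
For a concrete sense of the generic enhancement steps such surveys cover, here is a minimal Python sketch combining median-filter denoising with histogram equalization; the 3x3 filter size and the synthetic frame are assumptions, not the specific algorithms evaluated in the paper:

```python
import numpy as np
from scipy import ndimage

# Minimal sketch: suppress speckle noise, then stretch contrast by remapping
# intensities through the normalized cumulative histogram.
def enhance(frame):
    denoised = ndimage.median_filter(frame, size=3)        # suppress speckle
    hist, _ = np.histogram(denoised, bins=256, range=(0, 255))
    cdf = hist.cumsum() / hist.sum()                       # normalized CDF
    return (cdf[denoised] * 255).astype(np.uint8)          # remap intensities

frame = (np.random.rand(256, 256) * 255).astype(np.uint8)  # stand-in image
print(enhance(frame).shape)                                # (256, 256)
```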

Keywords: cardiac images, CT angiography, critical analysis, exposure radiation, invasive techniques, non-invasive techniques

Procedia PDF Downloads 317
1718 KBASE Technological Framework - Requirements

Authors: Ivan Stanev, Maria Koleva

Abstract:

Automated software development issues are addressed in this paper. Layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on Service Oriented Architecture, cloud computing, Knowledge Based Automated Software Engineering (KBASE) and the method of automated programming. Tools of seven leading companies (AWS of Amazon, Azure of Microsoft, App Engine of Google, vCloud of VMware, Bluemix of IBM, Helion of HP, OCPaaS of Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.

Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture

Procedia PDF Downloads 268
1717 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

This paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm contains a storage format for efficiently computing eigenvalues and eigenvectors of a quantum electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
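
A minimal Python sketch of the underlying idea, assuming (as in Hubbard-type Hamiltonians) that the stored off-diagonal entries share a single magnitude t, so the usual compressed-row-storage value array can shrink to one sign per nonzero; the 3x3 matrix and t = 0.5 are illustrative only, not the paper's data layout:

```python
import numpy as np

row_ptr = np.array([0, 2, 3, 4])        # row i spans row_ptr[i]:row_ptr[i+1]
col_idx = np.array([0, 2, 1, 0])        # column of each stored nonzero
signs   = np.array([+1, -1, +1, -1])    # a sign bit replaces a float value
t = 0.5                                 # shared magnitude (assumed)

def spmv(x):
    """Sparse matrix-vector product using signs-only CSR storage."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += signs[k] * t * x[col_idx[k]]
    return y

print(spmv(np.array([1.0, 2.0, 3.0])))  # [-1.0, 1.0, -0.5]
```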

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 365
1716 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Authors: Sarantos Psycharis

Abstract:

Physical Computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach 'transversal/cross-cutting ideas' in a STEM content approach. LabVIEW and Arduino were used in order to connect the physical world with real data in the framework of the so-called Computational Experiment. Tertiary prospective engineering educators were engaged during their course, and Computational Thinking (CT) concepts were registered before and after the intervention across didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the Computational Pedagogy model. Results show a significant change in students' responses regarding self-efficacy for CT before and after the instruction. Results also indicate a significant relation between the responses on the different CT concepts/practices. According to the findings, STEM content epistemology combined with Physical Computing should be a good candidate as a learning and teaching approach in university settings that enhances students' engagement in CT concepts/practices.

Keywords: Arduino, computational thinking, computer programming, LabVIEW, self-efficacy, STEM

Procedia PDF Downloads 86
1715 The Use of Correlation Difference for the Prediction of Leakage in Pipeline Networks

Authors: Mabel Usunobun Olanipekun, Henry Ogbemudia Omoregbee

Abstract:

Anomalies such as leakages and bursts in water, hydraulic, or petrochemical pipeline networks have significant implications for economic conditions and the environment. To ensure that pipeline systems are reliable, they must be efficiently controlled. Wireless Sensor Networks (WSNs) have become a powerful tool in critical infrastructure monitoring systems for water, oil and gas pipelines. The loss of water, oil and gas is inevitable and is strongly linked to financial costs and environmental problems, and its avoidance often leads to savings of economic resources. Substantial repair costs and the loss of precious natural resources are part of the financial impact of leaking pipes. Pipeline systems experts have implemented various methodologies in recent decades to identify and locate leakages in water, oil and gas supply networks, including, among others, the use of acoustic sensors, measurements, statistical analysis, etc. The issue of leak quantification is to estimate, given some observations about a network, the size and location of one or more leaks in a water pipeline network. In detecting background leakage, however, there is greater uncertainty in using these methodologies, since their output is not so reliable. In this work, we present a scalable concept and simulation in which a pressure-driven model (PDM) was used to determine water pipeline leakage in a network. Pressure data were collected with acoustic sensors located at node points a predetermined distance apart. Using the correlation difference, we were able to determine the leakage point, locally introduced at a predetermined point between two consecutive nodes and causing a substantial pressure difference in the pipeline network. After de-noising the signals from the sensors at the nodes, we successfully obtained the exact point where we had introduced the local leakage, using the correlation difference model we developed.
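
A minimal Python sketch of correlation-based localization between two sensors, assuming a known propagation speed and synthetic leak noise; the numerical values are illustrative, not the paper's experimental setup. With sensors a distance L apart and a leak at position x, the arrival-time difference is tau = (L - 2x)/v, so x = (L - v*tau)/2:

```python
import numpy as np

L, v, fs = 100.0, 1000.0, 2000       # m, m/s, Hz (assumed values)
leak_pos = 30.0                      # ground truth used to build the demo
delay = (L - 2 * leak_pos) / v       # arrival-time difference in seconds

rng = np.random.default_rng(0)
noise = rng.standard_normal(fs)      # one second of simulated leak noise
s1 = noise                           # sensor nearer the leak
s2 = np.roll(noise, int(round(delay * fs)))  # far sensor hears it later

corr = np.correlate(s2, s1, mode='full')     # cross-correlation
tau = (np.argmax(corr) - (len(s1) - 1)) / fs # lag of the correlation peak
print((L - v * tau) / 2)                     # estimated leak position ~ 30 m
```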

Keywords: leakage detection, acoustic signals, pipeline network, correlation, wireless sensor networks (WSNs)

Procedia PDF Downloads 59
1714 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available for conceiving the improvements that make a process competitive, for example total quality management, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are numerous and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent in that, when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for that process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments done with a set of students show that the reference models allow them to conceive more improvements than students who do not use these models.
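
A minimal Python sketch of the fuzzy-cognitive-map core of such a reference model, with three hypothetical concepts and an assumed weight matrix; the objective function and training on high-performance firms described above would sit on top of this update rule:

```python
import numpy as np

# Hypothetical concepts: 0 = training effort, 1 = defect rate,
# 2 = process competitiveness. W[i, j] is the assumed signed influence
# of concept i on concept j.
def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

W = np.array([[0.0, -0.6,  0.0],    # training reduces defects
              [0.0,  0.0, -0.8],    # defects reduce competitiveness
              [0.0,  0.0,  0.0]])
a = np.array([0.9, 0.7, 0.3])       # current activation of each concept

for _ in range(20):                 # iterate the map toward a fixed point
    a = sigmoid(a @ W + a)          # common FCM rule with a self-memory term
print(a.round(3))                   # stabilized performance indexes
```

In the paper's construction, an objective function would then penalize the gap between these stabilized indexes and the stakeholders' desired ones, guiding the choice of improvements.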

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 56
1713 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

Grids of computing nodes have emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to the availability of resources in a decentralized grid environment, checkpointing can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the Omnet++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
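
For context, a minimal Python sketch of the classic Chandy-Lamport snapshot algorithm that the keywords point to, simulated over two processes with FIFO channels; the paper's hierarchical, cluster-based protocol builds on this basic mechanism, and all values here are illustrative:

```python
from collections import deque

channels = {('P', 'Q'): deque(), ('Q', 'P'): deque()}
state = {'P': 10, 'Q': 20}             # application state: 30 tokens in total
snapshot, chan_rec = {}, {}            # recorded states / in-transit messages
recording = {}
marker_seen = {'P': False, 'Q': False}

def take_snapshot(p):
    """Record p's state, start recording incoming channels, flood markers."""
    marker_seen[p] = True
    snapshot[p] = state[p]
    for (a, b) in channels:
        if b == p:
            recording[(a, b)] = True
            chan_rec[(a, b)] = []
        if a == p:
            channels[(a, b)].append('MARKER')

def deliver(src, dst):
    msg = channels[(src, dst)].popleft()
    if msg == 'MARKER':
        if not marker_seen[dst]:
            take_snapshot(dst)             # first marker: record state now
        recording[(src, dst)] = False      # this channel's record is complete
    elif recording.get((src, dst)):
        chan_rec[(src, dst)].append(msg)   # message in transit at snapshot time
    else:
        state[dst] += msg                  # ordinary application message

channels[('P', 'Q')].append(5); state['P'] -= 5  # P sends 5 tokens to Q,
take_snapshot('P')                               # then initiates the snapshot
deliver('P', 'Q')   # Q receives the 5 before any marker
deliver('P', 'Q')   # marker arrives: Q records its state (25)
deliver('Q', 'P')   # Q's marker closes the channel back to P
print(snapshot, chan_rec)  # {'P': 5, 'Q': 25} + empty channels = 30 tokens
```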

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 302
1712 Implementing Green IT Practices in Non-IT Industries in Sri Lanka: Contemplating the Feasibility and Methods to Ensure Sustainability

Authors: Manuela Nayantara Jeyaraj

Abstract:

Green IT is a term that refers to the collective strategic and tactical practices that directly reduce the carbon footprint of an establishment's computing procedures. This concept has been tightly knit with IT-related organizations, and hence it has rarely been applied within non-IT organizations in Sri Lanka. With the turn of the century, computing technologies have taken over commonplace activities in every nook and corner of Sri Lanka, which is still on the verge of moving forth in its march towards being a developed country. Hence, it needs to be shown that non-IT industries are equally bound to adhere to 'Green IT' practices, in order to reduce their carbon footprint and move towards considering the practicality of implementing Green IT practices within their workflows. There are several spheres that need to be taken into account in creating awareness of 'Green IT', such as the economic gap, the technologies available, legislative bounds, the community mindset, and many more. This paper explores the causes that currently restrain non-IT organizations from considering Green IT concepts. By doing so, it is expected to demonstrate the benefits gained by implementing this concept within an organization. The ultimate goal is to propose feasible 'Green IT' practices that could be implemented within the context of Sri Lankan non-IT sectors in order to ensure organizations' sustainable growth towards long-term existence.

Keywords: computing practices, Green IT, non-IT industries, Sri Lanka, sustainability

Procedia PDF Downloads 225
1711 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameters retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods, therefore, compromise model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the previous model’s speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge computing chip implementation, ArcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.
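
A minimal PyTorch sketch of a conditional-GAN retrieval step of the kind described; fully connected layers stand in for the paper's convolutional architecture, and all dimensions (a 100-bin spectrum conditioning 5 retrieved parameters) are assumptions:

```python
import torch
import torch.nn as nn

SPEC_DIM, PARAM_DIM, NOISE_DIM = 100, 5, 16   # assumed sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SPEC_DIM + NOISE_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, PARAM_DIM))
    def forward(self, spectrum, z):            # condition on the spectrum
        return self.net(torch.cat([spectrum, z], dim=-1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SPEC_DIM + PARAM_DIM, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1), nn.Sigmoid())
    def forward(self, spectrum, params):       # judge (spectrum, params) pairs
        return self.net(torch.cat([spectrum, params], dim=-1))

# After training, one retrieval is a single forward pass rather than a long
# Bayesian sampling loop; drawing many z values approximates a posterior.
g = Generator()
spectrum = torch.randn(1, SPEC_DIM)                # mock observed spectrum
params = g(spectrum, torch.randn(1, NOISE_DIM))    # sampled parameters
print(params.shape)                                # torch.Size([1, 5])
```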

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 145
1710 The Core Obstacles of Continuous Improvement Implementation: Some Key Findings from Health and Education Sectors

Authors: Abdullah Alhaqbani

Abstract:

Purpose: Implementing continuous improvement is a challenge that public sector organisations face in becoming successful. Many obstacles hinder public organisations from successfully implementing continuous improvement. This paper aims to highlight the key core obstacles that public organisations face in implementing continuous improvement programmes. Approach: This paper reviews 66 papers published between 2000 and 2013 that focused on the concept of continuous improvement and improvement methodologies in the context of public sector organisations. The methodologies for continuous improvement covered in these papers include Total Quality Management, Six Sigma, process re-engineering, lean thinking and Kaizen. Findings: Of the 24 obstacles found in the literature, 11 were seen as core barriers that frequently occur in public sector organisations. The findings indicate that lack of top management commitment, organisational culture, political issues, and resistance to change are significant obstacles for improvement programmes. Moreover, this review found that the improvement methodologies share some core barriers to successful implementation within public organisations. These barriers are also common across different geographic areas. For instance, the lack of top management commitment and training found in the education sector in Albania is a common barrier in improvement studies in Kuwait, Saudi Arabia, Spain, the UK and the US. Practical implications: Understanding these core issues and barriers will help managers of public organisations to improve their strategies with respect to continuous improvement. Thus, this review highlights the core issues that prevent a successful continuous improvement journey within the public sector. Value: Identifying and understanding the common obstacles to successfully implementing continuous improvement in the public sector will help public organisations to learn how to launch and successfully sustain such programmes. However, this is not the end; rather, it is just the beginning of a longer improvement journey. Thus, it is intended that this review will identify key learning opportunities for public sector organisations in developing nations, which will then be tested via further research.

Keywords: continuous improvement, total quality management, obstacles, public sector

Procedia PDF Downloads 305
1709 A Review on Applications of Experts Systems in Medical Sciences

Authors: D. K. Sreekantha, T. M. Girish, R. H. Fattepur

Abstract:

In this article, we give an overview of medical expert systems, which can support physicians in making decisions, such as appropriate, prognostic, and therapeutic decisions, and which help to organize, store, and provide the medical knowledge needed by physicians and practitioners during medical operations or further treatment. If such studies continue to be supported by these systems, advanced tools in medicine will be developed in the future. New trends in the methodology of development of medical expert systems are also discussed in this paper. The authors would like to develop an innovative IT-based solution to help doctors in rural areas gain expertise in medical science for treating patients. This paper aims to survey the soft computing techniques used throughout the world in treating patients' problems.

Keywords: expert system, fuzzy logic, knowledge base, soft computing, epilepsy

Procedia PDF Downloads 233
1708 Knowledge Management as Tool for Environmental Management System Implementation in Higher Education Institutions

Authors: Natalia Marulanda Grisales

Abstract:

The most significant changes in the characteristics of consumers have contributed to the development and adoption of methodologies and tools that enable organizations to be more competitive in the marketplace. One of these methodologies is the integration of Knowledge Management (KM) phases and Environmental Management Systems (EMS). This integration allows companies to manage and share the knowledge required for EMS adoption, from the place where it is generated to the place where it is going to be exploited. The aim of this paper is to identify the relationship between KM phases as a tool for the adoption of EMS in HEIs. The methodology has a descriptive scope and a qualitative approach. It is based on a case study and a review of the literature on KM and EMS. We surveyed 266 students, professors and staff at Minuto de Dios University (Colombia). Data derived from the study indicate that if an HEI wants to achieve adequate knowledge acquisition and knowledge transfer, it must have clear goals for implementing an EMS. HEIs should also create empowerment and training spaces for students, professors and staff. In the case study, the HEI must generate alternatives that enhance spaces of knowledge appropriation. It was found that 85% of respondents had not received any training from the HEI about EMS, and 88% of respondents believe that the actions taken by the university are not effective for transferring the knowledge needed to develop an EMS.

Keywords: environmental management systems, higher education institutions, knowledge management, training

Procedia PDF Downloads 346
1707 Artificial Neurons Based on Memristors for Spiking Neural Networks

Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi

Abstract:

Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits, used to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
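
A minimal Python sketch of the leaky integrate-and-fire dynamics the TS memristor emulates in hardware; all constants are assumed for illustration, not device measurements:

```python
import numpy as np

# The membrane potential integrates input current, leaks toward rest, and
# fires (then resets) on crossing a threshold -- the behavior the volatile
# threshold-switching device reproduces physically.
def lif(current, dt=1e-3, tau=20e-3, v_rest=0.0, v_th=1.0, v_reset=0.0):
    v, spikes = v_rest, []
    for i_t in current:
        v += dt * (-(v - v_rest) / tau + i_t)  # leaky integration step
        if v >= v_th:                          # threshold-switching event
            spikes.append(True)
            v = v_reset                        # device relaxes after the spike
        else:
            spikes.append(False)
    return spikes

spikes = lif(np.full(100, 60.0))               # constant drive for 100 ms
print(sum(spikes), "spikes in 100 ms")         # regular spike train
```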

Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spike-timing-dependent plasticity

Procedia PDF Downloads 96
1706 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed based on the four basic components of computational thinking: abstract thinking, logical thinking, modeling thinking and constructive thinking. In this course, students are engaged in hands-on problem solving activities using a new problem solving model proposed in this paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modeling thinking

Procedia PDF Downloads 417
1705 Incorporating Adult Learners’ Interests into Learning Styles: Enhancing Education for Lifelong Learners

Authors: Christie DeGregorio

Abstract:

In today's rapidly evolving educational landscape, adult learners are becoming an increasingly significant demographic. These individuals often possess a wealth of life experiences and diverse interests that can greatly influence their learning styles. Recognizing and incorporating these interests into educational practices can lead to enhanced engagement, motivation, and overall learning outcomes for adult learners. This essay aims to explore the significance of incorporating adult learners' interests into learning styles and provide an overview of the methodologies used in related studies. When investigating the incorporation of adult learners' interests into learning styles, researchers have employed various methodologies to gather valuable insights. These methodologies include surveys, interviews, case studies, and classroom observations. Surveys and interviews allow researchers to collect self-reported data directly from adult learners, providing valuable insights into their interests, preferences, and learning styles. Case studies offer an in-depth exploration of individual adult learners, highlighting how their interests can be integrated into personalized learning experiences. Classroom observations provide researchers with a firsthand understanding of the dynamics between adult learners' interests and their engagement within a learning environment. The major findings from studies exploring the incorporation of adult learners' interests into learning styles reveal the transformative impact of this approach. Firstly, aligning educational content with adult learners' interests increases their motivation and engagement in the learning process. By connecting new knowledge and skills to topics they are passionate about, adult learners become active participants in their own education. Secondly, integrating interests into learning styles fosters a sense of relevance and applicability. Adult learners can see the direct connection between the knowledge they acquire and its real-world applications, which enhances their ability to transfer learning to various contexts. Lastly, personalized learning experiences tailored to individual interests enable adult learners to take ownership of their educational journey, promoting lifelong learning habits and self-directedness.

Keywords: integration, personalization, transferability, learning style

Procedia PDF Downloads 42
1704 Mastering Digital Transformation with the Strategy Tandem Innovation Inside-Out/Outside-In: An Approach to Drive New Business Models, Services and Products in the Digital Age

Authors: S. N. Susenburger, D. Boecker

Abstract:

In the age of Volatility, Uncertainty, Complexity, and Ambiguity (VUCA), where digital transformation is challenging long-standing traditional hardware and manufacturing companies, innovation needs a different methodology, strategy, mindset, and culture. What used to be a mindset of scaling by quantity is now shifting to orchestrating ecosystems, platform business models, and service bundles. While large corporations are trying to mimic the nimbleness and versatile mindset of startups at the core of their digital strategies, they are at the frontier of facing one of the largest organizational and cultural changes in history. This paper elaborates on how a manufacturing giant transformed its Corporate Information Technology (IT) to enable digital and Internet of Things (IoT) business while establishing the mindset and approaches of the Innovation Inside-Out/Outside-In Strategy. It gives insights into the core elements of an innovation culture and the tactics and methodologies leveraged to support the cultural shift and transformation into an IoT company. This paper also outlines how the persona of the 'Connected Engineer' thrives in the digital innovation environment. Further, it explores how tapping domain-focused ecosystems in vibrant, innovative cities can be used as part of the strategy to facilitate partner co-innovation. Findings from several use cases, observations, and surveys led to conclusions about the strategy tandem of Innovation Inside-Out/Outside-In. The findings indicate that it is crucial in which phases and at what maturity level the Innovation Inside-Out/Outside-In Strategy is activated: cultural aspects of the business and the regional ecosystem need to be considered, as well as cultural readiness of management and active contributors. The 'not invented here' syndrome is a barrier in large corporations that needs to be addressed and managed to successfully drive partnerships, embrace co-innovation, and shift the mindset away from physical products toward new business models, services, and IoT platforms. This paper elaborates on various methodologies and approaches tested in different countries and cultures, including the U.S., Brazil, Mexico, and Germany.

Keywords: innovation management, innovation culture, innovation methodologies, digital transformation

Procedia PDF Downloads 108
1703 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates

Authors: Serge B. Provost

Abstract:

Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that, given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first n associated moments contain precisely the same amount of information. However, it is efficient to make use of a limited number of initial moments, as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first one expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second one assumes that the derivative of the logarithm of a density function can be represented as a rational function; this gives rise to a system of linear equations involving sample moments, and the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to modeling 'big data' as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed-form expressions that are amenable to algebraic manipulations. They also turn out to be more accurate, as will be shown in several illustrative examples.
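
A minimal Python sketch of the first technique, using a standard normal base density and a degree-4 polynomial adjustment whose coefficients come from equating moments; the sample, base density, and degree are assumptions for illustration. Matching moment k of the estimate to sample moment m_k gives the linear system sum_j c_j * mu_(k+j) = m_k, where the mu_i are the base-density moments:

```python
import numpy as np

rng = np.random.default_rng(1)
sample = 1.3 * rng.standard_normal(500)            # data to be modeled
deg = 4                                            # polynomial degree (assumed)
m = [np.mean(sample**k) for k in range(deg + 1)]   # sample moments m_0..m_4

def normal_moment(i):                              # E[Z^i] for Z ~ N(0, 1)
    return 0.0 if i % 2 else float(np.prod(np.arange(i - 1, 0, -2)))

A = np.array([[normal_moment(k + j) for j in range(deg + 1)]
              for k in range(deg + 1)])            # Hankel matrix of mu's
c = np.linalg.solve(A, m)                          # adjustment coefficients

def density_estimate(x):
    base = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    return base * np.polyval(c[::-1], x)           # base density * polynomial

print(density_estimate(0.0))                       # estimate at the origin
```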

Keywords: density estimation, log-density, polynomial adjustments, sample moments

Procedia PDF Downloads 129
1702 Symbolic Partial Differential Equations Analysis Using Mathematica

Authors: Davit Shahnazaryan, Diogo Gomes, Mher Safaryan

Abstract:

Many symbolic computations and manipulations required in the analysis of partial differential equations (PDEs) or systems of PDEs are tedious and error-prone. These computations arise when determining conservation laws, entropies or integral identities, which are essential tools for the study of PDEs. Here, we discuss a new Mathematica package for the symbolic analysis of PDEs that automates multiple tasks, saving time and effort. Methodologies: During the research, we used concepts of linear algebra and partial differential equations, and worked on creating algorithms based on theoretical mathematics to obtain the results mentioned below. Major Findings: Our package provides the following functionalities: finding the symmetry group of different PDE systems; generating polynomials invariant with respect to different symmetry groups; simplifying integral quantities by integration by parts and null Lagrangian cleaning; computing general forms of expressions by integration by parts; finding equivalent forms of an integral expression that are simpler or more symmetric; and determining necessary and sufficient conditions on the coefficients for the positivity of a given symbolic expression. Conclusion: Using this package, we can simplify integral identities and find conserved and dissipated quantities of time-dependent PDEs or systems of PDEs. Some examples in the theory of mean-field games and semiconductor equations are discussed.
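
A rough Python/SymPy analogue of one manipulation the package automates, simplifying an integral quantity by integration by parts, here for the heat equation rather than the package's own Mathematica syntax: for u_t = u_xx, the integrand of d/dt ∫ u² dx is 2·u·u_xx, which splits into a total derivative (a boundary term) plus the dissipation -2·(u_x)²:

```python
import sympy as sp

x, t = sp.symbols('x t')
u = sp.Function('u')(x, t)

integrand = 2 * u * sp.diff(u, x, 2)           # 2 u u_xx, after using the PDE
boundary = sp.diff(2 * u * sp.diff(u, x), x)   # d/dx(2 u u_x): vanishes at infinity
print(sp.simplify(integrand - boundary))       # -2*Derivative(u(x, t), x)**2
```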

Keywords: partial differential equations, symbolic computation, conserved and dissipated quantities, mathematica

Procedia PDF Downloads 131
1701 The Assessment of Forest Wood Biomass Potential in Terms of Sustainable Development

Authors: Julija Konstantinavičienė, Vlada Vitunskienė

Abstract:

The role of sustainable biomass, including wood biomass, is becoming more important because of the European Green Deal. The new EU Forest Strategy is a flagship element of the European Green Deal and a key action of the EU Biodiversity Strategy for 2030. The first measure of this strategy is promoting sustainable forest management, including encouraging the sustainable use of wood-based resources. The first aim of this research was to develop and present a new approach to the concept of forest wood biomass potential in terms of sustainable development, distinguishing theoretical, technical and sustainable potential and detailing their constraints. The second aim was to prepare a methodology outline for assessing sustainable forest wood biomass potential and to check this methodology empirically, considering economic, social and ecological constraints. The basic methodologies of the research are a review of prior research (combining semi-systematic and integrative review methodologies), the rapid assessment method, and statistical data analysis. The developed methodology for assessing forest wood potential in terms of sustainable development can be used in Lithuania and in other countries, and will let us compare this potential at different temporal and spatial levels. The application of the methodology will serve the development of new national strategies for the wood sector.

Keywords: assessment, constraints, forest wood biomass, methodology, potential, sustainability

Procedia PDF Downloads 87