Search results for: Model driven architecture
8054 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing
Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä
Abstract:
Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human errors. In this paper we present a system which automatically recognizes features from the CAD model of the sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product, which is then used as input for the sheet metal processing machine. The current implementation recognizes more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. The paper presents the developed system architecture, the applied algorithms, and the software implementation and testing.
Keywords: Feature recognition, automation, sheet metal manufacturing, CAM, CAD.
8053 Bayesian Belief Networks for Test Driven Development
Authors: Vijayalakshmy Periaswamy S., Kevin McDaid
Abstract:
Testing accounts for the major share of technical effort in the software development process; typically, it consumes more than 50 percent of the total cost of developing a piece of software. The selection of software tests is therefore a very important activity for ensuring that the software reliability requirements are met. Generally, tests are run to achieve maximum coverage of the software code, and very little attention is given to the achieved reliability of the software. Using an existing methodology, this paper describes how to use Bayesian Belief Networks (BBNs) to select unit tests based on their contribution to the reliability of the module under consideration. In particular, the work examines how the approach can enhance test-first development by assessing the quality of test suites resulting from this development methodology and providing insight into additional tests that can significantly improve the achieved reliability. In this way the method can produce an optimal selection of inputs, and the order in which the tests are executed, to maximize the software reliability. To illustrate the approach, a belief network is constructed for a modern software system, incorporating expert opinion, expressed through probabilities, on the relative quality of the elements of the software and the potential effectiveness of the software tests. The steps involved in constructing the Bayesian network are explained, as is a method to account for the test suite resulting from test-driven development.
Keywords: Software testing, Test Driven Development, Bayesian Belief Networks.
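The construction of the paper's actual belief network is not shown in the abstract; as a rough, hand-rolled illustration of the underlying idea, the sketch below applies Bayes' rule to update the belief that a module is fault-free as its unit tests pass. The prior, the per-test detection probabilities, and the test names are all invented for the example.

```python
# Minimal Bayes-update sketch: belief that a module is fault-free after its
# unit tests pass. Priors and detection probabilities are illustrative only.

def update_reliability(prior_ok: float, tests: list[tuple[str, float]]) -> float:
    """prior_ok: prior P(module fault-free).
    tests: (name, detection_prob) pairs, where detection_prob is the chance
    that the test fails IF the module contains a fault."""
    p_ok = prior_ok
    for name, detect in tests:
        # P(test passes | fault-free) = 1.0; P(test passes | faulty) = 1 - detect
        evidence = 1.0 * p_ok + (1.0 - detect) * (1.0 - p_ok)
        p_ok = (1.0 * p_ok) / evidence          # Bayes' rule, given the test passed
        print(f"after '{name}' passes: P(fault-free) = {p_ok:.3f}")
    return p_ok

update_reliability(0.6, [("test_parse_empty", 0.4),
                         ("test_parse_nested", 0.7),
                         ("test_roundtrip", 0.5)])
```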
8052 The Service Failure and Recovery in the Information Technology Services
Authors: Jun Luo, Weiguo Zhang, Dabin Qin
Abstract:
It is important to retain customer satisfaction in information technology services. When a service failure occurs, companies need to take service recovery actions to restore customer satisfaction. Although companies cannot avoid all problems and complaints, they should try to make up for them. Service failure and service recovery have therefore become an important and challenging issue for companies. In this paper, the literature and the problems in information technology services are reviewed. An integrated profit-driven model of service failure and service recovery is established that considers the benefit to both customer and enterprise. Moreover, the interaction between service failure and the service recovery strategy is studied, and the results verify the principles for matching the recovery strategy to the type of service failure. In addition, the relationship between the cost of service recovery and the customer's cumulative value of service after recovery is analyzed with the model. The results help managers decide on appropriate resource allocations for recovery strategies.
Keywords: Service failure, service recovery, information technology services.
8051 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling
Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar
Abstract:
Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenging. This study analyzes two scheduling algorithms, Fair-Share Scheduling (FSS) and Priority-Driven Preemptive Scheduling (PDPS), for addressing these issues and for energy-aware scheduling in real-time systems. Based on a review of both algorithms and of how they handle the two problems, it is found that FSS ensures a fair allocation of resources but degrades under an imbalanced system load, whereas PDPS prioritizes tasks based on criticality to meet timing constraints through preemption, but relies heavily on task prioritization and may not be energy efficient. Therefore, energy-aware improvements to both algorithms are proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization and resource allocation while meeting timing constraints.
Keywords: Energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints.
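As a minimal sketch of the priority-driven preemptive policy discussed above (not the paper's simulator), the following toy scheduler releases periodic jobs and always runs the highest-priority ready task at each tick; the task set and the assumption that idle slots are the energy-saving opportunity are illustrative only.

```python
# Toy fixed-priority preemptive scheduler over discrete time ticks.
# Task parameters (period, WCET, priority) are invented for illustration;
# a lower priority number means a higher priority.

TASKS = {          # name: (period, wcet, priority)
    "sensor":  (5, 1, 1),
    "control": (10, 3, 2),
    "logger":  (20, 4, 3),
}

def simulate(horizon=20):
    remaining = {name: 0 for name in TASKS}
    timeline = []
    for t in range(horizon):
        # Release a new job of each task at multiples of its period.
        for name, (period, wcet, _) in TASKS.items():
            if t % period == 0:
                remaining[name] = wcet
        # Preemptive choice: run the highest-priority task with work left.
        ready = [n for n, r in remaining.items() if r > 0]
        if ready:
            running = min(ready, key=lambda n: TASKS[n][2])
            remaining[running] -= 1
            timeline.append(running)
        else:
            timeline.append("idle")   # idle slots are where energy could be saved
    return timeline

print(simulate())
```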
8050 Implementation of TinyHash based on Hash Algorithm for Sensor Network
Authors: HangRok Lee, YongJe Choi, HoWon Kim
Abstract:
In recent years, several security architectures have been proposed for sensor networks [2], [4]. One of these, TinySec by Chris Karlof, Naveen Sastry, and David Wagner, proposed a link-layer security architecture that takes the constraints of sensor networks (i.e., energy, bandwidth, computation capability, etc.) into account. TinySec employs CBC-mode encryption and CBC-MAC authentication based on the Skipjack block cipher, and is currently incorporated in TinyOS for sensor network security. This paper introduces TinyHash, a module based on a general hash algorithm that replaces the authentication and integrity parts of TinySec; in other words, it applies a hash algorithm to the TinySec architecture. For compatibility with TinySec, the components of TinyHash are constructed with a structure similar to that of TinySec. TinyHash implements an HMAC component for authentication and a Digest component for message integrity. Additionally, we define several interfaces for the services associated with the hash algorithm.
Keywords: Sensor network security, nesC, TinySec, TinyOS, hash, HMAC, integrity.
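A minimal sketch of the HMAC construction that the TinyHash HMAC component is built around, using Python's standard hmac and hashlib modules; SHA-1 and the 16-byte key are stand-ins, since the abstract does not specify the hash function or key length, and TinySec itself uses Skipjack CBC-MAC.

```python
import hashlib
import hmac

# HMAC over a short sensor payload; SHA-1 stands in for "a general hash
# algorithm" -- the actual TinyHash components and key sizes are not shown here.
key = b"\x00" * 16            # link-layer key shared by the sensor nodes (illustrative)
payload = b"node=07;temp=21.4"

tag = hmac.new(key, payload, hashlib.sha1).digest()
print(tag.hex())

# Receiver side: recompute and compare in constant time.
ok = hmac.compare_digest(tag, hmac.new(key, payload, hashlib.sha1).digest())
print("authenticated:", ok)
```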
8049 Least Square-SVM Detector for Wireless BPSK in Multi-Environmental Noise
Authors: J. P. Dubois, Omar M. Abdul-Latif
Abstract:
The Support Vector Machine (SVM) is a statistical learning tool built on the concept of structural risk minimization (SRM). In this paper, SVM is applied to signal detection in communication systems in the presence of channel noise in various environments: Rayleigh fading, additive white Gaussian background noise (AWGN), and interference noise generalized as additive color Gaussian noise (ACGN). The structure and performance of the SVM in terms of the bit error rate (BER) metric are derived and simulated for these advanced stochastic noise models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to that of a conventional optimal model-based detector for binary signaling with binary phase shift keying (BPSK) modulation. We show that the SVM performance is superior to that of conventional matched filter-, innovation filter-, and Wiener filter-driven detectors, even in the presence of random Doppler carrier deviation, especially in the low SNR (signal-to-noise ratio) range. For large SNR, the performance of the SVM is similar to that of the classical detectors; however, the convergence between SVM and maximum likelihood detection occurs at a higher SNR as the noise environment becomes more hostile.
Keywords: Color noise, Doppler shift, innovation filter, least square-support vector machine, matched filter, Rayleigh fading, Wiener filter.
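A hedged sketch of the comparison described above: scikit-learn's SVC stands in for the least-squares SVM, the channel is reduced to BPSK over AWGN only (no Rayleigh fading, ACGN, or Doppler), and the SVM's BER is compared against a simple sign/matched-filter detector.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def bpsk_awgn(n, snr_db):
    bits = rng.integers(0, 2, n)
    symbols = 2.0 * bits - 1.0                     # BPSK mapping: 0 -> -1, 1 -> +1
    noise_std = 10 ** (-snr_db / 20.0)             # unit-energy symbols
    received = symbols + noise_std * rng.normal(size=n)
    return bits, received

# Train on a pilot block, test on fresh symbols (AWGN only; fading/ACGN omitted).
train_bits, train_rx = bpsk_awgn(2000, snr_db=3)
test_bits, test_rx = bpsk_awgn(20000, snr_db=3)

svm = SVC(kernel="rbf").fit(train_rx.reshape(-1, 1), train_bits)
svm_bits = svm.predict(test_rx.reshape(-1, 1))

threshold_bits = (test_rx > 0).astype(int)         # matched-filter/sign detector

print("SVM BER:      ", np.mean(svm_bits != test_bits))
print("Threshold BER:", np.mean(threshold_bits != test_bits))
```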
8048 Design of Orientation-Free Handler and Fuzzy Controller for Wire-Driven Heavy Object Lifting System
Authors: Bo-Wei Song, Yun-Jung Lee
Abstract:
This paper presents an intention interface and controller for a wire-driven heavy object lifting system that assists the operator in moving a heavy object. The handler is designed to allow a comfortable working posture for the operator. In addition, as a human-assistive system, the operator is involved in the control loop, where a fuzzy control system is used to account for human control characteristics. The effectiveness and performance of the proposed system are verified by experiments.
Keywords: Fuzzy controller, Handler design, Heavy object lifting system, Human-assistive device, Human-in-the-loop system.
8047 Research on a Forest Fire Spread Simulation Driven by the Wind Field in Complex Terrain
Authors: Ying Shang, Chencheng Wang
Abstract:
The wind field is the main driving factor for the spread of forest fires, so more detailed wind field data are needed for forest fire spread simulations to be more accurate. Since local topographic changes have an important impact on the wind field, this paper studies a fine-scale mountain wind field simulation method that couples WRF (Weather Research and Forecasting Model) with CFD (Computational Fluid Dynamics) to realize a numerical simulation of the wind field in a mountainous area at a 30 m scale with a small measurement error. Based on the Rothermel fire spread model, a forest fire in Idaho in the western United States was simulated, and comparison with the historical data showed that the simulation results had good accuracy. The results show that the fire spread rate decreases rapidly with time and then reaches a steady state. After reaching a steady state, the fire spread growth area is affected not only by the slope but also shows a significant quadratic positive correlation with the wind speed.
Keywords: Wind field, numerical simulation, forest fire spread, fire behavior model, complex terrain.
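The reported quadratic relationship between the fire-growth area and wind speed can be checked with an ordinary least-squares quadratic fit, as in the sketch below; the (wind speed, area) pairs are invented placeholders, not the paper's results.

```python
import numpy as np

# Invented (wind speed m/s, steady-state growth area ha) pairs for illustration only.
wind = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
area = np.array([1.1, 3.9, 9.2, 16.8, 26.5])

# Fit area ~ a*wind^2 + b*wind + c and report the goodness of fit.
a, b, c = np.polyfit(wind, area, deg=2)
pred = np.polyval([a, b, c], wind)
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
print(f"area ~ {a:.3f}*w^2 + {b:.3f}*w + {c:.3f},  R^2 = {1 - ss_res/ss_tot:.3f}")
```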
8046 An Intelligent Human-Computer Interaction System for Decision Support
Authors: Chee Siong Teh, Chee Peng Lim
Abstract:
This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process, such that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed, and the potential of the proposed architecture as a useful decision support system is demonstrated.
Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.
8045 A Materialized Approach to the Integration of XML Documents: the OSIX System
Authors: H. Ahmad, S. Kermanshahani, A. Simonet, M. Simonet
Abstract:
The data exchanged on the Web differ in nature from those handled by classical database management systems; they are called semi-structured data since they do not have the regular, static structure of data found in a relational database, and their schema is dynamic and may contain missing data or types. This raises the need to develop further techniques and algorithms to exploit and integrate such data and to extract the information relevant to the user. In this paper we present the OSIX system (Osiris-based System for Integration of XML Sources). The system has a data warehouse model designed for the integration of semi-structured data, and more precisely for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.
Keywords: Data integration, semi-structured data, views, XML.
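The Osiris view-based indexing cannot be reproduced from the abstract; the snippet below only illustrates the kind of semi-structured XML the paper targets, where elements of the same type need not share the same fields, queried with Python's standard ElementTree module.

```python
import xml.etree.ElementTree as ET

# Semi-structured sample: the two <book> elements do not share the same fields.
doc = ET.fromstring("""
<catalog>
  <book id="b1"><title>XML Integration</title><price>30</price></book>
  <book id="b2"><title>Views and Ontologies</title></book>
</catalog>
""")

for book in doc.findall("book"):
    price = book.findtext("price")          # may be missing: the schema is not fixed
    print(book.get("id"), book.findtext("title"), price if price else "price unknown")
```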
8044 Statistical Analysis-Driven Risk Assessment of Criteria Air Pollutants: A Sulfur Dioxide Case Study
Authors: Ehsan Bashiri
Abstract:
A 7-step method (with 25 sub-steps) for assessing the risk of air pollutants is introduced. The steps are: pre-considerations, sampling, statistical analysis, exposure matrix and likelihood, dose-response matrix and likelihood, total risk evaluation, and discussion of findings. All of these terms are well understood; however, almost all of the steps have been modified, improved, and coupled in such a way that a comprehensive method results. Accordingly, SADRA (Statistical Analysis-Driven Risk Assessment) emphasizes extensive and ongoing application of analytical statistics within traditional risk assessment models. A sulfur dioxide case study validates the claim and provides a good illustration of the method.
Keywords: Criteria air pollutants, matrix of risk, risk assessment, statistical analysis.
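The statistical-analysis step presumably includes fitting a distribution to measured concentrations and estimating exceedance likelihoods; the sketch below does this for an invented SO2 sample and an invented threshold using a lognormal fit in SciPy, and is not the SADRA procedure itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Invented hourly SO2 concentrations (ppb); real monitoring data would be used instead.
so2 = rng.lognormal(mean=2.0, sigma=0.5, size=500)

# Fit a lognormal with location fixed at zero and estimate exceedance of a threshold.
shape, loc, scale = stats.lognorm.fit(so2, floc=0)
threshold_ppb = 75.0                      # illustrative threshold, not a regulatory claim
p_exceed = stats.lognorm.sf(threshold_ppb, shape, loc=loc, scale=scale)

print(f"fitted sigma = {shape:.2f}, median = {scale:.1f} ppb")
print(f"P(SO2 > {threshold_ppb} ppb) = {p_exceed:.4f}")
```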
8043 Complexity Leadership and Knowledge Management in Higher Education
Authors: Prabhakar Venugopal Gantasala
Abstract:
Complex environments triggered by globalization have necessitated new paradigms of leadership, such as Complexity Leadership, which encompasses the multiple roles that leaders need to take on. The success of higher education institutions depends on how well leaders can provide adaptive, administrative and enabling leadership. Complexity Leadership seems all the more relevant for institutions that are knowledge-driven and thrive on knowledge creation, knowledge storage and retrieval, knowledge sharing and knowledge application. This paper discusses the elements of globalization and the opportunities and challenges it brings. The Complexity Leadership paradigm in a knowledge-based economy, and the need for such a paradigm shift in higher education institutions, are presented. Further, the paper discusses the support a leader requires in a knowledge-driven economy through knowledge management initiatives.
Keywords: Globalization, Complexity Leadership, Knowledge Management.
8042 Finite Element Analysis of Different Architectures for Bone Scaffold
Authors: Nimisha R. Shirbhate, Sanjay Bokade
Abstract:
Bone scaffolds are fundamental architectures or support structures that allow the regeneration of lost or damaged tissue, and they have been developed as a crucial tool in biomedical engineering. The structure of a bone scaffold plays an important role in treating bone defects; in particular, the pore size and shape help determine the behavior and strength of the scaffold. In this article, the fundamental aspects of bone scaffold design are first established. Second, the behavior of each bone scaffold architecture with the chosen biomaterials is discussed. Finally, a stress analysis is carried out for each structure. The aim of this study is to design a porous and mechanically strong bone regeneration scaffold that can be successfully manufactured. Four porous architectures of the bone scaffold were designed using the Rhinoceros solid modelling software. Each structural model consists of repeatable unit cells arranged in layers to fill the chosen scaffold volume. The mechanical behavior of the selected biocompatible material is studied with ANSYS 19.2 software, which also plays a significant role in predicting the strength of the defined structures, or three-dimensional models.
Keywords: Bone scaffold, stress analysis, porous structure, static loading.
8041 Decision Support System Based on Data Warehouse
Authors: Yang Bao, LuJing Zhang
Abstract:
A typical intelligent decision support system is built on four components: a data warehouse, online analytical processing, data mining, and model-based decision support; together these constitute a Decision Support System Based on Data Warehouse (DSSBDW). This approach takes ETL, OLAP and DM as its implementation means and integrates the traditional model-driven DSS and the data-driven DSS into a whole. This paper analyzes the DSSBDW architecture and the DW model, and discusses the following key issues: ETL design and realization; metadata management using XML; and SQL implementation, performance optimization, and data mapping in OLAP. Finally, it illustrates the design principles and methods of the DW in the DSSBDW.
Keywords: Decision Support System, Data Warehouse, Data Mining.
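A minimal sketch of the ETL-then-OLAP flow described above, using Python's built-in sqlite3 module as a stand-in warehouse; the table layout, column names, and source rows are invented for illustration.

```python
import sqlite3

# Extract: pretend these rows came from an operational source system.
source_rows = [("2024-01-05", "north", 120.0),
               ("2024-01-07", "north", 80.0),
               ("2024-02-02", "south", 200.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (month TEXT, region TEXT, amount REAL)")

# Transform + Load: derive the month key before loading into the fact table.
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?)",
    [(day[:7], region, amount) for day, region, amount in source_rows],
)

# OLAP-style roll-up over the month/region dimensions.
for row in conn.execute(
    "SELECT month, region, SUM(amount) FROM fact_sales GROUP BY month, region"
):
    print(row)
```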
8040 Direct Torque Control - DTC of Induction Motor Used for Piloting a Centrifugal Pump Supplied by a Photovoltaic Generator
Authors: S. Abouda, F. Nollet, A. Chaari, N. Essounbouli, Y. Koubaa
Abstract:
In this paper we study the control of a centrifugal pump driven by a three-phase induction motor that is supplied by a photovoltaic (PV) generator. The system includes a solar panel, a DC/DC converter equipped with its MPPT control, a three-phase pulse width modulation (PWM) voltage inverter, and a centrifugal pump driven by a three-phase induction motor. In order to control the flow of the centrifugal pump, direct torque control (DTC) of the induction machine is used. To illustrate the performance of the control, simulations are carried out using MATLAB/Simulink.
Keywords: Photovoltaic generators, Maximum power point tracking (MPPT), DC/DC converters, Induction motor, Direct torque control (DTC).
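The abstract does not state which MPPT algorithm the DC/DC converter uses; the sketch below shows the common perturb-and-observe variant on a toy power-voltage curve, with all panel parameters invented.

```python
# Perturb-and-observe MPPT on a toy PV power-voltage curve (parameters invented).

def pv_power(v):
    """Toy P(V) curve with a single maximum of ~100 W near 17.5 V."""
    return max(0.0, 100.0 - 0.8 * (v - 17.5) ** 2)

def perturb_and_observe(v=12.0, step=0.5, iters=30):
    p_prev = pv_power(v)
    direction = +1.0
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:           # power dropped: reverse the perturbation direction
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"operating point ~ {v_mpp:.1f} V, {p_mpp:.1f} W")
```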
8039 Description and Analysis of Embedded Firewall Techniques
Authors: Ahmed Abou Elfarag, A. Baith M., Hassan H. Alkhishali
Abstract:
Since the turn of this century, many researchers have shown interest in Embedded Firewall (EF) implementations. These are not the usual firewalls used as checkpoints at network gateways; rather, they are applied near the hosts that need protection. Hence, by using them, individual or grouped network components can be protected from the inside as well as from external attacks. This paper presents a study of EFs, looking at their architecture and problems. A comparative study assesses how practical each kind is, focusing in particular on its architecture, weak points, and portability. A look at their use by different categories of users is also presented.
Keywords: Embedded Firewall (EF), Network Interface Card (NIC), Virtual Machine Software (VMware), Virtual Firewall (VF).
8038 Modal Analysis for Study of Minor Historical Architecture
Authors: Milorad Pavlovic, Anna Manzato, Antonella Cecchi
Abstract:
Cultural heritage conservation is a challenge for contemporary society. In recent decades, significant resources have been allocated for the conservation and restoration of architectural heritage. Historical buildings have been restored, protected and reinforced with the intent of limiting the risks of degradation or loss due to structural damage and to external factors such as differential settlements, earthquake effects, etc. The wide diffusion of historic masonry constructions in Italy, Europe and the Mediterranean area requires reliable tools for the evaluation of their structural safety. This paper presents a free modal analysis performed on a minor historical building located in the village of Bagno Grande, near the city of L’Aquila in Italy. The location is characterized by a complex urban context that was seriously damaged by the earthquake of 2009. The aim of this work is to check the structural behavior of a masonry building subject to several boundary conditions imposed by adjacent buildings and infrastructural facilities.
Keywords: FEM, masonry, minor historical architecture, modal analysis.
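A free (unforced, undamped) modal analysis reduces to the generalized eigenvalue problem K·φ = ω²·M·φ; the sketch below solves it for a lumped two-degree-of-freedom stand-in with invented stiffness and mass values, far smaller than the building's actual FE model.

```python
import numpy as np
from scipy.linalg import eigh

# Lumped 2-DOF stand-in for a masonry structure (stiffness in N/m, mass in kg; invented).
K = np.array([[3.0e7, -1.0e7],
              [-1.0e7, 1.0e7]])
M = np.diag([8.0e4, 6.0e4])

# Generalized eigenproblem K phi = w^2 M phi; eigh returns eigenvalues in ascending order.
eigvals, modes = eigh(K, M)
freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)

for i, f in enumerate(freqs_hz, start=1):
    print(f"mode {i}: {f:.2f} Hz, shape {np.round(modes[:, i-1], 3)}")
```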
8037 Multilevel Activation Functions For True Color Image Segmentation Using a Self Supervised Parallel Self Organizing Neural Network (PSONN) Architecture: A Comparative Study
Authors: Siddhartha Bhattacharyya, Paramartha Dutta, Ujjwal Maulik, Prashanta Kumar Nandi
Abstract:
The paper describes a self-supervised parallel self-organizing neural network (PSONN) architecture for true color image segmentation. The proposed architecture is a parallel extension of the standard single self-organizing neural network (SONN) architecture and comprises an input (source) layer of image information, three single self-organizing neural network architectures for segmentation of the different primary color components in a color image scene, and one final output (sink) layer for fusion of the segmented color component images. Responses to the different shades of the color components are induced in each of the three single network architectures (meant for component-level processing) by applying a multilevel version of the characteristic activation function, which maps the input color information into different shades of the color components, thereby yielding a processed component color image segmented on the basis of the different shades of component colors. The number of target classes in the segmented image corresponds to the number of levels in the multilevel activation function. Since the multilevel version of the activation function exhibits several subnormal responses to the input color image scene information, the system errors of the three component network architectures are computed from a subnormal linear index of fuzziness of the component color image scenes at the individual level. Several multilevel activation functions are employed for segmentation of the input color image scene using the proposed network architecture. Results of applying the multilevel activation functions to the PSONN architecture are reported for three real-life true color images, and are substantiated empirically with the correlation coefficients between the segmented images and the original images.
Keywords: Colour image segmentation, fuzzy set theory, multi-level activation functions, parallel self-organizing neural network.
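The specific multilevel activation functions compared in the paper are not reproduced here; the function below is a generic multilevel sigmoid that maps a normalized input into a fixed number of graded responses, which is the mechanism the abstract describes.

```python
import numpy as np

def multilevel_sigmoid(x, levels=4, slope=20.0):
    """Generic multilevel activation: a sum of (levels - 1) shifted sigmoids,
    scaled so the response plateaus at `levels` distinct output shades in [0, 1]."""
    thresholds = np.arange(1, levels) / levels          # e.g. 0.25, 0.50, 0.75 for 4 levels
    steps = [1.0 / (1.0 + np.exp(-slope * (x - t))) for t in thresholds]
    return np.sum(steps, axis=0) / (levels - 1)

x = np.linspace(0.0, 1.0, 11)                           # normalized pixel intensity
print(np.round(multilevel_sigmoid(x, levels=4), 2))
```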
8036 Simulation Model of an Ultra-Light Overhead Conveyor System; Analysis of the Process in the Warehouse
Authors: Batin Latif Aylak, Bernd Noche, M. Baran Cantepe, Aydin Karakaya
Abstract:
Ultra-light overhead conveyor systems are rope-based conveying systems with individually driven vehicles. The vehicles can move automatically on the rope, which is realized by means of energy and signals. Ultra-light overhead conveyor systems must always be integrated into a logistical process by finding the best way to achieve a cheaper material flow while guaranteeing precise and fast workflows. This paper analyzes the process of an ultra-light overhead conveyor system using the necessary assumptions. The analysis consists of three scenarios, which raise the vehicle speed in equal increments, and the correlation between vehicle speed and system throughput is investigated. A discrete-event simulation model of an ultra-light overhead conveyor system is constructed using the DOSIMIS-3 software to implement the three scenarios. According to the simulation results, the optimal scenario, and hence the optimal vehicle speed, is identified among the three scenarios. The simulation model thus demonstrates the effect of increased speed on system throughput.
Keywords: Logistics, material flow, simulation, ultra-light overhead conveyor.
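The study uses a DOSIMIS-3 discrete-event model; the deterministic back-of-the-envelope estimate below only mirrors the three equal-increment speed scenarios, with the loop length, handling time, and vehicle count invented.

```python
# Rough throughput estimate for three vehicle-speed scenarios (all figures invented).
LOOP_LENGTH_M = 120.0     # length of the rope circuit
HANDLING_S = 25.0         # load + unload time per cycle
N_VEHICLES = 6

for scenario, speed in enumerate([0.5, 1.0, 1.5], start=1):     # m/s, equal increments
    cycle_s = LOOP_LENGTH_M / speed + HANDLING_S                 # one round trip per vehicle
    throughput_per_h = N_VEHICLES * 3600.0 / cycle_s
    print(f"scenario {scenario}: speed {speed:.1f} m/s -> "
          f"{throughput_per_h:.0f} transports/hour")
```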
8035 A Quality Optimization Approach: An Application on Next Generation Networks
Authors: Gülfem I. Alptekin, S. Emre Alptekin
Abstract:
Next-generation wireless systems, especially cognitive radio networks, aim at utilizing network resources more efficiently by sharing a wide range of available spectrum in an opportunistic manner. In this paper, we propose a quality management model for the short-term sub-lease of unutilized spectrum bands to different service providers. We build our model on a competitive secondary market architecture. To establish the necessary conditions for convergent behavior, we utilize techniques from game theory. Our proposed model is based on a potential game approach, which is suitable for systems with dynamic decision making. The Nash equilibrium point tells the spectrum holders the ideal price values at which profit is maximized at the highest level of customer satisfaction. Our numerical results show that the price decisions of the network providers depend on the price and QoS of their own bands as well as on the prices and QoS levels of their opponents' bands.
Keywords: Cognitive radio networks, game theory, next-generation wireless networks, spectrum management.
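The paper's payoff functions are not given in the abstract; the sketch below runs best-response iteration for two spectrum holders with an invented linear demand model, converging to the kind of price fixed point (Nash equilibrium) that the potential-game formulation guarantees.

```python
import numpy as np

# Invented linear demand: a provider's demand falls in its own price and rises in the rival's.
COST = 1.0
PRICES = np.linspace(1.0, 10.0, 91)          # admissible price grid

def profit(own_price, rival_price):
    demand = max(0.0, 10.0 - 1.5 * own_price + 0.5 * rival_price)
    return (own_price - COST) * demand

def best_response(rival_price):
    return max(PRICES, key=lambda p: profit(p, rival_price))

p1, p2 = 5.0, 5.0
for _ in range(50):                          # iterate best responses until a fixed point
    new_p1, new_p2 = best_response(p2), best_response(p1)
    if (new_p1, new_p2) == (p1, p2):
        break
    p1, p2 = new_p1, new_p2

print(f"approximate Nash equilibrium prices: p1 = {p1:.1f}, p2 = {p2:.1f}")
```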
8034 Using the PGAS Programming Paradigm for Biological Sequence Alignment on a Chip Multi-Threading Architecture
Authors: M. Bakhouya, S. A. Bahra, T. El-Ghazawi
Abstract:
The Partitioned Global Address Space (PGAS) programming paradigm offers ease of use in expressing parallelism through a global shared address space, while emphasizing performance by providing locality awareness through the partitioning of this address space. The interest in PGAS programming languages is therefore growing, and many new languages have emerged and are becoming ubiquitously available on nearly all modern parallel architectures. Recently, new parallel machines with multiple cores have been designed to target high-performance applications. Most of the effort has gone into benchmarking, and there are few examples of real high-performance applications running on multicore machines. In this paper, we present and evaluate a parallelization technique for implementing a local DNA sequence alignment algorithm using a PGAS-based language, UPC (Unified Parallel C), on a chip multithreading architecture, the UltraSPARC T1.
Keywords: Partitioned Global Address Space, Unified Parallel C, multicore machines, multi-threading architecture, sequence alignment.
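The UPC parallelization itself cannot be shown in a few lines; the sketch below is a plain serial Smith-Waterman local-alignment scorer, the kind of dynamic-programming kernel that the paper partitions across UPC threads.

```python
# Serial Smith-Waterman local-alignment score (the kernel the paper parallelizes with UPC).

def smith_waterman(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # Local alignment never goes below zero.
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))   # classic textbook pair
```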
8033 A Framework for Successful TQM Implementation and Its Effect on the Organizational Sustainability Development
Authors: Redha Elhuni, M. Munir Ahmad
Abstract:
The main purpose of this research is to construct a generic model for the successful implementation of Total Quality Management (TQM) in the oil sector, and to find out the effects of this model on the organizational sustainability development (OSD) performance of Libyan oil and gas companies using the structural equation modeling (SEM) approach. The research approach covers both quantitative and qualitative methods. A questionnaire was developed in order to identify the quality factors that Libyan oil and gas companies see as critical to the success of TQM implementation. Hypotheses were developed to evaluate the impact of TQM implementation on OSD. The data analysis reveals a significant positive effect of TQM implementation on OSD, and 24 quality factors are found to be critical and absolutely essential for successful TQM implementation. The results generated a structure for the TQMSD implementation framework based on the four major road map constructs (top management commitment, employee involvement and participation, customer-driven processes, and continuous improvement culture).
Keywords: TQM, CQFs, Oil & Gas, OSD, Libya.
8032 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities
Authors: A. Appe, B. Poluparthi, L. Kasivajjula, U. Mv, S. Bagadi, P. Modi, A. Singh, H. Gunupudi, S. Troiano, J. Paul, J. Stovall, J. Yamamoto
Abstract:
The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify the factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is therefore of key importance. This pilot study aims at developing a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve the facility's market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and data spanning 60 key facilities in Washington State and about three years of history are considered. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, and a regression approach to evaluate and predict market share. We leverage a model-agnostic technique, SHAP (SHapley Additive exPlanations), to quantify the relative importance of the features impacting market share. Typical techniques in the literature for quantifying the degree of competitiveness among facilities use an empirical method to calculate a competitive factor that interprets the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust because it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (e.g., quantifying the patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify the key drivers of market share at an overall level, the permutation feature importance of the attributes is calculated; for relative quantification of features at the facility level, SHAP, a model-agnostic explainer, is incorporated. This helps to identify and rank the attributes at each facility that impact its market share. The approach thus amalgamates two popular and efficient modeling practices, machine learning with graphs and tree-based regression techniques, to reduce bias and drive strategic business decisions.
Keywords: Competition, DAGs, hospital, healthcare, machine learning, market share, random forest, SHAP.
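A hedged sketch of the regression-plus-importance part of the pipeline on synthetic data; competitor identification via DAGs and the SHAP facility-level explanations are omitted, and the feature names are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for facility-level features and market share (not the study's data).
n = 400
X = rng.normal(size=(n, 3))                         # e.g. bed count, referrals, distance index
share = 0.4 * X[:, 0] - 0.2 * X[:, 2] + 0.05 * rng.normal(size=n)

X_train, X_test, y_train, y_test = train_test_split(X, share, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Overall key drivers via permutation importance (SHAP would add per-facility detail).
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, imp in zip(["bed_count", "referrals", "distance_index"], result.importances_mean):
    print(f"{name}: {imp:.3f}")
print("R^2 on held-out facilities:", round(model.score(X_test, y_test), 3))
```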
8031 Enhancement Throughput of Unplanned Wireless Mesh Networks Deployment Using Partitioning Hierarchical Cluster (PHC)
Authors: Ahmed K. Hasan, A. A. Zaidan, Anas Majeed, B. B. Zaidan, Rosli Salleh, Omar Zakaria, Ali Zuheir
Abstract:
Wireless mesh networks (WMNs) based on IEEE 802.11 technology are a scalable and efficient solution for next-generation wireless networking to provide wide-area wideband internet access to a significant number of users. These wireless mesh networks may be deployed by different authorities and without any planning, so they may overlap partially or completely in the same service area, and such unplanned deployment degrades their performance. The aim of the proposed model is to enhance the throughput of unplanned wireless mesh network deployments using a Partitioning Hierarchical Cluster (PHC). We use a throughput optimization approach to model the unplanned WMN deployment problem on a PHC-based architecture, and bridge nodes that allow interworking traffic between these WMNs are used as the solution to the performance degradation.
Keywords: Wireless mesh networks, 802.11s internetworking, Partitioning Hierarchical Cluster.
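The PHC construction in the paper is not reproduced here; the sketch below shows only a generic partitioning step, hierarchically clustering invented mesh-router coordinates with SciPy and cutting the tree into a fixed number of clusters, between which bridge nodes would carry interworking traffic.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(7)

# Invented 2-D positions of mesh routers in an unplanned, overlapping deployment.
nodes = np.vstack([rng.normal(loc, 8.0, size=(15, 2)) for loc in ([0, 0], [40, 10], [20, 45])])

# Agglomerative (hierarchical) clustering, then cut the tree into 3 partitions.
tree = linkage(nodes, method="ward")
labels = fcluster(tree, t=3, criterion="maxclust")

for cluster_id in sorted(set(labels)):
    members = np.where(labels == cluster_id)[0]
    print(f"cluster {cluster_id}: {len(members)} routers, e.g. nodes {members[:4].tolist()}")
```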
8030 CPU Architecture Based on Static Hardware Scheduler Engine and Multiple Pipeline Registers
Authors: Ionel Zagan, Vasile Gheorghita Gaitan
Abstract:
The development of CPUs, and of the real-time systems based on them, has made it possible to handle time at increasingly fine resolutions. Together with the scheduling methods and algorithms, time management has been improved so as to respond to the need for optimization and to the way in which the CPU is used. This paper contains both a detailed theoretical description and the results obtained from research on improving the performance of the nMPRA (Multi Pipeline Register Architecture) processor by implementing specific functions in hardware. The proposed CPU architecture has been developed, simulated and validated on an FPGA Virtex-7 circuit via an SoC project. Although the hardware structure of the nMPRA processor with five pipeline stages is very complex, this paper presents and analyzes the tests dedicated to the implementation of the CPU and of the on-chip memory for instructions and data. In order to practically implement and test the entire SoC project, various tests were performed to verify the peripheral drivers and the boot module, named Bootloader.
Keywords: Hardware scheduler, nMPRA processor, real-time systems, scheduling methods.
8029 A Preference-Based Multi-Agent Data Mining Framework for Social Network Service Users' Decision Making
Authors: Ileladewa Adeoye Abiodun, Cheng Wai Khuen
Abstract:
Multi-Agent Systems (MAS) emerged in the pursuit of improving our standard of living, and hence can manifest complex human behaviors such as communication, decision making, negotiation and self-organization. Social Network Services (SNSs) have attracted millions of users, many of whom have integrated these sites into their daily practices. The domains of MAS and SNS have many similarities, such as architecture, features and functions. Exploring social network users' behavior through a multi-agent model is therefore our research focus, in order to generate more accurate and meaningful information for SNS users. One application of MAS is the e-Auction and e-Rental services of the Universiti Cyber AgenT (UniCAT), a social network for students at Universiti Tunku Abdul Rahman (UTAR), Kampar, Malaysia, built around the Belief-Desire-Intention (BDI) model. However, in spite of its various advantages, the BDI model has also been found to have some shortcomings. This paper therefore proposes a multi-agent framework utilizing a modified BDI model, Belief-Desire-Intention in Dynamic and Uncertain Situations (BDIDUS), using the UniCAT system as a case study.
Keywords: Distributed Data Mining, Multi-Agent Systems, Preference-Based, SNS.
8028 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious and, as a result, is generally avoided; however, in many instances VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system requires neither special services from the operating system nor modification of the operating system kernel.
Keywords: I/O management, out-of-core, compiler, tile mapping.
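Comanche's compiler transformations are not shown; the sketch below illustrates only the underlying idea of tiled access, streaming a large on-disk array through memory one block at a time with numpy.memmap instead of relying on VMM paging.

```python
import numpy as np

# Create a large-ish array on disk, then process it tile by tile instead of loading it whole.
ROWS, COLS, TILE = 10_000, 1_000, 1_000

data = np.memmap("big_matrix.dat", dtype=np.float32, mode="w+", shape=(ROWS, COLS))
data[:] = 1.0                                     # stand-in for real out-of-core data
data.flush()

total = 0.0
view = np.memmap("big_matrix.dat", dtype=np.float32, mode="r", shape=(ROWS, COLS))
for start in range(0, ROWS, TILE):
    tile = np.asarray(view[start:start + TILE])   # explicitly pull one tile into memory
    total += float(tile.sum())                    # per-tile work happens here
print("sum over all tiles:", total)
```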
8027 Knowledge Transfer and the Translation of Technical Texts
Authors: Ahmed Alaoui
Abstract:
This paper contributes to the ongoing debate on the relevance of translation studies to professional practitioners. It exposes the various misconceptions permeating the links between theory and practice in the translation landscape of the Arab World. It is a thesis of this paper that specialization in translation should be redefined, taking account of the fact that specialized knowledge alone is neither crucial nor sufficient in technical translation; it should also be tested against the readability of the translated text, the appropriateness of its style, and the usability of its content by end-users to carry out their intended tasks. The paper also proposes a preliminary model to establish a working link between theory and practice from the perspective of professional trainers and practitioners, calling for the latter to participate in the production of knowledge in a systematic fashion. While this proposal is driven by a rather intuitive conviction, a line of research is needed to specify the methodological moves for establishing the mediation strategies that would relate the components in the model of knowledge transfer proposed in this paper.
Keywords: Knowledge transfer, misconceptions, specialized texts, translation theory, translation practice.
8026 A Study of Recent Contribution on Simulation Tools for Network-on-Chip
Authors: Muthana Saleh Alalaki, Michael Opoku Agyeman
Abstract:
The growth in the number of Intellectual Properties (IPs), or the number of cores, on the same chip has become a critical issue in System-on-Chip (SoC) design due to the intra-chip communication problem between the chip elements. As a result, Network-on-Chip (NoC) has emerged as a system architecture to overcome intra-chip communication issues. This paper presents a study of recent contributions on simulation tools for NoC. Furthermore, an overview of NoC is given, as well as a comparison of several NoC simulators, to help facilitate research in on-chip communication.
Keywords: Network-on-Chip, System-on-Chip, embedded systems, computer architecture.
8025 Free Convection in a Darcy Thermally Stratified Porous Medium That Embeds a Vertical Wall of Constant Heat Flux and Concentration
Authors: Maria Neagu
Abstract:
This paper presents the heat- and mass-driven natural convection succession in a Darcy thermally stratified porous medium that embeds a vertical semi-infinite impermeable wall of constant heat flux and concentration. A scale analysis of the system determines the two possible maps of the heat- and mass-driven natural convection sequence along the wall as a function of the process parameters. These results are verified using the finite difference method applied to the conservation equations.
Keywords: Finite difference method, natural convection, porous medium, scale analysis, thermal stratification.
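The paper's exact non-dimensional formulation is not given in the abstract; the equations below are a standard Darcy-Boussinesq model for combined heat- and mass-driven convection, written only to fix notation for the conservation equations mentioned above.

```latex
% Standard Darcy-Boussinesq model for double-diffusive free convection
% (illustrative notation; the paper's own scaling may differ).
\begin{align}
  \mathbf{u} &= -\frac{K}{\mu}\left(\nabla p - \rho \mathbf{g}\right), &
  \rho &= \rho_0\left[1 - \beta_T (T - T_\infty) - \beta_C (C - C_\infty)\right], \\
  \nabla \cdot \mathbf{u} &= 0, \\
  \sigma \frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T &= \alpha_m \nabla^2 T, &
  \varepsilon \frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C &= D \nabla^2 C,
\end{align}
% with a constant heat flux and a constant concentration imposed at the vertical wall,
% and a linearly stratified ambient temperature far from it.
```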