Search results for: reliable information
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12302

12152 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, adopting tools that support decision-making from a global perspective of the system under study has become essential. This article shows how a concept-based model can organize and represent reality appropriately, providing accurate and timely information; these features make this kind of model an ideal component to support an information system, recognizing that such information is essential for identifying the particularities that allow better performance in the sector evaluated.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 496
12151 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges

Authors: T. Gayen

Abstract:

Although several software reliability models exist today, there is still no versatile model that can be used for the reliability assessment of all software. Complex software has a large number of states (unlike hardware), so it is practically difficult to test the software completely. Irrespective of the amount of testing done, it is sometimes extremely difficult to assure that the final software product is fault free. Black-box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Mission-critical applications need to be highly reliable, yet it is not always possible to ensure the development of a highly reliable system. Hence, in order to achieve fault-free operation of software, mechanisms are developed to handle faults remaining in the system even after development. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be suitable for every system. This discussion therefore focuses on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of software systems.

Keywords: black box, fault tolerance, failure, software reliability

Procedia PDF Downloads 426
12150 Evaluation of Hydrogen Particle Volume on Surfaces of Selected Nanocarbons

Authors: M. Ziółkowska, J. T. Duda, J. Milewska-Duda

Abstract:

This paper describes an approach to modeling adsorption phenomena aimed at specifying the adsorption mechanisms on localized or nonlocalized adsorbent sites when applied to nanocarbons. The concept comes from the fundamental thermodynamic description of adsorption equilibrium and is based on numerical calculations of the volume of hydrogen particles adsorbed on the surface of selected nanocarbons: a single-walled nanotube and a nanocone. This approach makes it possible to determine the adsorption mechanism and, as a consequence, to select an appropriate mathematical adsorption model, thus allowing for a more reliable identification of the material's porous structure. The theoretical basis of the approach is discussed, and newly derived results of the numerical calculations are presented for the selected nanocarbons.

Keywords: adsorption, mathematical modeling, nanocarbons, numerical analysis

Procedia PDF Downloads 268
12149 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is basic for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate seasonal autoregressive integrated moving average (SARIMA) model was applied to forecast Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and a diagnostic check of the model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike (AICc) and Hannan-Quinn (HQc) information criteria, SARIMA(3,0,2)x(3,1,3)12 was selected as the best model for Waterval River flow forecasting. Therefore, this model can be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA model can also be used for forecasting other similar univariate time series with seasonal characteristics.
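The seasonal differencing step described above can be sketched in plain Python (an illustrative fragment, not the authors' GRETL workflow; libraries such as statsmodels provide full SARIMA identification and fitting):

```python
def seasonal_difference(series, period=12):
    """Subtract the value one season (e.g. 12 months) earlier,
    removing a repeating seasonal pattern before SARIMA fitting."""
    if len(series) <= period:
        raise ValueError("series shorter than one season")
    return [series[i] - series[i - period] for i in range(period, len(series))]

# A purely periodic series differences to zero (hypothetical monthly flows):
flows = [5, 8, 12, 20, 15, 9, 4, 3, 2, 2, 3, 4] * 3   # three identical "years"
print(seasonal_difference(flows))   # 24 zeros: the seasonality is removed
```

Any remaining structure after this step is what the non-seasonal AR and MA terms of the SARIMA model are left to explain.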

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 205
12148 Characterization of Onboard Reliable Error Correction Code for SDRAM Controller

Authors: N. Pitcheswara Rao

Abstract:

In the process of conveying information, there is a chance of the signal being corrupted, which leads to erroneous bits in the message. The message may contain single, double, or multiple bit errors. In high-reliability applications, memory can sustain multiple soft errors due to single or multiple event upsets caused by environmental factors. The traditional Hamming code with SEC-DED (single error correction, double error detection) capability cannot address these types of errors. It is possible to use a powerful non-binary BCH code such as the Reed-Solomon code to address multiple errors. However, it can take at least a couple of dozen cycles of latency to complete the first correction, and it runs at a relatively slow speed. In order to overcome these drawbacks, i.e., to increase speed and reduce latency, the Reed-Muller code is used.
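To make the single-error limitation concrete, here is a minimal Hamming(7,4) encode/correct sketch (illustrative only; a real SEC-DED SDRAM controller adds an overall parity bit and works on wider data words). It corrects any one flipped bit but, as the abstract notes, cannot handle multi-bit upsets:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword
    laid out as [p1, p2, d1, p3, d2, d3, d4] (1-based positions 1..7)."""
    p1 = d[0] ^ d[1] ^ d[3]          # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]          # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]          # covers positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_correct(c):
    """Correct at most one flipped bit and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the error, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
cw = hamming74_encode(data)
cw[4] ^= 1                           # flip one bit in transit
print(hamming74_correct(cw))         # -> [1, 0, 1, 1]: single error corrected
```

Flipping two bits produces a nonzero syndrome pointing at the wrong position, which is exactly the failure mode that motivates the stronger Reed-Solomon and Reed-Muller codes discussed above.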

Keywords: SEC-DED, BCH code, Reed-Solomon code, Reed-Muller code

Procedia PDF Downloads 428
12147 Integration of Multi-Layer Security Modeling with Fuzzy Logic in Service-Oriented Architectures

Authors: Zeinab Ranjbar

Abstract:

Service-oriented architecture (SOA) is proposed in today's world for the exchange of information and services among interested parties such as IT managers, business managers, designers, and system builders. The basic architecture of the software is used to provide services to all users. The concern of all these parties is the effectiveness of this model and how reliable it is for secure transactions. To increase its reliability, multi-layer fuzzy logic architectures are used.
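The fuzzy logic layers mentioned above are built from membership functions; the triangular shape below is an assumption for illustration, since the abstract does not specify which membership functions the architecture uses:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: degree 0 at a and c, rising to 1 at b.
    E.g. 'transaction reliability is medium' could peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Membership of a hypothetical reliability score in the fuzzy set 'medium',
# defined here (arbitrarily) over the range 0..10 with its peak at 5:
print(triangular(5.0, 0.0, 5.0, 10.0), triangular(2.5, 0.0, 5.0, 10.0))
```

Each security layer would evaluate such memberships and combine them with fuzzy rules to grade how trustworthy a transaction is.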

Keywords: SOA, service oriented architecture, fuzzy logic, multi layer, SOA security

Procedia PDF Downloads 386
12146 Econometric Analysis of Organic Vegetable Production in Turkey

Authors: Ersin Karakaya, Halit Tutar

Abstract:

Reliable foods must be consumed for healthy nutrition. The production and dissemination of organic products in Turkey is rapidly evolving on the basis of preserving the ecological balance, ensuring sustainability in agriculture, and offering quality, reliable products to consumers. In this study, values for Turkey over the years 2002-2015, such as the cultivated land devoted to organic vegetable production, production levels, production quantity, number of products, and number of farmers, are determined. The aim is an econometric analysis of the factors affecting organic vegetable production (number of products, number of farmers, and cultivated land). The main material of the study is secondary data on organic vegetable production in Turkey for the 2002-2015 period, and the regression analysis of the factors affecting the value of organic vegetable production is carried out with the least squares method using the EViews statistical software package.
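The least squares step can be sketched with NumPy; the data below are invented for illustration and are not the study's 2002-2015 series:

```python
import numpy as np

# Hypothetical yearly rows: [intercept, cultivated land, farmers, products].
X = np.array([[1.0, 100, 30,  8],
              [1.0, 200, 45, 12],
              [1.0, 300, 50, 13],
              [1.0, 400, 70, 19],
              [1.0, 500, 80, 25]])
true_beta = np.array([10.0, 1.0, 2.0, 3.0])   # illustrative coefficients
y = X @ true_beta                              # production value (no noise)

# Ordinary least squares: minimize ||X @ beta - y||^2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # recovers [10., 1., 2., 3.] (up to floating point)
```

On real data, y would not sit exactly in the column space of X, and the residuals and t-statistics reported by EViews would measure how well each factor explains the production value.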

Keywords: number of farmers, cultivated land, EViews, Turkey

Procedia PDF Downloads 307
12145 Factors Affecting Online Health Seeking Behaviors in Middle-Income Class Filipino Adults

Authors: Reinzo Vittorio B. Cardenas, Heather Venice L. Abogado, Andrea Therese V. Afable, Rhea D. Avillanoza, Marie Abegail P. Ayagan, Catherine D. Bantayan

Abstract:

As the Internet provides fast and reliable health-related information, the tendency to self-diagnose increases, both to better understand the medical jargon in a physician's diagnosis and to reduce costly consultation fees. The study aimed to explore and understand the factors affecting online health-seeking behaviors of middle-income class adults in Metro Manila. It was conducted from March to April 2021 with a sample size of 200 individuals aged 20 to 49 years old and was delivered via an online survey that used a questionnaire adapted from the research of Lee et al. (2015). Specifically, the survey consisted of three sections: assessing web-based health-seeking behaviors, consultation with health professionals, and participants' hesitancy to consult with physicians, using a mix of a 5-point Likert-type scale, multiple responses, and multiple-choice options. The results showed that the age and educational attainment of the respondents had a negative effect on health-seeking behavior, while socio-economic status had a positive effect. Lastly, there was a significant effect of participants' hesitancy toward professional consultation on their health-seeking behavior. The results gleaned from the study indicate that various individual and socio-economic factors may significantly affect one's health-seeking behaviors. Although hesitancy had a significant effect on the spectrum of health-seeking behaviors, this does not imply that certain factors are specifically related to an individual's tendency to seek health information. This information instead becomes essential in understanding the patient-physician relationship and giving patients more holistic treatment.

Keywords: health-seeking behavior, health information, Internet, physician consultation

Procedia PDF Downloads 216
12144 Characterization of Onboard Reliable Error Correction Code for SDRAM Controller

Authors: Pitcheswara Rao Nelapati

Abstract:

In the process of conveying information, there is a chance of the signal being corrupted, which leads to erroneous bits in the message. The message may contain single, double, or multiple bit errors. In high-reliability applications, memory can sustain multiple soft errors due to single or multiple event upsets caused by environmental factors. The traditional Hamming code with SEC-DED (single error correction, double error detection) capability cannot address these types of errors. It is possible to use a powerful non-binary BCH code such as the Reed-Solomon code to address multiple errors. However, it can take at least a couple of dozen cycles of latency to complete the first correction, and it runs at a relatively slow speed. In order to overcome these drawbacks, i.e., to increase speed and reduce latency, the Reed-Muller code is used.

Keywords: SEC-DED, BCH code, Reed-Solomon code, Reed-Muller code

Procedia PDF Downloads 429
12143 Investigation of Information Security Incident Management Based on International Standard ISO/IEC 27002 in Educational Hospitals in 2014

Authors: Nahid Tavakoli, Asghar Ehteshami, Akbar Hassanzadeh, Fatemeh Amini

Abstract:

Introduction: The information security incident management guidelines were developed to help hospitals meet their information security event and incident management requirements. The purpose of this study was to investigate information security incident management in Isfahan's educational hospitals with respect to the ISO/IEC 27002 standard. Methods: This was a cross-sectional study of the information security incident management of educational hospitals in 2014. Based on the ISO/IEC 27002 standard, two checklists were applied to check compliance with the standard on reporting information security events and weaknesses and on the management of information security incidents and improvements. One inspector was trained to carry out the assessments in the hospitals. The data were analyzed with SPSS. Findings: In general, the compliance score for the information security incident management requirements across the two steps, reporting information security events and weaknesses and managing information security incidents and improvements, was 60%. There was a significant difference in compliance levels among the hospitals.

Keywords: information security incident management, information security management, standards, hospitals

Procedia PDF Downloads 575
12142 Extraction of Urban Building Damage Using Spectral, Height and Corner Information

Authors: X. Wang

Abstract:

Timely and accurate information on urban building damage caused by an earthquake is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery, containing abundant fine-scale information, offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since building damage, intact buildings, and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that the height or geometric structure of buildings generally changes dramatically in devastated areas, a novel multi-stage urban building damage detection method, using bi-temporal spectral, height, and corner information, was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and by errors in extracting height information, ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event data was used for detecting building damage showing significant height change, while the difference in the density of corners between pre- and post-event images was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining the above two results.
Finally, a post-processing procedure was adopted to refine the initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image, and post-event LiDAR data. The results showed that the method proposed in this study significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse, and it is also applicable to related applications.
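The height-difference criterion at the heart of the first damage test can be sketched on toy rasters (the 3 m threshold and the arrays below are assumptions for illustration, not values from the study):

```python
import numpy as np

def height_change_damage(pre_dsm, post_dsm, drop_threshold=3.0):
    """Flag cells whose surface height dropped by more than the threshold
    (metres) between the pre-event and post-event surface models."""
    return (pre_dsm - post_dsm) > drop_threshold

pre = np.array([[10.0, 10.0],
                [ 2.0,  8.0]])       # pre-event heights (e.g. stereo DSM)
post = np.array([[10.0,  4.0],
                 [ 2.0,  7.5]])      # post-event heights (e.g. LiDAR DSM)
print(height_change_damage(pre, post))   # only the 6 m collapse is flagged
```

The corner-density criterion works analogously, differencing per-cell corner counts instead of heights, and the two masks are combined into the initial damage map.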

Keywords: building damage, corner, earthquake, height, very high resolution (VHR)

Procedia PDF Downloads 213
12141 Analyzing On-Line Process Data for Industrial Production Quality Control

Authors: Hyun-Woo Cho

Abstract:

The monitoring of industrial production quality has to be implemented to give early warning of unusual operating conditions. Furthermore, identification of their assignable causes is necessary for quality control purposes. For such tasks, many multivariate statistical techniques have been applied and shown to be quite effective tools. This work presents a process data-based monitoring scheme for production processes. For more reliable results, additional steps of noise filtering and preprocessing are considered, which may lead to enhanced performance by eliminating unwanted variation in the data. The performance evaluation is executed using data sets from test processes. The proposed method is shown to provide reliable quality control results and is thus more effective for quality monitoring in the example. For practical implementation of the method, an on-line data system must be available to gather historical and on-line data. Recently, large amounts of data have been collected on-line in most processes, so implementation of the current scheme is feasible and does not impose additional burdens on users.
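A minimal version of the filter-then-alarm idea can be sketched as follows; a moving average with a 3-sigma band is only a univariate stand-in for the multivariate techniques the abstract refers to, and the signal and limits are invented for illustration:

```python
def moving_average(xs, window=3):
    """Simple noise filter: average over a sliding window."""
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

def out_of_control(xs, mean, sigma, k=3.0):
    """Indices where the (filtered) signal leaves the k-sigma control band."""
    return [i for i, x in enumerate(xs) if abs(x - mean) > k * sigma]

signal = [10.0, 10.2, 9.9, 10.1, 10.0, 25.0, 10.1, 9.8]  # one injected fault
smooth = moving_average(signal)
print(out_of_control(smooth, mean=10.0, sigma=0.5))      # -> [3, 4, 5]
```

The fault at one raw sample smears across three filtered points, showing why the filtering step and the alarm limits have to be tuned together.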

Keywords: detection, filtering, monitoring, process data

Procedia PDF Downloads 559
12140 The Implementation of Information Security Audits in Public Sector: Perspective from Indonesia

Authors: Nur Imroatun Sholihat, Gresika Bunga Sylvana

Abstract:

Currently, cyber attacks have become an incredibly serious problem due to their increasing trend all over the world. Therefore, information security has become prominent for every organization, including public sector organizations. In Indonesia, unfortunately, the Ministry of Finance (MoF) is the only public sector organization that has formally established a procedure to assess its information security adequacy by performing information security audits (as of November 2017). We assess the implementation of information security audits in the MoF using qualitative data obtained by interviewing IT auditors and by analyzing related documents. For this reason, information security audit practice in the MoF could become an acceptable benchmark for all other public sector organizations in Indonesia. This study is important because, to the best of the authors' knowledge, no research into information security audit practice in Indonesia's public sector has been published yet. Results showed that information security audits are performed mostly by carrying out penetration tests (pentests) on the MoF's critical applications.

Keywords: information security audit, information technology, Ministry of Finance of Indonesia, public sector organization

Procedia PDF Downloads 237
12139 Improved Performance Scheme for Joint Transmission in Downlink Coordinated Multi-Point Transmission

Authors: Young-Su Ryu, Su-Hyun Jung, Myoung-Jin Kim, Hyoung-Kyu Song

Abstract:

In this paper, an improved performance scheme for joint transmission is proposed for the downlink (DL) coordinated multi-point (CoMP) system under constrained transmission power. In this scheme, the serving transmission point (TP) requests joint transmission from the other TPs and selects one precoding technique according to channel state information (CSI) from the user equipment (UE). The simulation results show that the bit error rate (BER) and throughput performance of the proposed scheme provide high spectral efficiency and reliable data at the cell edge.
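Among the precoders named in the keywords, zero-forcing is the simplest to sketch. The fragment below is a generic textbook construction (not necessarily the exact scheme the serving TP selects): it inverts the channel so each UE sees only its own stream, ignoring power normalization:

```python
import numpy as np

def zero_forcing_precoder(H):
    """Right pseudo-inverse of the channel: H @ W becomes the identity,
    cancelling inter-user interference. H is (users x tx_antennas)."""
    return H.conj().T @ np.linalg.inv(H @ H.conj().T)

rng = np.random.default_rng(0)
H = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))  # 2 UEs, 4 antennas
W = zero_forcing_precoder(H)
print(np.round(H @ W, 6))   # ~ 2x2 identity: interference nulled
```

The cost of this interference nulling is a power penalty on badly conditioned channels, which is why the scheme above switches precoders based on the reported CSI.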

Keywords: CoMP, joint transmission, minimum mean square error, zero-forcing, zero-forcing dirty paper coding

Procedia PDF Downloads 553
12138 An Introduction to Corporate Financial Reporting Practices in India

Authors: Pradip Kumar Das

Abstract:

India is a developing country and is also one of the most industrialized developing countries of the world. In the post-independence period, industry has grown rapidly in India, and with industrialization the corporate sector in the country has been growing day after day. Nowadays, investment is not limited to large shareholders alone; apart from the shareholders, the common people of society have also started investing in shares of the corporate sector. Thus, the responsibilities of the corporate sector have increased greatly. Corporate financial reporting refers to a system that provides valuable information to different types of users in society for making resourceful decisions with regard to investment policy, organizational creditworthiness, profitability, liquidity, provision for taxation, etc. The quality of the information available to different users fosters the efficient allocation of resources, which is vital for the economic development of a country like India. It is the responsibility of the management of the corporate sector to convey reliable and authentic information with the help of generally accepted accounting principles. The information that corporate sectors disclose through annual reports should be sufficient to bring out the salient features of business performance and other activities. However, the disclosure practices of the corporate sector through annual reports have undergone several major changes from time to time. Many of these vital changes concern the fashion of presenting information in the annual reports and the addition of many non-statutory disclosures. Very often, the managements of corporate sectors are blamed for concealing the true picture, which is not desirable at all. Corporate financial reporting practice, which in the current period has gained a place of prime importance, suffers from certain limitations and invites questions from the public about its reliability.
Thus, the gap created by management between the exhibited picture and the real picture sometimes reaches such an extent that the purpose of the reporting practice loses its importance. The requirement of full and adequate disclosure of information, including information relating to human resources, in annual reports in the free trade economy of India helps prospective investors select the best portfolio for their investments. This paper is a modest attempt by the author to highlight the corporate reporting practices followed in India. A cursory glance at the conceptual study shows the limitations, along with the reliability, of the reporting practices and suggests measures to overcome the shortcomings of financial reporting practices.

Keywords: corporate enterprise, cursory glance, portfolio, yawning gap

Procedia PDF Downloads 415
12137 Calibration of Discrete Element Method Parameters for Modelling DRI Pellets Flow

Authors: A. Hossein Madadi-Najafabadi, Masoud Nasiri

Abstract:

The discrete element method (DEM) is a powerful technique for numerically modeling the flow of granular materials such as direct reduced iron (DRI). It enables the study of processes and equipment related to the production and handling of the material. However, the characteristics and properties of the granules have to be adjusted precisely to achieve reliable results in a DEM simulation. The main properties for DEM simulation are size distribution, density, Young's modulus, Poisson's ratio, and the contact coefficients of restitution, rolling friction, and sliding friction. In the present paper, these properties are determined for DEM simulation of DRI pellets. A reliable DEM simulation would contribute to optimizing the handling system of DRI in an iron-making plant. Among the mentioned properties, Young's modulus is the most important parameter, and it is usually hard to obtain for particulate solids. Here, a special method is utilized to precisely determine this parameter for DRI.
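Of the contact coefficients listed above, the coefficient of restitution is commonly calibrated with a simple drop test. The relation below is the generic free-fall formula, not the authors' special method for Young's modulus:

```python
def restitution_from_drop_test(drop_height, rebound_height):
    """Coefficient of restitution from a free-fall drop test.
    Impact and rebound speeds scale with sqrt(height), so
    e = v_out / v_in = sqrt(h_rebound / h_drop)."""
    if not 0.0 <= rebound_height <= drop_height:
        raise ValueError("rebound must be between 0 and the drop height")
    return (rebound_height / drop_height) ** 0.5

# A pellet dropped from 1.0 m that rebounds to 0.25 m:
print(restitution_from_drop_test(1.0, 0.25))   # -> 0.5
```

In a calibration workflow, the measured value would then be fed into the DEM contact model and checked against bulk behavior such as the angle of repose.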

Keywords: discrete element method, direct reduced iron, simulation parameters, granular material

Procedia PDF Downloads 180
12136 A Secure System for Handling Information from Heterogeneous Sources

Authors: Shoohira Aftab, Hammad Afzal

Abstract:

Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only provides better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous information sources. The problem of providing a consolidated view of information can be handled using semantic data (information stored in such a way that it is understandable by machines and can be integrated without manual human intervention). However, integrating information using semantic web technology without enforcing any access management will result in increased privacy and confidentiality concerns. In this research, we have designed and developed a framework that allows information from heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We have also devised an access control system for defining explicit privacy constraints. We designed and applied our framework to both semantic and non-semantic data from heterogeneous sources. Our approach is validated using scenario-based testing.
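The explicit privacy constraints described above can be pictured as a minimal access-control check; the roles, resources, and `ACL` table below are hypothetical, invented only for illustration:

```python
# Hypothetical access-control list: resource -> roles allowed to read it.
ACL = {
    "dataset_A": {"analyst", "admin"},
    "dataset_B": {"admin"},            # more sensitive: admins only
}

def can_read(user_role, resource):
    """True if the role is explicitly granted read access to the resource.
    Unknown resources are denied by default (deny-by-default policy)."""
    return user_role in ACL.get(resource, set())

print(can_read("analyst", "dataset_A"), can_read("analyst", "dataset_B"))
```

A production system layered over integrated semantic data would express such constraints declaratively and evaluate them at query time, but the deny-by-default shape is the same.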

Keywords: information integration, semantic data, interoperability, security, access control system

Procedia PDF Downloads 356
12135 Review of Dielectric Permittivity Measurement Techniques

Authors: Ahmad H. Abdelgwad, Galal E. Nadim, Tarek M. Said, Amr M. Gody

Abstract:

The prime objective of this manuscript is to provide an intensive review of the techniques used for permittivity measurements. The measurement technique relevant for a desired application depends on the nature of the measured dielectric material, both electrically and physically, the degree of accuracy required, and the frequency of interest. Although many different types of instruments can be utilized, the devices considered are those that provide reliable determinations of the required electrical properties of the unknown material in the frequency range of interest. The challenge in making precise dielectric property or permittivity measurements lies in designing the material specimen holder for those measurements (in the RF and microwave frequency ranges) and adequately modeling the circuit for reliable computation of the permittivity from the electrical measurements. If the RF circuit parameters, such as the impedance or admittance, are estimated appropriately at a certain frequency, the material's permittivity at this frequency can be estimated from the equations that relate how the dielectric properties of the material affect the parameters of the circuit.
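As a concrete instance of that last relation, here is a sketch for an idealized parallel-plate fixture, where the measured admittance Y = G + jωC maps directly to the complex relative permittivity (the fixture geometry and values are assumptions; waveguide, coaxial-probe, and resonator methods use different circuit models):

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

def permittivity_from_admittance(G, C, area, gap, omega):
    """Recover (eps', eps'') of a dielectric filling an ideal parallel-plate
    fixture (electrode area `area` m^2, spacing `gap` m, no fringing fields)
    from its measured conductance G (S) and capacitance C (F) at omega rad/s."""
    eps_prime = C * gap / (EPS0 * area)            # storage (real part)
    eps_loss = G * gap / (omega * EPS0 * area)     # loss factor
    return eps_prime, eps_loss

area, gap = 2.0e-4, 1.0e-3          # hypothetical fixture geometry
omega = 2 * math.pi * 1.0e6         # 1 MHz
# Forward model for a material with eps' = 2.5 and eps'' = 0.01:
C = 2.5 * EPS0 * area / gap
G = 0.01 * omega * EPS0 * area / gap
print(permittivity_from_admittance(G, C, area, gap, omega))   # ~ (2.5, 0.01)
```

The round trip works only because the circuit model matches the fixture exactly; mis-modeling the holder is precisely the error source the review warns about.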

Keywords: dielectric permittivity, free space measurement, waveguide techniques, coaxial probe, cavity resonator

Procedia PDF Downloads 369
12134 A Situational Awareness Map for Allocating Relief Resources after Earthquake Occurrence

Authors: Hamid Reza Ranjbar, Ali Reza Azmoude Ardalan, Hamid Dehghani, Mohammad Reza Sarajian

Abstract:

Natural disasters are unexpected events that are difficult to predict. An earthquake is one of the most devastating natural hazards, with a high mortality rate and a wide extent of damage. After an earthquake occurs, managing the critical situation and allocating limited relief resources require complete awareness of the damaged area. The information for allocating relief teams should be as precise and reliable as possible and be presented at the appropriate time after the earthquake. This type of information was previously presented in the form of a damage map; conducting relief teams by using a damage map alone often leads to wasted time in finding living occupants under the rubble. In this research, a proposed standard for prioritizing damaged buildings in terms of the need for rescue and relief is presented. This standard prioritizes damaged buildings into four levels of priority (very high, high, moderate, and low) by considering key parameters such as the type of land use, the activity and inactivity times of each land use, the time of earthquake occurrence, and a distinct index. The priority map produced using the proposed standard could be a basis for guiding relief teams towards the areas with high relief priority.
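A four-level standard of this kind could be encoded as a simple rule. The cut-offs below are hypothetical, since the abstract does not publish the exact weighting of land use and activity time:

```python
def relief_priority(occupancy_probability):
    """Map an estimated probability that a damaged building was occupied
    at the time of the earthquake (derived from land use and activity time)
    to one of the four priority levels. Cut-offs are illustrative only."""
    if occupancy_probability >= 0.75:
        return "very high"
    if occupancy_probability >= 0.5:
        return "high"
    if occupancy_probability >= 0.25:
        return "moderate"
    return "low"

# e.g. a residential building at night vs. an office block at the same hour:
print(relief_priority(0.9), relief_priority(0.1))
```

Attaching such a level to every damaged building in a GIS layer is what turns a plain damage map into the priority map the study proposes.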

Keywords: damage map, GIS, priority map, USAR

Procedia PDF Downloads 404
12133 AI-Enabled Smart Contracts for Reliable Traceability in Industry 4.0

Authors: Harris Niavis, Dimitra Politaki

Abstract:

The manufacturing industry has been collecting vast amounts of data for monitoring product quality thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are more and more used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming data. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense-based autoencoders, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors.
The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs to expose the functionality to the end-users. The results of this work demonstrate that such a system can increase the quality of the end-products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality through the entire production chain and take advantage of the multitude of monitoring records in their databases.
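The point-anomaly half of the detection stage can be illustrated with a rolling z-score filter; this is a deliberately simplified stand-in for the ARIMA/LSTM/autoencoder/GAN ensemble the abstract describes, with invented sensor readings:

```python
def point_anomalies(values, window=5, z_threshold=3.0):
    """Flag values that deviate strongly from the mean of the preceding window."""
    flagged = []
    for i in range(window, len(values)):
        ref = values[i - window:i]
        mean = sum(ref) / window
        var = sum((x - mean) ** 2 for x in ref) / window
        std = var ** 0.5 or 1e-9          # avoid dividing by zero on flat data
        if abs(values[i] - mean) / std > z_threshold:
            flagged.append(i)
    return flagged

readings = [20.0, 20.1, 19.9, 20.0, 20.2, 20.1, 35.0, 20.0]  # one spiked sample
print(point_anomalies(readings))   # -> [6]: the spike is rejected before ingestion
```

In the full system, only measurements that pass this gate would reach the smart contract, so the ledger never anchors obviously corrupted sensor values.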

Keywords: blockchain, data quality, Industry 4.0, product quality

Procedia PDF Downloads 189
12132 The Descending Genicular Artery Perforator Free Flap as a Reliable Flap: Literature Review

Authors: Doran C. Kalmin

Abstract:

The descending genicular artery (DGA) perforator free flap provides an alternative option for free flap reconstruction, based on a review of the literature detailing both anatomical and clinical studies. The DGA supplies skin, muscle, tendon, and bone located around the medial aspect of the knee, which have been used in several pioneering reports to reconstruct defects located in various areas throughout the body. After the success of the medial femoral condyle flap in early studies, a small number of studies have been published detailing the use of the DGA in free flap reconstruction. Despite early success in the use of the DGA flap, acceptance within the plastic and reconstructive surgical community has been limited, primarily due to anatomical variations of the pedicle. This literature review is aimed at detailing the progression of the DGA perforator free flap and its variations as an alternative and reliable free flap for the reconstruction of composite defects, with an exploration of both anatomical and clinical studies. The review traces the progression of the DGA flap from the early work of Acland et al., who pioneered the saphenous free flap, to modern studies of the anatomy of the DGA, and details the anatomy and its variations, approaches to harvesting the flap, the advantages and disadvantages of the DGA perforator free flap, and flap outcomes. There are 15 published clinical series of DGA perforator free flaps that incorporate cutaneous, osteoperiosteal, cartilage, osteocutaneous, osteoperiosteal and muscle, osteoperiosteal and subcutaneous, and tendocutaneous components. The commonest indication for using a DGA free flap was non-union of bone, particularly of the scaphoid, for which the medial femoral condyle could be used.
In the case series, a success rate of over 90% was established, showing that these early studies have had good success with a wide range of tissue transfers. The greatest limitation is the anatomical variation of the DGA and, therefore, the challenges associated with raising the flap. Despite the variation in anatomy, and the absence of the DGA in around 10-15% of cases, the saphenous artery can be used, as can the superior medial genicular artery if vascularized bone is required as part of the flap. Despite only a handful of anatomical and clinical studies describing the DGA perforator free flap, it ultimately provides a reliable flap that can include a variety of composite structures for reconstruction in almost any area of the body. Although it has limitations, it provides a reliable option for free flap reconstruction that can routinely be performed as a single-stage procedure.

Keywords: anatomical study, clinical study, descending genicular artery, literature review, perforator free flap reconstruction

Procedia PDF Downloads 144
12131 Best Season for Seismic Survey in Zaria Area, Nigeria: Data Quality and Implications

Authors: Ibe O. Stephen, Egwuonwu N. Gabriel

Abstract:

Variations in seismic P-wave velocity and depth resolution resulting from variations in subsurface water saturation were investigated in this study in order to determine the season of the year that yields the most reliable P-wave velocity and depth resolution of the subsurface in Zaria Area, Nigeria. A 2D seismic refraction tomography technique involving an ABEM Terraloc MK6 seismograph was used to collect data across a borehole with a standard log, with the centre of the spread located at the borehole site. Using the same parameters, this procedure was repeated along the same spread at least once a month, for at least eight months a year, over four years. The timing of each survey depended on when there was a significant variation in rainfall. The seismic data collected were tomographically inverted. The results suggested that the average P-wave velocities of the subsurface in the area are generally higher when the ground is wet than when it is dry. The results also suggested that an overburden about 9.0 m thick, a weathered basement about 14.0 m thick, and a fractured basement at a depth of about 23.0 m best fitted the borehole log. This best fit was consistently obtained between March and May, when the average total rainfall in the area was about 44.8 mm. The results also showed that the velocity ranges in both dry and wet formations fall within the standard ranges reported in the literature. In terms of velocity, this study did not clearly distinguish the quality of the seismic data obtained when the subsurface was dry from that of the data collected when it was wet. It was concluded that, for more detailed and reliable seismic studies in Zaria Area and environs with similar climatic conditions, surveys are best conducted between March and May. The most reliable seismic data for depth resolution are most likely obtainable in the area during this period.

Keywords: best season, variations in depth resolution, variations in P-wave velocity, variations in subsurface water saturation, Zaria area

Procedia PDF Downloads 288
12130 Intelligent Process Data Mining for Monitoring of Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is necessary for operating industrial processes safely and efficiently while maintaining good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe, fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulation process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work compares linear and nonlinear techniques for the monitoring task; the nonlinear technique produced more reliable monitoring results and outperformed the linear methods. The adoption of a qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.
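The abstract does not name the specific linear and nonlinear techniques used. As a minimal sketch of the linear case only, the following assumes a PCA-style monitoring model in which a fault is flagged when a sample's squared prediction error (the Q statistic) exceeds a control limit estimated from normal operating data; all data, dimensions, and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "normal operation" data: 2 latent factors driving 5 sensors.
t = rng.normal(size=(200, 2))
W = rng.normal(size=(2, 5))
X = t @ W + 0.05 * rng.normal(size=(200, 5))

# Fit a linear PCA model on mean-centered data via SVD.
mu = X.mean(axis=0)
Xc = X - mu
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T  # loadings of the 2 retained components

def spe(x):
    """Squared prediction error (Q statistic) of one sample."""
    xc = x - mu
    residual = xc - P @ (P.T @ xc)  # part the PCA model cannot explain
    return float(residual @ residual)

# Control limit: here simply the 99th percentile of training-sample SPEs.
limit = np.percentile([spe(x) for x in X], 99)

normal_sample = t[0] @ W                                      # consistent with the model
faulty_sample = normal_sample + np.array([0, 0, 3.0, 0, 0])   # sensor bias fault

print(spe(normal_sample) <= limit, spe(faulty_sample) > limit)
```

A nonlinear variant would replace the linear projection with, e.g., a kernel-based mapping, which is what the abstract reports as giving the more reliable results.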

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 640
12129 A New Reliability-Based Channel Allocation Model in Mobile Networks

Authors: Anujendra, Parag Kumar Guha Thakurta

Abstract:

Data transmission between mobile hosts and base stations (BSs) in mobile networks is often vulnerable to failure. Thus, efficient link connectivity, in terms of the services of both base stations and communication channels, is required in wireless mobile networks to achieve highly reliable data transmission. In addition, the number of blocked hosts increases when there are insufficient channels during heavy load in the network. Under such a scenario, channels must be allocated so as to offer reliable communication at any given time. Therefore, a reliability-based channel allocation model with acceptable system performance is proposed and formulated as a multi-objective optimization (MOO) problem in this paper. Two conflicting parameters, the resource reuse factor (RRF) and the number of blocked calls, are optimized under a reliability constraint. The MOO problem is solved with NSGA-II (the Non-dominated Sorting Genetic Algorithm II). The effectiveness of the proposed model is shown with a set of experimental results.
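NSGA-II itself adds non-dominated sorting and crowding-distance selection on top of a genetic algorithm; the core Pareto-dominance step for the two objectives named in the abstract can be sketched as follows, with hypothetical (RRF, blocked-calls) scores invented for illustration (RRF is maximized, blocked calls are minimized).

```python
def dominates(a, b):
    """True if allocation a is at least as good as b on both objectives
    (RRF higher or equal, blocked calls lower or equal) and strictly
    better on at least one."""
    rrf_a, blocked_a = a
    rrf_b, blocked_b = b
    return (rrf_a >= rrf_b and blocked_a <= blocked_b
            and (rrf_a > rrf_b or blocked_a < blocked_b))

def pareto_front(candidates):
    """Non-dominated set: the first front NSGA-II would rank."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

# Hypothetical (RRF, blocked-calls) scores for five channel allocations.
candidates = [(0.8, 12), (0.7, 9), (0.9, 15), (0.6, 14), (0.75, 9)]
print(pareto_front(candidates))  # → [(0.8, 12), (0.9, 15), (0.75, 9)]
```

The trade-off is visible in the result: no front member beats another on both objectives at once, which is exactly why a Pareto set rather than a single optimum is reported.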

Keywords: base station, channel, GA, pareto-optimal, reliability

Procedia PDF Downloads 408
12128 The Role of Online Videos in Undergraduate Casual-Leisure Information Behaviors

Authors: Nei-Ching Yeh

Abstract:

This study describes undergraduate casual-leisure information behaviors relevant to online videos. Diaries and in-depth interviews were used to collect data. Twenty-four undergraduates participated in this study (9 men, 15 women; all were aged 18–22 years). This study presents a model of casual-leisure information behaviors and contributes new insights into user experience in casual-leisure settings, such as online video programs, with implications for other information domains.

Keywords: casual-leisure information behaviors, information behavior, online videos, role

Procedia PDF Downloads 309
12127 Development and Psychometric Validation of the Hospitalised Older Adults Dignity Scale for Measuring Dignity during Acute Hospital Admissions

Authors: Abdul-Ganiyu Fuseini, Bernice Redley, Helen Rawson, Lenore Lay, Debra Kerr

Abstract:

Aim: The study aimed to develop and validate a culturally appropriate patient-reported outcome measure for measuring dignity for older adults during acute hospital admissions. Design: A three-phased mixed-method sequential exploratory design was used. Methods: Concept elicitation and generation of items for the scale was informed by older adults’ perspectives about dignity during acute hospitalization and a literature review. Content validity evaluation and pre-testing were undertaken using standard instrument development techniques. A cross-sectional survey design was conducted involving 270 hospitalized older adults for evaluation of construct and convergent validity, internal consistency reliability, and test–retest reliability of the scale. Analysis was performed using Statistical Package for the Social Sciences, version 25. Reporting of the study was guided by the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) checklist. Results: We established the 15-item Hospitalized Older Adults’ Dignity Scale that has a 5-factor structure: Shared Decision-Making (3 items); Healthcare Professional-Patient Communication (3 items); Patient Autonomy (4 items); Patient Privacy (2 items); and Respectful Care (3 items). Excellent content validity, adequate construct and convergent validity, acceptable internal consistency reliability, and good test-retest reliability were demonstrated. Conclusion: We established the Hospitalized Older Adults Dignity Scale as a valid and reliable scale to measure dignity for older adults during acute hospital admissions. Future studies using confirmatory factor analysis are needed to corroborate the dimensionality of the factor structure and external validity of the scale. Routine use of the scale may provide information that informs the development of strategies to improve dignity-related care in the future. 
Impact: The development and validation of the Hospitalized Older Adults Dignity Scale will provide healthcare professionals with a feasible and reliable scale for measuring older adults’ dignity during acute hospitalization. Routine use of the scale may enable the capturing and incorporation of older patients’ perspectives about their healthcare experience and provide information that informs the development of strategies to improve dignity-related care in the future.
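Internal consistency reliability of a scale like this is commonly summarized with Cronbach's alpha. The abstract does not give the computation used, but the standard formula can be sketched as follows, with hypothetical 5-point Likert responses invented for illustration.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses of 6 respondents to the 3 items of one
# subscale (e.g. Shared Decision-Making); values are illustrative only.
responses = [[4, 4, 5],
             [3, 3, 3],
             [5, 4, 5],
             [2, 3, 2],
             [4, 5, 4],
             [3, 2, 3]]
alpha = cronbach_alpha(responses)
print(round(alpha, 3))
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, which is the kind of threshold the reported "acceptable internal consistency reliability" refers to.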

Keywords: dignity, older adults, hospitalisation, scale, patients, dignified care, acute care

Procedia PDF Downloads 90
12126 Interoperable Design Coordination Method for Sharing Communication Information Using Building Information Model Collaboration Format

Authors: Jin Gang Lee, Hyun-Soo Lee, Moonseo Park

Abstract:

The utilization of BIM and IFC allows project participants across different disciplines to collaborate by consistently sharing interoperable product information represented in a model. Comments or markups generated during the coordination process can be categorized as communication information, which tends to be shared in a less standardized manner. Such information can be difficult to manage and reuse compared with the product information in a model. The present study proposes an interoperable coordination method using the BIM Collaboration Format (BCF) for managing and sharing communication information during the BIM-based coordination process. A management function for coordination in a BIM collaboration system is developed to assess its ability to share communication information in BIM collaboration projects. This approach systematically links communication information generated during coordination to the building model and serves as a storage system for retrieving knowledge created during BIM collaboration projects.
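The abstract does not show what BCF communication information looks like on the wire; the fragment below is an illustrative sketch of a topic in a `markup.bcf` file in the style of BCF 2.x, with all GUIDs, names, dates, and comment text invented.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Markup>
  <!-- One coordination issue (topic) linked to the shared building model -->
  <Topic Guid="6e9f4a2c-0000-0000-0000-000000000001"
         TopicType="Clash" TopicStatus="Open">
    <Title>Duct clashes with beam on level 3</Title>
  </Topic>
  <!-- Communication information: a comment exchanged during coordination -->
  <Comment Guid="6e9f4a2c-0000-0000-0000-000000000002">
    <Date>2015-04-21T10:15:00+09:00</Date>
    <Author>coordinator@example.com</Author>
    <Comment>Please reroute the duct below the beam.</Comment>
    <Viewpoint Guid="6e9f4a2c-0000-0000-0000-000000000003"/>
  </Comment>
</Markup>
```

Because topics and comments carry GUIDs, they can be linked back to IFC model elements, which is the mechanism the proposed coordination method builds on.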

Keywords: design coordination, building information model, BIM collaboration format, industry foundation classes

Procedia PDF Downloads 432
12125 A Comprehensive Overview of Solar and Vertical Axis Wind Turbine Integration Micro-Grid

Authors: Adnan Kedir Jarso, Mesfin Megra Rorisa, Haftom Gebreslassie Gebregwergis, Frie Ayalew Yimam, Seada Hussen Adem

Abstract:

A microgrid is a small-scale power grid that can operate independently or in conjunction with the main power grid, and it is a promising solution for providing reliable and sustainable energy to remote areas. The integration of solar generation and vertical axis wind turbines (VAWTs) in a microgrid can provide a stable and efficient source of renewable energy. This paper provides a comprehensive overview of such integration: it discusses the design, operation, and control of a microgrid that combines solar and VAWTs; examines the performance of the microgrid in terms of efficiency, reliability, and cost-effectiveness; and highlights the advantages and disadvantages of using solar and VAWTs in a microgrid. The paper concludes that this integration is a promising solution for electrifying remote areas, and it recommends further research to optimize the design and operation of such microgrids as well as the development of policies and regulations that promote their use.

Keywords: hybrid generation, intermittent power, optimization, photovoltaic, vertical axis wind turbine

Procedia PDF Downloads 96
12124 The Effect of Information Technologies on Business Performance: An Application on Small Hotels

Authors: Abdullah Karaman, Kursad Sayin

Abstract:

This research identifies which information technologies are used in small hotel businesses and examines the managers' perceptions of the relationship between information technology and performance. A questionnaire was prepared; the managers of small-scale hotels were interviewed face to face and completed the questionnaire, and the answers were then evaluated. The results showed that, although the managers accept that information technologies are important for performance, they do not pay much attention to their use in practice.

Keywords: information technologies, managers, performance, small hotels

Procedia PDF Downloads 489
12123 Improving Student Performance Prediction Using a Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Educational data mining techniques apply large volumes of student data to discover hidden information about students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data containing noise, outliers, and irrelevant information may lead to incorrect conclusions. This paper aims to develop a reliable student performance prediction model for higher education institutions by identifying the features of student data that can improve prediction results, comparing ensemble learning techniques after preprocessing the data, and optimizing the hyperparameters of the most appropriate one. Data were gathered from two systems, a student information system and an e-learning system, for undergraduate students in the College of Computer Science of a Saudi Arabian state university; the cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learners are fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
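Hard majority voting itself is simple to sketch: each base classifier casts one vote per student, and the most frequent label wins. The base-learner predictions below are invented for illustration; the paper's actual base learners are trained models, not fixed lists.

```python
from collections import Counter

def majority_vote(*prediction_lists):
    """Combine per-sample labels from several base classifiers by
    hard majority voting (ties broken by first-seen label)."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*prediction_lists)]

# Hypothetical "at risk" (1) / "not at risk" (0) predictions from three
# base learners (e.g. Decision Tree, SVM, ANN) for five students.
tree_preds = [1, 0, 1, 1, 0]
svm_preds  = [1, 0, 0, 1, 0]
ann_preds  = [0, 0, 1, 1, 1]

print(majority_vote(tree_preds, svm_preds, ann_preds))  # → [1, 0, 1, 1, 0]
```

The ensemble can only outperform its members when their errors are not perfectly correlated, which is why the paper combines heterogeneous learners and two feature sources.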

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 107