Search results for: performance measurement framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18402

18162 Digital Encoder Based Power Frequency Deviation Measurement

Authors: Syed Javed Arif, Mohd Ayyub Khan, Saleem Anwar Khan

Abstract:

In this paper, a simple method is presented for measurement of power frequency deviations. A phase locked loop (PLL) is used to multiply the signal under test by a factor of 100. The number of pulses in this pulse train signal is counted over a stable known period, using decade driving assemblies (DDAs) and flip-flops. These signals are combined using logic gates and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded. These pulses are equally suitable for both control applications and display units. The experimental circuit developed gives a resolution of 1 Hz within the measurement period of 20 ms. The proposed circuit is also simulated in Verilog Hardware Description Language (Verilog HDL) and implemented using Field Programmable Gate Arrays (FPGAs). A mixed-signal oscilloscope (MSO) is used to observe the results of the FPGA implementation. These results are compared with those of the discrete-component implementation of the proposed circuit. The proposed system is useful for frequency deviation measurement and control in power systems.
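The counting arithmetic behind this measurement can be sketched as follows; the function name, the defaults (×100 multiplier, 20 ms gate, 50 Hz nominal), and the overall structure are illustrative assumptions, not values or code taken from the paper's hardware:

```python
def frequency_deviation(pulse_count, multiplier=100, gate_time_s=0.020,
                        nominal_hz=50.0):
    """Estimate power-line frequency deviation from a pulse count.

    The PLL multiplies the input frequency by `multiplier`, so counting
    the multiplied pulse train over a known gate time recovers the input
    frequency: f = count / (multiplier * gate_time). All defaults here
    are assumptions for illustration.
    """
    measured_hz = pulse_count / (multiplier * gate_time_s)
    return measured_hz - nominal_hz
```

For example, a 50 Hz input multiplied by 100 produces 5000 pulses per second, i.e. 100 counts in a 20 ms gate, so 100 counts corresponds to zero deviation and each extra count to roughly 0.5 Hz.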

Keywords: frequency measurement, digital control, phase locked loop, encoder, Verilog HDL

Procedia PDF Downloads 158
18161 Labour Productivity Measurement and Control Standards for Hotels

Authors: Kristine Joy Simpao

Abstract:

Improving labour productivity is one of the most enthralling and challenging aspects of managing hotel and restaurant businesses. The demand to secure sustained productivity has become an increasingly pivotal responsibility of managers seeking to survive and sustain the business. Besides making the business profitable, they are under pressure to make every resource productive and effective towards achieving company goals while maximizing the value of the organization. This paper examines what productivity means to the services industry, in particular to the hotel industry. This is underpinned by an investigation of the extent to which respondent hotels practice the labour productivity aspects of materials management, human resource management and leadership management, computing labour productivity ratios using simple hotel productivity ratios in order to find suitable measurement and control standards for hotels, with SBMA, Olongapo City as the locale of the study. The findings show that hotel labour productivity ratings are imperfect, with some practices falling far below standard, particularly in strategic and operational decisions for improving the performance and productivity of human resources. They further show no significant difference in ratings among respondent types in any area, indicating a shared perception of the weak implementation of some indicators of labour productivity practice. Furthermore, the computed labour productivity efficiency ratios show that the number of employees and labour productivity practices are inversely proportional. This study provides potential measurement and control standards for the enhancement of hotel labour productivity. These standards should also contain labour productivity targets customized for standard hotels in the Subic Bay Freeport Zone to assist hotel owners in increasing labour productivity while meeting company goals and objectives effectively.
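The simple productivity ratios referred to above reduce to output per unit of labour input; a minimal sketch, where the choice of output measure (e.g. revenue) and labour input (e.g. hours worked) is an assumption for illustration rather than a standard taken from the study:

```python
def labour_productivity(output_value, labour_input):
    """Simple labour productivity ratio: output per unit of labour.

    `output_value` might be revenue or rooms sold; `labour_input` might
    be hours worked or headcount. The units are illustrative assumptions.
    """
    if labour_input <= 0:
        raise ValueError("labour input must be positive")
    return output_value / labour_input
```

For example, a hotel generating 100,000 of revenue with 2,000 labour hours has a ratio of 50 per hour; tracking this ratio over time is one way such control standards can be operationalized.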

Keywords: labour productivity, hotel, measurement and control, standards, efficiency ratios, practices

Procedia PDF Downloads 290
18160 The Assessment of Natural Ventilation Performance for Thermal Comfort in Educational Space: A Case Study of Design Studio in the Arab Academy for Science and Technology, Alexandria

Authors: Alaa Sarhan, Rania Abd El Gelil, Hana Awad

Abstract:

Over the last decades, the impact of thermal comfort on the working performance of users and occupants of indoor spaces has been a concern. Research papers have concluded that natural ventilation quality directly impacts levels of thermal comfort. Natural ventilation must therefore be taken into account during the design process in order to improve the inhabitants' efficiency and productivity. One example of daily long-term occupancy spaces is educational facilities: many individuals spend long periods there receiving a considerable amount of knowledge, and it takes additional time to apply this knowledge. Thus, this research is concerned with users' level of thermal comfort in the design studios of educational facilities. The natural ventilation quality in spaces is affected by a number of parameters, including orientation, opening design, and many other factors. This research aims to investigate the conscious manipulation of the physical parameters of spaces and its impact on natural ventilation performance, which subsequently affects the thermal comfort of users. The current research uses inductive and deductive methods to define natural ventilation design considerations, which are applied in a field study of a studio in the university building in Alexandria (AAST) to evaluate natural ventilation performance by analyzing and comparing the current case against the developed framework and conducting a computational fluid dynamics (CFD) simulation. Results show that natural ventilation performance satisfies only 50% of the natural ventilation design framework; these results are supported by the CFD simulation.

Keywords: educational buildings, natural ventilation, Mediterranean climate, thermal comfort

Procedia PDF Downloads 183
18159 Distributed Processing for Content-Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task. Therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework. Hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to the key-frames. The OCR output and detected slide text line types are adopted for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
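The keyword-extraction stage maps naturally onto Hadoop's map/reduce model; a minimal, Hadoop-free sketch of that shape in plain Python, where the function names, the tiny stopword list, and the tokenization are all illustrative assumptions rather than the authors' pipeline:

```python
from collections import Counter

STOPWORDS = frozenset({"the", "a", "an", "of", "and"})  # assumed, minimal

def map_keywords(segment_id, ocr_text):
    """Map step: emit (keyword, 1) pairs from one key-frame's OCR text."""
    words = (w.strip(".,:;").lower() for w in ocr_text.split())
    return [(w, 1) for w in words if w and w not in STOPWORDS]

def reduce_keywords(mapped_pairs):
    """Reduce step: sum the counts per keyword across all segments."""
    counts = Counter()
    for word, n in mapped_pairs:
        counts[word] += n
    return counts
```

On a real cluster the map function would run per key-frame across nodes and the shuffle/reduce would aggregate segment- and video-level keyword counts.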

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 493
18158 Design and Comparative Analysis of Grid-Connected BIPV System with Monocrystalline Silicon and Polycrystalline Silicon in Kandahar Climate

Authors: Ahmad Shah Irshad, Naqibullah Kargar, Wais Samadi

Abstract:

Building-integrated photovoltaic (BIPV) systems are a new and modern technique for solar energy production in Kandahar. Due to its location, Kandahar has abundant sources of solar energy. People use both monocrystalline and polycrystalline silicon solar PV modules for grid-connected solar PV systems, and they don’t know which technology performs better for the BIPV system. This paper analyses the parameters described by IEC 61724, “Photovoltaic System Performance Monitoring Guidelines for Measurement, Data Exchange and Analysis,” to evaluate which technology shows better performance for the BIPV system. The monocrystalline silicon BIPV system has a 3.1% higher array yield than the polycrystalline silicon BIPV system. The final yield is 0.2% higher for monocrystalline silicon than for polycrystalline silicon. Monocrystalline silicon has a 0.2% greater yearly yield factor and a 4.5% greater capacity factor than polycrystalline silicon. Monocrystalline silicon shows 0.3% better performance than polycrystalline silicon. With a 1.7% reduction in collection losses and a 0.4% addition in useful energy produced, the monocrystalline silicon solar PV system shows better performance than the polycrystalline silicon solar PV system, although system losses are the same for both technologies. The monocrystalline silicon BIPV system injects 0.2% more energy into the grid than the polycrystalline silicon BIPV system.
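The IEC 61724 parameters compared here (final yield, reference yield, performance ratio, capacity factor) follow standard definitions; a minimal sketch of those formulas, with the function name and the example numbers being illustrative assumptions rather than data from the study:

```python
def iec61724_metrics(e_ac_kwh, irradiation_kwh_m2, p_rated_kw,
                     g_stc_kw_m2=1.0, hours=8760):
    """Basic IEC 61724 performance metrics.

    final yield       Yf = E_AC / P_rated    [kWh/kWp]
    reference yield   Yr = H / G_STC         [h]
    performance ratio PR = Yf / Yr
    capacity factor   CF = E_AC / (P_rated * hours)
    """
    yf = e_ac_kwh / p_rated_kw
    yr = irradiation_kwh_m2 / g_stc_kw_m2
    return {"final_yield": yf,
            "reference_yield": yr,
            "performance_ratio": yf / yr,
            "capacity_factor": e_ac_kwh / (p_rated_kw * hours)}
```

For instance, a 1 kWp array delivering 1600 kWh/year under 2000 kWh/m² of in-plane irradiation has a performance ratio of 0.8.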

Keywords: photovoltaic technologies, performance analysis, solar energy, solar irradiance, performance ratio

Procedia PDF Downloads 331
18157 Effects of Foam Rolling with Different Application Volumes on the Isometric Force of the Calf Muscle with Consideration of Muscle Activity

Authors: T. Poppendieker, H. Maurer, C. Segieth

Abstract:

Over the past ten years, foam rolling has become a new trend in the fitness and health market. It is also a frequently used technique for self-massage. However, the scope of effects of foam rolling has only recently started to be researched and understood. The focus of this study is to examine the effects of prolonged foam rolling on muscle performance. Isometric muscle force was used as a parameter to determine the impact of the myofascial roller in two different application volumes. Besides maximal muscle force, data were also collected on muscle activation during all tests. Twenty-four (17 females, 7 males) healthy students with an average age of 23.4 ± 2.8 years were recruited. The study followed a cross-over pre-/post design in which the order of conditions was counterbalanced. The subjects performed a one-minute and a three-minute foam rolling application set on two separate days. Isometric maximal muscle force of the dominant calf was tested before and after the self-myofascial release application. The statistical software SPSS 22 was used to analyze the maximal isometric force of the calf muscle by a 2 x 2 (time of measurement x intervention) analysis of variance with repeated measures. The statistical significance level was set at p ≤ 0.05. Significant p-values were found neither for the main effect of time of measurement (F(1,23) = .93, p = .36, f = .20) nor for the interaction of time of measurement x intervention (F(1,23) = 1.99, p = .17, f = 0.29). However, the effect sizes indicate a mean interaction effect with a tendency towards greater pre-post improvements under the three-minute foam rolling condition. Changes in maximal force did not correlate with changes in EMG activity (r = .02, p = .95 in the short and r = -.11, p = .65 in the long rolling condition).
Results support findings of previous studies and suggest a positive potential for use of the foam roll as a means for keeping muscle force at least at the same performance level while leading to an increase in flexibility.
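The effect sizes quoted in the abstract (f = .20 and f = .29) can be recovered from the reported F statistics via partial eta squared; this is the standard conversion, shown as a sketch, not code from the study:

```python
import math

def cohens_f(f_stat, df_effect, df_error):
    """Convert an ANOVA F statistic to Cohen's f via partial eta squared:
    eta^2 = F*df1 / (F*df1 + df2);  f = sqrt(eta^2 / (1 - eta^2))."""
    eta_sq = (f_stat * df_effect) / (f_stat * df_effect + df_error)
    return math.sqrt(eta_sq / (1 - eta_sq))
```

Plugging in the reported values, F(1,23) = 1.99 gives f ≈ 0.29 and F(1,23) = 0.93 gives f ≈ 0.20, matching the abstract.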

Keywords: application volume differences, foam rolling, isometric maximal force, self-myofascial release

Procedia PDF Downloads 259
18156 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation

Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen

Abstract:

Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions heavily leaning towards the imbalanced class on the binary response variable or over-fitting issues. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose the transformation of the binary responses into a continuous format limited within [0,1]. This is called the possibilistic approach within fuzzy logistic regression. Following this approach is more aligned with straightforward regression since a logit-link function is not utilized, and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows for the use of the logit-link function; hence, a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp input, output, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and presence of separation in the data, while the considered machine learning methods are significantly impacted.
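A minimal sketch of the general idea described above (fuzzified binary targets fed through a logit-link model, then classified at a threshold) follows; the uniform jitter used for fuzzification, the plain gradient-descent fit, and all names are illustrative assumptions, not the authors' actual Monte Carlo construction:

```python
import math
import random

def fuzzify_labels(y, spread=0.1, rng=None):
    """Jitter crisp 0/1 labels into fuzzy memberships inside [0, 1].
    The uniform noise model is an assumption for illustration."""
    rng = rng or random.Random(0)
    out = []
    for yi in y:
        noise = rng.uniform(0.0, spread)
        out.append(1.0 - noise if yi == 1 else noise)
    return out

def fit_logistic_1d(x, y_fuzzy, lr=0.5, iters=3000):
    """Plain gradient-descent logistic regression on fuzzy targets,
    keeping the logit link rather than a possibilistic formulation."""
    w = b = 0.0
    n = len(x)
    for _ in range(iters):
        gw = gb = 0.0
        for xi, yi in zip(x, y_fuzzy):
            p = 1.0 / (1.0 + math.exp(-(w * xi + b)))
            gw += (p - yi) * xi
            gb += (p - yi)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def classify(x, w, b, threshold=0.5):
    """Turn fuzzy probabilities back into crisp classes at a threshold."""
    return [1 if 1.0 / (1.0 + math.exp(-(w * xi + b))) >= threshold else 0
            for xi in x]
```

Repeating the fuzzification many times with fresh random draws and aggregating the fits would give the Monte Carlo flavour of the framework.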

Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning

Procedia PDF Downloads 36
18155 Business Domain Modelling Using an Integrated Framework

Authors: Mohammed Hasan Salahat, Stave Wade

Abstract:

This paper presents an application of a “Systematic Soft Domain Driven Design Framework” as a soft systems approach to domain-driven design of information systems development. The framework combines techniques from Soft Systems Methodology (SSM), the Unified Modeling Language (UML), and an implementation pattern known as ‘Naked Objects’. This framework has been used in action research projects that have involved the investigation and modeling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works; a real case study, ‘Information Retrieval System for Academic Research’, is used in this paper to show further practice and evaluation of the framework in a different business domain. We argue that there are advantages to combining and using techniques from different methodologies in this way for business domain modeling. The framework is overviewed and justified as a multi-methodology using Mingers' multi-methodology ideas.

Keywords: SSM, UML, domain-driven design, soft domain-driven design, naked objects, soft language, information retrieval, multimethodology

Procedia PDF Downloads 530
18154 Alternate Approaches to Quality Measurement: An Exploratory Study in Differentiation of “Quality” Characteristics in Services and Supports

Authors: Caitlin Bailey, Marian Frattarola Saulino, Beth Steinberg

Abstract:

Today, virtually all programs offered to people with intellectual and developmental disabilities tout themselves as person-centered, community-based and inclusive, yet there is a vast range in the type and quality of services that use these similar descriptors. The issue is exacerbated by the field's measurement practices around quality, inclusion, independent living, choice and person-centered outcomes. For instance, community inclusion for people with disabilities is often measured by the number of times a person steps into his or her community. These measurement approaches set the standards for quality too low, so that an agency supporting group home residents to go bowling every week can report the same outcomes as an agency that supports one person to join a book club that includes people based on their literary interests rather than disability labels. Ultimately, this lack of delineation in measurement contributes to the confusion between face-value “quality” and truly high-quality services and supports for many people with disabilities and their families. This exploratory study adopts alternative approaches to quality measurement, including co-production methods and a systems theoretical framework, in order to identify the factors that 1) lead to high-quality supports and 2) differentiate high-quality services. Project researchers have partnered with community practitioners who are all committed to providing quality services and supports but vary in the degree to which they are actually able to provide them. The study includes two parts: first, an online survey distributed to more than 500 agencies that have demonstrated a commitment to providing high-quality services; and second, four in-depth case studies with agencies in the United States and Israel providing a variety of supports to children and adults with disabilities. Results from both the survey and the in-depth case studies were thematically analyzed and coded.
Results show that there are specific factors that differentiate service quality; however, meaningful quality measurement practices also require that researchers explore the contextual factors that contribute to quality. These include not only direct services and interactions but also characteristics of service users and their environments, as well as of the organizations providing services, such as management and funding structures, culture and leadership. Findings from this study challenge researchers, policy makers and practitioners to examine existing quality service standards and measurements and to adopt alternate methodologies and solutions to differentiate and scale up evidence-based quality practices so that all people with disabilities have access to services that support them to live, work, and enjoy where and with whom they choose.

Keywords: co-production, inclusion, independent living, quality measurement, quality supports

Procedia PDF Downloads 366
18153 Decision Quality as an Antecedent to Export Performance: Empirical Evidence under a Contingency Theory Lens

Authors: Evagelos Korobilis-Magas, Adekunle Oke

Abstract:

The constantly increasing tendency towards a global economy, and the subsequent increase in exporting as a result, has inevitably led to a growing interest in the topic of export success as well. Numerous studies, particularly in the past three decades, have examined a plethora of determinants of export performance. However, to the authors' best knowledge, no study to date has considered decision quality as a potential antecedent to export success by attempting to test the relationship between decision quality and export performance. This is a surprising deficiency given that the export marketing literature long ago suggested that quality decisions are the crucial intervening variable between sound decision-making and export performance. This study integrates the different definitions of decision quality proposed in the literature and the key themes incorporated therein and adapts them to an export context. Apart from laying the conceptual foundations for the delineation of this elusive but very important construct, this study is the first to test the relationship between decision quality and export performance. Based on survey data from a sample of 189 British export decision-makers and within a contingency theory framework, the results reveal that there is a direct, positive link between decision quality and export performance. This finding opens significant future research avenues and has very important implications for both theory and practice.

Keywords: export performance, decision quality, mixed methods, contingency theory

Procedia PDF Downloads 64
18152 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland

Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli

Abstract:

This work assesses the performance of an analytical model framework to generate daily flow duration curves, FDCs, based on the climatic characteristics of the catchments and on their streamflow recession coefficients. According to the analytical model framework, precipitation is considered to be a stochastic process, modeled as a marked Poisson process, and recession is considered to be deterministic, with parameters that can be computed based on different models. The analytical model framework was tested for three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated and glacier. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. Those developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework when considering forward parameter estimation is poor in comparison with the inverse estimation for both linear and nonlinear models. For the pluvial catchment, the inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
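For context, the empirical flow duration curve that such analytical models aim to reproduce is simply the ranked flows plotted against their exceedance probabilities; a minimal sketch using the common Weibull plotting position (an illustrative convention, not necessarily the one the authors used):

```python
def flow_duration_curve(flows):
    """Empirical FDC: daily flows sorted in decreasing order, each paired
    with its exceedance probability i/(n+1) (Weibull plotting position)."""
    ordered = sorted(flows, reverse=True)
    n = len(ordered)
    return [(i / (n + 1), q) for i, q in enumerate(ordered, start=1)]
```

The analytical framework replaces this purely empirical construction with a distribution derived from the Poisson precipitation model and the (linear or nonlinear) recession parameters.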

Keywords: analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges

Procedia PDF Downloads 133
18151 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring

Authors: Seung-Lock Seo

Abstract:

This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data of good operations are relatively easy to gather, but during unusual events or faults it is generally difficult to collect process information, and it is almost impossible to analyze some of the noisy data of industrial processes. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data is helpful to eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated for discrete batch process data. It showed that the monitoring performance was improved significantly in terms of the monitoring success rate for given process faults.
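One of the simplest noise filtering pre-processing steps of the kind mentioned above is a moving average; a minimal sketch (the specific filter and window handling are illustrative assumptions, not the scheme evaluated in the paper):

```python
def moving_average(signal, window):
    """Moving-average noise filter: returns one smoothed value per
    complete window (edge samples without a full window are dropped)."""
    if window < 1 or window > len(signal):
        raise ValueError("window must fit inside the signal")
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]
```

Smoothing the raw sensor stream this way before computing monitoring statistics suppresses high-frequency measurement noise at the cost of some responsiveness to genuine faults.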

Keywords: data mining, process data, monitoring, safety, industrial processes

Procedia PDF Downloads 368
18150 Experimenting with Error Performance of Systems Employing Pulse Shaping Filters on a Software-Defined-Radio Platform

Authors: Chia-Yu Yao

Abstract:

This paper presents experimental results on testing the symbol-error-rate (SER) performance of quadrature amplitude modulation (QAM) systems employing symmetric pulse-shaping square-root (SR) filters designed by minimizing the roughness function and by minimizing the peak-to-average power ratio (PAR). The device used in the experiments is the 'bladeRF' software-defined-radio platform. PAR is a well-known measurement, whereas the roughness function is a concept for measuring the jitter-induced interference. The experimental results show that the system employing minimum-roughness pulse-shaping SR filters outperforms the system employing minimum-PAR pulse-shaping SR filters in the sense of SER performance.
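Of the two design criteria compared above, PAR has a simple definition: the ratio of the peak instantaneous power of the filter output to its average power. A minimal sketch of that measurement (the roughness function itself is not reproduced here):

```python
def peak_to_average_ratio(samples):
    """Peak-to-average power ratio (linear, not dB) of a real or
    complex baseband sample sequence."""
    powers = [abs(s) ** 2 for s in samples]
    return max(powers) / (sum(powers) / len(powers))
```

A constant-envelope sequence has PAR = 1 (0 dB), while a sequence whose energy is concentrated in occasional peaks has a high PAR; minimum-PAR filter design tries to push this ratio down for the shaped transmit signal.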

Keywords: pulse-shaping filters, FIR filters, jittering, QAM

Procedia PDF Downloads 319
18149 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain’s subconscious and conscious functions work, we must conquer the physics of Unity, which leads to duality’s algorithm. The subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like ‘time is relative,’ but do we really understand the meaning? In the brain, there are different processes and, therefore, different observers. These different processes experience time at different rates. A sensory system such as the eyes cycles measurement around every 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, which is the fastest of the processes, maintains a synchronous state and entangles the different components of the brain’s physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain’s linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components. Only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain’s synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms).
This creates a data payload of synchronous motion that preserves the original sensory observation. Basically, a frozen moment in time (Flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Now, synchronous data traveling through a separate faster synchronous process creates a theoretical time tunnel where observation time is tunneled through the synchronous process and is reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus, a linear subconscious process generating sensory perception and thought production is being executed. It is all just occurring in the time available because other observation times are slower than thalamic measurement time. For life to exist in the physical universe requires a linear measurement process, it just hides by operating at a faster time relativity. What’s interesting is time dilation is not the problem; it’s the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 102
18148 Obstacles to Innovation for SMEs: Evidence from Germany

Authors: Natalia Strobel, Jan Kratzer

Abstract:

Achieving effective innovation is a complex task, and during this process firms (especially SMEs) often face obstacles. However, research into obstacles to innovation focusing on SMEs is very scarce. In this study, we propose a theoretical framework for describing these obstacles to innovation and investigate their influence on the innovative performance of SMEs. Data were collected in 2013 through face-to-face interviews with executives of 49 technology SMEs from Germany. The semi-structured interviews were designed on the basis of scales for measuring innovativeness, financial/competitive performance and obstacles to innovation, alongside purely open questions. We find that the internal obstacles of lacking know-how, capacity overloading, and unclear roles and tasks, as well as the external obstacle of governmental bureaucracy, negatively influence the innovative performance of SMEs. However, in contrast to prior findings, this study shows that firms' cooperation ties might also negatively influence innovative performance.

Keywords: innovation, innovation process, obstacles, SME

Procedia PDF Downloads 328
18147 Conceptual Framework of Continuous Academic Lecturer Model in Islamic Higher Education

Authors: Lailial Muhtifah, Sirtul Marhamah

Abstract:

This article forwards the conceptual framework of a continuous academic lecturer model in Islamic higher education (IHE). It is intended to contribute to the broader issue of how the concept of excellence can promote adherence to standards in higher education and drive quality enhancement. The model reveals a process and steps for gradually increasing the performance and achievement of excellence of regular lecturers. Studies of this model are very significant for realizing an excellent academic culture in IHE. Several steps were identified from previous studies through a literature review and empirical findings. A qualitative study was conducted at an institute: administrators and lecturers were interviewed, and lecturer learning communities were observed, to explore the institute's culture, policies, and procedures. This study presents the Continuous Academic Lecturer Model (CALM), whose components, namely Standard, Quality, and Excellence (SQE), form the basis of the framework. The innovation excellence framework also requires leaders to support (LS) lecturers in achieving a culture of excellence, so the model is named CALM-SQE+LS. The components of performance and achievement in the CALM-SQE+LS model should be disseminated and cultivated among all lecturers for university excellence in terms of innovation. The purpose of this article is to define the concept of “CALM-SQE+LS”. Originally, there were three components in the Continuous Academic Lecturer Model, i.e., standard, quality, and excellence, plus leader support. This study is important to the community as a specific case that may inform educational leaders about mechanisms that may be leveraged to ensure the successful implementation of the policies and procedures outlined in CALM with its components (SQE+LS) in institutional culture and the professional leadership literature. The findings of this study show how the continuous academic lecturer model becomes part of a group's culture and how it benefits a university. This article blends the available criteria into several sub-components to give new insights towards empowering lecturers for innovation excellence at IHE. The proposed conceptual framework is also presented.

Keywords: continuous academic lecturer model, excellence, quality, standard

Procedia PDF Downloads 175
18146 Performance Evaluation of a Minimum Mean Square Error-Based Physical Sidelink Shared Channel Receiver under Fading Channel

Authors: Yang Fu, Jaime Rodrigo Navarro, Jose F. Monserrat, Faiza Bouchmal, Oscar Carrasco Quilis

Abstract:

Cellular Vehicle-to-Everything (C-V2X) is considered a promising solution for future autonomous driving. From Release 16 to Release 17, the Third Generation Partnership Project (3GPP) has introduced the definitions and services for 5G New Radio (NR) V2X. Experience from previous generations has shown that establishing a simulator for C-V2X communications is an essential preliminary step to achieving reliable and stable communication links. This paper proposes a complete framework for a link-level simulator based on the 3GPP specifications for the Physical Sidelink Shared Channel (PSSCH) of the 5G NR Physical Layer (PHY). In this framework, several receiver-side algorithms, i.e., a sliding window for channel estimation and Minimum Mean Square Error (MMSE)-based equalization, are developed. Finally, the performance of the developed PSSCH receiver is validated through extensive simulations under different assumptions.
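The MMSE equalization step, reduced to its per-subcarrier (one-tap) form, can be sketched as below; this is a textbook simplification shown only to illustrate the MMSE criterion, not the paper's full PSSCH receiver chain:

```python
def mmse_equalize(y, h, noise_var):
    """One-tap MMSE equalization per subcarrier / resource element:
    x_hat = conj(h) * y / (|h|^2 + noise_var).

    With noise_var = 0 this reduces to zero-forcing; a positive
    noise variance shrinks the estimate on weak subcarriers instead
    of amplifying noise.
    """
    return [hk.conjugate() * yk / (abs(hk) ** 2 + noise_var)
            for yk, hk in zip(y, h)]
```

In the simulator, `h` would come from the sliding-window channel estimator and `noise_var` from a noise-power estimate.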

Keywords: C-V2X, channel estimation, link-level simulator, sidelink, 3GPP

Procedia PDF Downloads 90
18145 Design of Open Framework Based Smart ESS Profile for PV-ESS and UPS-ESS

Authors: Young-Su Ryu, Won-Gi Jeon, Byoung-Chul Song, Jae-Hong Park, Ki-Won Kwon

Abstract:

In this paper, an open framework based smart energy storage system (ESS) profile for photovoltaic (PV)-ESS and uninterruptible power supply (UPS)-ESS is proposed and designed. An open framework based smart ESS is designed and developed to unify the differing interfaces among manufacturers. The smart ESS operates under a profile that provides the specifications of peripheral devices, such as their different interfaces, to the open framework. The profile requires systematic organization and expandability for addable peripheral devices. In particular, the smart ESS should support expansion with existing systems such as UPS and linkage with new renewable energy technologies such as PV. This paper proposes and designs an open framework based smart ESS profile for PV-ESS and UPS-ESS. The designed profile supports the existing smart ESS and also the expandability of additional peripheral devices, such as PV and UPS, on the smart ESS.

Keywords: energy storage system (ESS), open framework, profile, photovoltaic (PV), uninterruptible power supply (UPS)

Procedia PDF Downloads 440
18144 Developing a Toolkit of Undergraduate Nursing Students' Desirable Characteristics (TNDC): An Application of Item Response Theory

Authors: Parinyaporn Thanaboonpuang, Siridej Sujiva, Shotiga Pasiphul

Abstract:

Higher education reform has integrated nursing programmes into the higher education system. Learning outcomes represent one of the essential building blocks for transparency within higher education systems and qualifications. The purpose of this study is to develop a toolkit assessing undergraduate nursing students' desirable characteristics under the Thai Qualifications Framework for Higher Education and to test the psychometric properties of this instrument. The toolkit seeks to improve on the computer multimedia test. Three skills are examined: cognitive skill, responsibility and interpersonal skill, and information technology skill. The study was conducted in four phases. In Phase 1, a measurement model and a computer multimedia test were developed. In Phase 2, two rounds of focus groups were conducted to determine the content validity of the measurement model and the toolkit. In Phase 3, data were collected from a multistage random sample of 1,156 senior undergraduate nursing students to test the psychometric properties. In Phase 4, data analysis was conducted by descriptive statistics, item analysis, inter-rater reliability, exploratory factor analysis, and confirmatory factor analysis. The resulting TNDC consists of 74 items across four domains: cognitive skill, interpersonal skill, responsibility, and information technology skill. The values of Cronbach's alpha for the four domains were .781, .807, .831, and .865, respectively. The final model in confirmatory factor analysis fit the empirical data quite well. The TNDC was found to be appropriate, both theoretically and statistically. Given these results, it is recommended that the toolkit be used in future studies for nursing programmes in Thailand.
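The per-domain reliability figures reported above are Cronbach's alpha values, which can in principle be reproduced from an item-score matrix. A minimal sketch on synthetic data (the scores below are hypothetical, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) matrix; returns Cronbach's alpha."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of the total score
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
trait = rng.standard_normal(200)  # one latent trait per respondent
# four items, each the trait plus independent measurement noise
items = np.column_stack([trait + 0.5 * rng.standard_normal(200) for _ in range(4)])
alpha = cronbach_alpha(items)
```

Because the four simulated items share a common trait, alpha comes out high; uncorrelated items would drive it towards zero.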

Keywords: toolkit, nursing students' desirable characteristics, Thai qualifications framework

Procedia PDF Downloads 506
18143 The Effect of Corporate Governance to Islamic Banking Performance Using Maqasid Index Approach in Indonesia

Authors: Audia Syafa'atur Rahman, Rozali Haron

Abstract:

The practices of Islamic banking are often more attuned to profit maximization than to obtaining ethical profit. Ethical profit is obtained from interest-free earnings and produces benefits for the growth of society and the economy. Good corporate governance practices are needed to assure the sustainability of Islamic banks in achieving Maqasid Shariah, whose main purpose is boosting people's well-being. Maqasid Shariah performance measurement is used to assess the duties and responsibilities Islamic banks are expected to perform; it covers not a single dimension, such as financial measurement, but many dimensions that reflect the main purposes of Islamic banks. The implementation of good corporate governance is essential because it covers the interests of the stakeholders and facilitates effective monitoring that encourages Islamic banks to utilize resources more efficiently in order to achieve the Maqasid Shariah. This study aims to provide empirical evidence on the Maqasid performance of Islamic banks under the Maqasid performance evaluation model and to examine the influence of Shariah Supervisory Board (SSB) characteristics and board structures on Islamic banks' performance as measured by that model. Employing the simple additive weighting method, the Maqasid index for the Islamic banks in Indonesia from 2012 to 2016 ranged from just above 11% to 28%. Maqasid Shariah performance indices above 20% were obtained by Bank Muamalat Indonesia (BMI), Bank Panin Syariah, and Bank BRI Syariah, with BMI consistently achieving above 23%. Other Islamic banks, such as Bank Victoria Syariah, Bank Jabar Banten Syariah, Bank BNI Syariah, Bank Mega Syariah, BCA Syariah, and Maybank Syariah Indonesia, showed a fluctuating Maqasid performance index every year.
The impact of SSB characteristics and board structures is tested using random-effects generalized least squares. The findings indicate that SSB characteristics (SSB size, SSB cross membership, SSB education, and SSB reputation) and board structures (board size and board independence) play an essential role in improving the performance of Islamic banks. Specifically, a smaller SSB, a higher proportion of SSB cross membership, fewer SSB members holding doctorate degrees, fewer highly reputable scholars, more members on the board of directors, and fewer independent non-executive directors enhance the performance of Islamic banks.
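The simple additive weighting step behind the Maqasid index combines each bank's dimension ratios with importance weights. A minimal sketch with hypothetical banks, ratios, and weights (the study's actual Maqasid dimensions and weight values are not reproduced here):

```python
import numpy as np

# rows: two hypothetical banks; columns: three Maqasid dimension ratios
ratios = np.array([
    [0.30, 0.15, 0.22],
    [0.18, 0.25, 0.10],
])
weights = np.array([0.30, 0.41, 0.29])  # illustrative dimension weights, summing to 1

# simple additive weighting: one weighted sum (index) per bank
maqasid_index = ratios @ weights
```

Ranking the resulting indices reproduces the kind of bank comparison reported above; here the first hypothetical bank scores higher.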

Keywords: Maqasid Shariah, corporate governance, Islamic banks, Shariah supervisory board

Procedia PDF Downloads 206
18142 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm

Authors: Xiang Jianhong, Wang Cong, Wang Linyu

Abstract:

With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms show poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to a joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) permitting accurate reconstruction in this transmission mode is increased by nearly 10%, while runtime is reduced by about 30%.
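JBMOLS belongs to the greedy pursuit family of CS reconstruction algorithms. Its basic building block, Orthogonal Matching Pursuit, can be sketched on a synthetic sparse signal (a Gaussian rather than Bernoulli sensing matrix, and plain rather than block sparsity, purely for illustration):

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: greedily recover a sparse x from y = A @ x."""
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.T @ residual))))      # most correlated atom
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)     # refit on support
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(2)
n, m, k = 128, 64, 5                            # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)    # Gaussian sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.uniform(1, 2, k) * rng.choice([-1.0, 1.0], k)
y = A @ x_true                                  # compressed measurements (m << n)

x_rec = omp(A, y, k)
rel_err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
```

Block variants like JBMOLS select whole blocks of atoms per iteration, exploiting the clustered supports of FECG coefficients.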

Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal

Procedia PDF Downloads 97
18141 On a Theoretical Framework for Language Learning Apps Evaluation

Authors: Juan Manuel Real-Espinosa

Abstract:

This paper addresses the first step in evaluating language learning apps: what theoretical framework to adopt when designing the app evaluation framework. There is no single answer, since several options could be proposed. The question to be clarified, however, is to what extent the learning design of apps is based on a specific learning approach or, on the contrary, on a fusion of elements from several theoretical proposals and paradigms, such as m-learning, mobile-assisted language learning, and a number of theories of language acquisition. The present study suggests that the reality is closer to the second assumption. This implies that the theoretical framework against which the learning design of apps is evaluated must also be a hybrid, integrating evaluation criteria from the different theories involved in language learning through mobile applications.

Keywords: mobile-assisted language learning, action-oriented approach, apps evaluation, post-method pedagogy, second language acquisition

Procedia PDF Downloads 167
18140 A System Framework for Dynamic Service Deployment in Container-Based Computing Platform

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

Cloud computing and virtualization technology have brought an innovative way for people to develop and use software. However, conventional virtualization comes at the expense of performance loss for applications. Container-based virtualization is an alternative, as it potentially reduces overhead and minimizes the performance decline of the service platform. In this paper, we introduce a system framework and present an implementation of a resource broker for dynamic cloud service deployment on a container-based platform, facilitating efficient execution and improving utilization. We target a load-aware service deployment approach for a task ranking scenario. The proposed broker can collaborate with the resource management system to adaptively deploy services according to different requests. In particular, our approach composes each service immediately onto an appropriate container according to the user's requirements in order to reduce waiting time. Our evaluation shows how efficient the service deployment is and how its applicability can be expanded to support a variety of cloud services.
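The load-aware deployment idea, ranking candidate hosts by current load and capacity before placing a service, can be sketched as follows (node names, capacities, and loads are hypothetical; the paper's broker is more elaborate):

```python
def place_service(nodes, cpu_needed):
    """Pick the least-loaded node with enough free CPU; return None if no node fits.
    nodes: mapping name -> (cpu_free_cores, load_fraction)."""
    candidates = [(load, name) for name, (cpu_free, load) in nodes.items()
                  if cpu_free >= cpu_needed]
    return min(candidates)[1] if candidates else None

nodes = {
    "node-a": (2.0, 0.7),  # enough CPU, but busy
    "node-b": (4.0, 0.3),  # enough CPU and lightly loaded
    "node-c": (1.0, 0.1),  # nearly idle, but not enough CPU
}
chosen = place_service(nodes, cpu_needed=1.5)
```

A real broker would refresh the load figures continuously and fall back to queueing or scaling when no candidate fits.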

Keywords: cloud computing, container-based virtualization, resource broker, service deployment

Procedia PDF Downloads 136
18139 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement

Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao

Abstract:

Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis to select appropriate features, but such analysis treats features independently and cannot fully utilize the information in the data. Moreover, blindly reducing features loses much useful information, resulting in unreliable results. The focus of this paper is therefore a redundant-feature removal approach for visual roughness measurement. Statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to extract the texture features of machined images effectively. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension while maintaining the integrity of the original information. Finally, the relationship between the new features and roughness is established by a support vector machine (SVM). The experimental results show that the approach can effectively resolve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
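The PCA fusion step, projecting several redundant features onto a single component that preserves most of the variance, can be sketched on synthetic data (the features below are stand-ins for GLCM statistics, not real machined-image features):

```python
import numpy as np

def pca_fuse(features, n_components=1):
    """Standardize features and project onto the top principal components.
    Returns (projections, explained-variance ratios)."""
    X = (features - features.mean(axis=0)) / features.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
    order = np.argsort(eigvals)[::-1][:n_components]       # largest eigenvalues first
    return X @ eigvecs[:, order], eigvals[order] / eigvals.sum()

rng = np.random.default_rng(3)
roughness = rng.uniform(0.5, 3.0, size=100)  # hypothetical Ra values
# three redundant texture features, each a noisy copy of the roughness signal
feats = np.column_stack([roughness + 0.1 * rng.standard_normal(100) for _ in range(3)])

fused, explained = pca_fuse(feats, n_components=1)
corr = abs(np.corrcoef(fused[:, 0], roughness)[0, 1])
```

Because the three features are redundant, one component captures nearly all their variance, and the fused feature still tracks roughness closely, which is what makes the subsequent SVM regression reliable.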

Keywords: feature analysis, machine vision, PCA, surface roughness, SVM

Procedia PDF Downloads 182
18138 Automated Process Quality Monitoring and Diagnostics for Large-Scale Measurement Data

Authors: Hyun-Woo Cho

Abstract:

Continuous monitoring of industrial plants is one of the necessary tasks for ensuring high-quality final products. In terms of monitoring and diagnosis, it is critical to detect incipient abnormal events in manufacturing processes in order to improve the safety and reliability of the operations involved and to reduce related losses. In this work, a new multivariate statistical online diagnostic method is presented using a case study. To build reference models, an empirical discriminant model is constructed from various past operation runs. When a fault is detected online, an online diagnostic module is initiated. Finally, the status of the current operating conditions is compared with the reference model to make a diagnostic decision. The performance of the presented framework is evaluated using a dataset from complex industrial processes. The proposed diagnostic method is shown to outperform other techniques, especially in the incipient detection of faults.
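A common building block of such multivariate statistical monitoring is a reference model of normal operation plus a distance metric for new samples. A minimal sketch using Hotelling's T² (the data and control limit are illustrative; the paper's discriminant model is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(4)
# reference model built from normal operation of two correlated process variables
normal_runs = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=500)
mean = normal_runs.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal_runs, rowvar=False))

def hotelling_t2(sample):
    """Hotelling's T^2: Mahalanobis-type distance from the reference model."""
    d = sample - mean
    return float(d @ cov_inv @ d)

limit = 13.8  # roughly the 99.9% chi-square limit for 2 variables
t2_normal = hotelling_t2(np.array([0.5, 0.6]))   # consistent with the correlation
t2_fault = hotelling_t2(np.array([3.0, -3.0]))   # breaks the correlation structure
```

Note that the faulty sample is flagged even though each variable alone is only a few standard deviations out, because the sample violates the learned correlation between the variables.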

Keywords: data mining, empirical model, on-line diagnostics, process fault, process monitoring

Procedia PDF Downloads 366
18137 Evaluation of the Architect-Friendliness of LCA-Based Environmental Impact Assessment Tools

Authors: Elke Meex, Elke Knapen, Griet Verbeeck

Abstract:

The focus of sustainable building is gradually shifting from energy efficiency towards the more global environmental impact of building design during all life-cycle stages. In this context, many tools have been developed that use an LCA approach to assess the environmental impact at the whole-building level. Since the building design strongly influences the final environmental performance and the architect plays a key role in the design process, it is important that these tools are adapted to the architect's work method and support decision making from the early design phases on. Therefore, a comparative evaluation of the degree of architect-friendliness of several building-level LCA tools is made, based on an evaluation framework specifically developed from the architect's viewpoint. To allow comparison of the results, a reference building was designed, documented for different design phases, and entered into all software tools. The evaluation according to the framework shows that the existing tools are not very architect-friendly. Suggestions for improvement are formulated.

Keywords: architect-friendliness, design supportive value, evaluation framework, tool comparison

Procedia PDF Downloads 508
18136 Framework for Developing Change Team to Maximize Change Initiative Success

Authors: Mohammad Z. Ansari, Lisa Brodie, Marilyn Goh

Abstract:

Change facilitators are individuals who apply change philosophy to make a positive change to organizations. Change facilitators appear in various change models (Lewin, Lippitt, etc.), where they are considered internal or external consultants. While most scholarly papers treat change facilitation as a consensus-driven attempt to improve the organization, there is a lack of a framework that develops both the organization and the change facilitator, creating a self-sustaining change environment. This research paper introduces a framework for change Leaders, Planners, and Executers (LPE), aimed at various organizational levels (process, departmental, and organizational). The LPE framework is derived by exploring the interrelated characteristics of facilitators and the organization through qualitative research, drawing on change management techniques and facilitators' behavioral aspects from existing change management models and the organizational behavior literature. The introduced framework assists in identifying the most appropriate change team to successfully deliver a change initiative within any organization.

Keywords: change initiative, LPE framework, change facilitator(s), sustainable change

Procedia PDF Downloads 160
18135 Experimental Investigation of Cup Anemometer under Static and Dynamic Wind Direction Changes: Evaluation of Directional Sensitivity

Authors: Vaibhav Rana, Nicholas Balaresque

Abstract:

The 3-cup anemometer is the most commonly used instrument for wind speed measurement and, consequently, for wind resource assessment. Though the cup anemometer measures accurately under quasi-static conditions, field measurements carry uncertainty: sensitivity to the angle of attack with respect to the horizontal plane, dynamic response, and non-linear calibration behavior due to friction. The presented work aimed to identify the sensitivity of the anemometer to non-horizontal flow. The cup anemometer was investigated in a low-speed wind tunnel, first under static flow direction changes and then under dynamic direction changes, at different angles of attack and at the same reference wind tunnel speeds. The anemometer's response under both conditions was evaluated and compared. The results showed that the anemometer is far more sensitive under dynamic wind direction changes than under static conditions.

Keywords: wind energy, cup anemometer, directional sensitivity, dynamic behavior, wind tunnel

Procedia PDF Downloads 115
18134 A Stochastic Analytic Hierarchy Process Based Weighting Model for Sustainability Measurement in an Organization

Authors: Faramarz Khosravi, Gokhan Izbirak

Abstract:

A weighted, statistically stochastic Analytic Hierarchy Process (AHP) model is proposed for modeling the potential barriers to and enablers of sustainability and for measuring and assessing the sustainability level. For context-dependent potential barriers and enablers, the proposed model takes as its basis the properties of the variables describing the sustainability functions and was developed into a realistic analytical model of the sustainable behavior of an organization, thus serving as a means of measuring the organization's sustainability. The main focus of this paper is the application of the AHP tool in a statistically based model for measuring sustainability; hence, a strongly weighted stochastic AHP-based procedure was achieved. A case study of a widely reported major Canadian electric utility was adopted to demonstrate the applicability of the developed model, and its results were compared with those of an equal-weights method. Variations in the sustainability of the company over time were identified as fluctuations. In the results obtained, the sustainability index for successive years changed from 73.12%, 79.02%, 74.31%, 76.65%, 80.49%, 79.81%, and 79.83% to the more exact values 73.32%, 77.72%, 76.76%, 79.41%, 81.93%, 79.72%, and 80.45%, according to factor priorities obtained from expert views. By obtaining the necessary informative measurement indicators, the model can practically and effectively evaluate the sustainability of any organization and determine its fluctuations over time.
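The deterministic core of any AHP-based weighting is the principal-eigenvector priority vector of a pairwise comparison matrix; the stochastic extension above perturbs those expert judgments. A minimal sketch of the eigenvector step (the comparison values are hypothetical):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priorities from the principal eigenvector of a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
    return w / w.sum()  # normalize so the priorities sum to 1

# hypothetical pairwise comparisons of three sustainability factors (Saaty scale):
# factor 1 is moderately preferred to factor 2 (3) and strongly to factor 3 (5), etc.
A = np.array([
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
])
w = ahp_weights(A)
```

For these judgments the priorities come out roughly (0.64, 0.26, 0.10); a stochastic AHP would sample many such matrices and propagate the weight distribution into the sustainability index.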

Keywords: AHP, sustainability fluctuation, environmental indicators, performance measurement

Procedia PDF Downloads 97
18133 Performance Prediction Methodology of Slow Aging Assets

Authors: M. Ben Slimene, M.-S. Ouali

Abstract:

Asset management of urban infrastructures faces a multitude of challenges that must be overcome to obtain reliable measurements of performance. Predicting the performance of slowly aging systems is one of those challenges; it helps the asset manager investigate specific failure modes and undertake the appropriate maintenance and rehabilitation interventions to avoid catastrophic failures as well as to optimize maintenance costs. This article presents a methodology for modeling the deterioration of slowly degrading assets based on their operating history. It consists of extracting degradation profiles by grouping together assets that exhibit similar degradation sequences, using an unsupervised classification technique from artificial intelligence. The obtained clusters are then used to build the performance prediction models. The methodology is applied to a sample stormwater drainage culvert dataset.
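The grouping step, clustering assets with similar degradation sequences, can be sketched with a minimal k-means on synthetic condition histories (centers are seeded deterministically from the first and last asset for reproducibility; all data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(10.0)
# synthetic condition scores over 10 inspections: 6 slowly and 6 quickly degrading assets
profiles = np.vstack([100 - 0.5 * t + rng.normal(0, 1, (6, 10)),
                      100 - 3.0 * t + rng.normal(0, 1, (6, 10))])

def kmeans(X, k=2, iters=10):
    """Minimal k-means grouping rows (assets) with similar degradation sequences."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()  # deterministic seeds
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(profiles)
```

Each resulting cluster mean is a degradation profile on which a regression model can then be fitted, as the methodology above describes.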

Keywords: artificial intelligence, clustering, culvert, regression model, slow degradation

Procedia PDF Downloads 75