Search results for: inventory level
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3500

2240 Microarrays Denoising via Smoothing of Coefficients in Wavelet Domain

Authors: Mario Mastriani, Alberto E. Giraldez

Abstract:

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct a microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition, i.e., in most cases a multiresolution analysis is not necessary. Denoising results compare favorably with most methods currently in use.
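
The procedure described (single-level 2-D wavelet decomposition, smoothing of the highest-frequency subbands, reconstruction) can be sketched as follows; the Daubechies wavelet, the 3x3 averaging window and the pywt/scipy toolchain are illustrative assumptions, not details taken from the paper.

```python
# Sketch of single-level wavelet-domain denoising by smoothing the
# highest (detail) subbands; wavelet choice and 3x3 window are assumptions.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def denoise_microarray(image, wavelet="db4"):
    # One level of 2-D DWT: approximation cA and detail subbands (cH, cV, cD)
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
    # Smooth only the detail (highest-frequency) subbands
    smoothed = tuple(uniform_filter(band, size=3) for band in (cH, cV, cD))
    # Reconstruct the image from the modified coefficients
    return pywt.idwt2((cA, smoothed), wavelet)

noisy = np.random.rand(256, 256)          # placeholder for a microarray scan
clean = denoise_microarray(noisy)
```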

Keywords: Directional smoothing, denoising, edge preservation, microarrays, thresholding, wavelets

2239 Classification of Radio Communication Signals using Fuzzy Logic

Authors: Zuzana Dideková, Beata Mikovičová

Abstract:

Characterization of radio communication signals aims at the automatic recognition of different characteristics of radio signals in order to detect their modulation type, central frequency, and level. Our purpose is to apply techniques used in image processing in order to extract pertinent characteristics. To the individual analyses, we add several rules for checking the consistency of hypotheses using fuzzy logic. This makes it possible to take into account the ambiguity and uncertainty that may remain after the extraction of individual characteristics. The aim is to improve the process of radio communications characterization.
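
As a hedged illustration of the kind of rule-based consistency checking described, a minimal fuzzy-rule sketch follows; the membership functions, rules and min/max operators below are assumptions, not the authors' system.

```python
# Minimal sketch of fuzzy rule evaluation for signal characterisation.
# Membership functions and rules are hypothetical examples.
def mf_high(x, lo, hi):
    """Piecewise-linear membership for 'high' on [lo, hi]."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def classify(level_dbm, bandwidth_khz):
    # Degrees of membership extracted from individual analyses
    strong = mf_high(level_dbm, -90.0, -40.0)
    wide = mf_high(bandwidth_khz, 10.0, 200.0)
    # Rules: AND -> min, rule aggregation -> max (Mamdani-style)
    rule_fm = min(strong, wide)            # "strong AND wide   -> FM-like"
    rule_cw = min(strong, 1.0 - wide)      # "strong AND narrow -> CW-like"
    return {"FM-like": rule_fm, "CW-like": rule_cw}

print(classify(level_dbm=-55.0, bandwidth_khz=150.0))
```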

Keywords: fuzzy classification, fuzzy inference system, radio communication signals, telecommunications

2238 Design of Experiment and Computational Fluid Dynamics Used to Optimize Hydrodynamic Characteristics of the Marine Propeller

Authors: Rohit Suryawanshi

Abstract:

In this study, the commercial Computational Fluid Dynamics (CFD) code ANSYS Fluent has been used to optimize the marine propeller with the design of experiment (DOE) method. At the initial stage, different propeller parameters were selected at three different levels. The four characteristic factors are: number of blades, camber value, pitch delta and chord at the hub. Then, CAD modelling is performed for the selected factors and levels. In this investigation, a total of 9 test models are simulated with the Reynolds-Averaged Navier-Stokes (RANS) equations. The standard, realizable
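
For four factors at three levels, a nine-run design typically corresponds to the standard L9 orthogonal array; the sketch below enumerates such runs. The factor level values are placeholders, since the abstract is truncated before the actual settings are given.

```python
# Sketch: enumerating 9 DOE runs for 4 factors at 3 levels using the
# standard L9(3^4) orthogonal array; level values are hypothetical.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]
factors = {
    "blades":       [3, 4, 5],            # number of blades (assumed levels)
    "camber":       [0.02, 0.03, 0.04],
    "pitch_delta":  [-0.05, 0.0, 0.05],
    "chord_at_hub": [0.20, 0.25, 0.30],
}
names = list(factors)
for run, levels in enumerate(L9, start=1):
    setting = {names[i]: factors[names[i]][lvl] for i, lvl in enumerate(levels)}
    print(f"Run {run}: {setting}")   # each setting would be meshed and solved in CFD
```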

Keywords: Marine propeller, Computational Fluid Dynamics, optimization, DOE, propeller thrust.

2237 Eco-innovation and Economic Performance in Industrial Clusters: Evidence from Italy

Authors: Sara Tessitore, Tiberio Daddi, Fabio Iraldo

Abstract:

The article aims to investigate the presence of a correlation between eco-innovation and economic performance within industrial districts. The case analyzed in this article is based on a study, entitled "Eco-Districts", concerning a sample of 54 Italian industrial clusters, which compiled a list of the most eco-efficient districts at the national level. After selecting two districts, this study assesses their economic performance over the last three years through the analysis of trends in four indicators. The results show that only in some cases is there a connection between eco-innovation and economic performance.

Keywords: clusters, industrial districts, eco-innovation, economic performance.

2236 Industry Openness, Human Capital and Wage Inequality: Evidence from Chinese Manufacturing Firms

Authors: Qiong Huang, Satish Chand

Abstract:

This paper uses primary data from 670 Chinese manufacturing firms, together with the newly introduced regression-based inequality decomposition method, to study the effect of openness on wage inequality. We find that openness leads to a positive industry wage premium, but its contribution to firm-level wage inequality is relatively small, only 4.69%. The major contributor to wage inequality is human capital, which could explain 14.3% of wage inequality across the sample firms.
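
A common form of regression-based inequality decomposition (Fields-type factor shares) attributes to each regressor the share s_j = beta_j * cov(x_j, ln w) / var(ln w). The sketch below illustrates this on synthetic data and is not the authors' code or specification.

```python
# Sketch of a regression-based (Fields-type) decomposition of log-wage
# inequality on synthetic data; variable names are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 670
human_capital = rng.normal(12, 3, n)        # e.g. years of schooling
openness = rng.uniform(0, 1, n)             # industry openness measure
log_wage = 0.08 * human_capital + 0.3 * openness + rng.normal(0, 0.4, n)

X = sm.add_constant(np.column_stack([human_capital, openness]))
beta = sm.OLS(log_wage, X).fit().params      # [const, b_hc, b_open]

var_y = log_wage.var()
shares = {
    "human_capital": beta[1] * np.cov(human_capital, log_wage)[0, 1] / var_y,
    "openness":      beta[2] * np.cov(openness, log_wage)[0, 1] / var_y,
}
print(shares)   # each entry is that factor's share of log-wage variance
```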

Keywords: Openness, human capital, wage inequality, decomposition, China.

2235 Numerical Simulation of a Solar Photovoltaic Panel Cooled by a Forced Air System

Authors: D. Nebbali, R. Nebbali, A. Ouibrahim

Abstract:

This study focuses on the cooling of a photovoltaic (PV) panel. Cooling improves the panel's conversion capacity and, under extreme air temperatures, maintains the panel temperature at a level which avoids degradation. To do this, a fan provides forced circulation of air. Because the fan is supplied by the panel, it is necessary to determine the optimum operating point that balances the efficiency of the PV against the consumption of the fan. For this purpose, numerical simulations are performed at varying air mass flow rates, under two extreme air temperatures (50°C, 25°C) and a fixed solar radiation (1000 W/m²), in the absence of wind.
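
The optimum operating point described (maximising net output, i.e. PV production minus fan consumption, over the air mass flow rate) can be illustrated with the toy models below; the panel and fan characteristics are assumptions, not the paper's CFD results.

```python
# Sketch: finding the air mass flow rate that maximises net power
# (PV output minus fan consumption). All model coefficients are assumed.
import numpy as np

m_dot = np.linspace(0.0, 0.05, 200)                  # air mass flow rate [kg/s]
T_air, G, area = 50.0, 1000.0, 1.6                   # °C, W/m², m²

T_panel = T_air + 30.0 * np.exp(-60.0 * m_dot)       # assumed cooling curve
eta = 0.15 * (1.0 - 0.004 * (T_panel - 25.0))        # typical temperature coefficient
p_pv = eta * G * area                                 # panel output [W]
p_fan = 2.0e5 * m_dot ** 3                            # assumed fan power law [W]

net = p_pv - p_fan
best = np.argmax(net)
print(f"optimum m_dot = {m_dot[best]:.4f} kg/s, net power = {net[best]:.1f} W")
```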

Keywords: Energy conversion, efficiency, balance energy, solar cell.

2234 Emerging VC Industry: Do Market Expectations Play the Most Important Role in Project Selection? Evidence on Russian Data

Authors: I. Rodionov, A. Semenov, E. Gosteva, O. Sokolova

Abstract:

Venture capital is becoming an increasingly advanced and effective source of financing for innovation projects, which carry a high level of risk. In developed countries, it plays a key role in transforming innovation projects into successful businesses and creating the prosperity of the modern economy. In Russia, many of the necessary preconditions for creating an effective venture investment system exist: a network of public institutes for innovation financing operates, and there is a significant number of small and medium-sized enterprises capable of selling products with good market potential. However, the current system has not demonstrated the necessary level of efficiency in practice, which can be substantially explained by the absence of an accurate plan of action to form the national venture model and by the lack of experience of successful venture deals with profitable exits in the Russian economy. This paper studies the influence of various factors on the development of the venture industry using the example of the IT sector in Russia. The choice of sector is based on the fact that this segment is the main driver of venture capital market growth in Russia, and the necessary data exist. The size of the second-round investment is used as the dependent variable. To analyse the influence of the previous round, the volume of the previous (first) round investment is used as a determinant. A dummy variable is also included in the regression to examine whether the participation of an investor with high reputation and experience in the previous round influences the size of the next investment round. The regression analysis of short-term interrelations between the studied variables reveals the prevailing influence of the volume of first-round investment on the volume of second-round venture investment. The most important determinant of the value of the second-round investment is the value of the first-round investment, which means that the most competitive start-up teams on the Russian market are those that can attract more money at the start, and the growth of the target market is not a factor of crucial importance. This supports the point of view that VC in Russia is driven by endogenous factors rather than by exogenous ones based on global market growth.
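
The regression described (second-round investment on first-round investment plus a reputation dummy) might look like the following sketch on synthetic data; it is not the authors' specification, dataset or estimates.

```python
# Sketch of the second-round investment regression on synthetic data;
# variables and coefficients are illustrative, not the paper's estimates.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
log_round1 = rng.normal(13, 1.2, n)                  # log of first-round size
reputable = rng.integers(0, 2, n)                    # 1 if a reputable VC joined round 1
log_round2 = 1.5 + 0.8 * log_round1 + 0.2 * reputable + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([log_round1, reputable]))
model = sm.OLS(log_round2, X).fit()
print(model.summary())   # coefficient on log_round1 dominates in this toy setup
```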

Keywords: Venture industry, venture investment, determinants of the venture sector development, IT-sector.

2233 A Case Study of 3D Stereoscopic Conversion in the Visual Effects Industry

Authors: Jin Zhi

Abstract:

This paper covers a series of key points in 2D-to-3D stereoscopic conversion and presents a stereoscopic conversion approach that has been successfully applied in the current visual effects industry. The purpose of this paper is to describe a detailed workflow and the underlying concepts that have been used successfully in 3D stereoscopic conversion for feature films, and thereby to clarify the stereoscopic conversion production process, to give entry-level artists a clear overall understanding of 3D stereoscopy in digital compositing, to inform higher education in visual effects, and hopefully to inspire further collaboration and participation, particularly between academia and industry.
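
One core operation in 2D-to-3D conversion is generating a second eye by horizontally displacing pixels according to a Z-depth map (the "3D projection" step). The sketch below shows a naive forward-mapping version with a made-up disparity scale and no occlusion in-painting; it is not the studio workflow described in the paper.

```python
# Naive sketch of depth-based view synthesis for stereoscopic conversion:
# shift each pixel horizontally by a disparity derived from its Z-depth.
# Disparity scaling and hole handling are simplified assumptions.
import numpy as np

def synthesize_right_eye(left, depth, max_disparity=12):
    """left: HxWx3 image, depth: HxW in [0,1] (1 = near). Returns right-eye view."""
    h, w = depth.shape
    right = np.zeros_like(left)
    disparity = (depth * max_disparity).astype(int)   # nearer pixels shift more
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                right[y, nx] = left[y, x]
    return right                       # holes would be in-painted from clean plates

left = np.random.rand(120, 160, 3)
depth = np.tile(np.linspace(0, 1, 160), (120, 1))
right = synthesize_right_eye(left, depth)
```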

Keywords: Clean plates, mattes, stereoscopic conversion, 3D projection, Z-depth.

2232 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 1: Overview and Activities in Chemical Processing Facility

Authors: Kazunori Nomura, Hiromichi Ogi, Masaumi Nakahara, Sou Watanabe, Atsuhiro Shibata

Abstract:

The Chemical Processing Facility of the Japan Atomic Energy Agency is a basic research field for advanced back-end technology development using actual high-level radioactive materials, such as irradiated fuel from the fast reactor and high-level liquid waste from the reprocessing plant. As is the nature of a research facility, various kinds of chemical reagents have been used for fundamental tests. Most of them were treated properly and stored in the liquid waste vessel installed in the facility, but some were not treated and remained in the experimental space as a kind of legacy waste, which must be treated safely. On the other hand, we formulated the Medium- and Long-Term Management Plan of Japan Atomic Energy Agency Facilities. This comprehensive plan identifies the Chemical Processing Facility as one of the facilities to be decommissioned. Even if the plan is executed, treatment of the legacy waste beforehand is a necessary step for the decommissioning operation. Under these circumstances, we launched a collaborative research project called the STRAD project, which stands for Systematic Treatment of Radioactive liquid waste for Decommissioning, in order to develop treatment processes for the wastes of nuclear research facilities. In this project, decomposition methods for chemicals causing troublesome phenomena such as corrosion and explosion have been developed, and there is a prospect of decomposing them in the facility by simple methods. Solidification of aqueous and organic liquid wastes after decomposition has been studied by adding cement or coagulants. Furthermore, experimental tools made of various materials were treated, with an effort to stabilize and compact them before packaging into waste containers; this is expected to decrease the number of solid waste shipments and widen the operating space. Some achievements of these studies are presented in this paper. The project is expected to contribute beneficial waste management outcomes that can be shared worldwide.

Keywords: Chemical Processing Facility, medium- and long-term management plan of JAEA Facilities, STRAD project, treatment of radioactive waste.

2231 IFC-Based Construction Engineering Domain Ontology Development

Authors: Jin Si, Yanzhong Wang

Abstract:

The essence of the 21st century is the knowledge economy. Knowledge has become the key resource of economic growth and social development, and the construction industry is no exception. Because of the complexity of construction projects, project managers cannot depend on information management alone; the only way to improve the level of construction project management is to set up an effective knowledge accumulation mechanism. This paper first introduces the IFC standard and the concept of ontology, then puts forward a method for constructing an architectural engineering domain ontology based on IFC, and finally builds up the concepts, properties and relationships between the concepts of the ontology. The limitations of this work are also pointed out.

Keywords: Construction Engineering, IFC, Ontology

2230 Technological Innovation Persistence: Organizational Innovation Matters

Authors: H. Naciba, C. Le Bas, C. Mothe, T.U. Nguyen-Thi

Abstract:

Organizational innovation favors technological innovation, but does it also influence technological innovation persistence? This article investigates empirically the pattern of technological innovation persistence and tests the potential impact of organizational innovation using firm-level data from three waves of the French Community Innovation Surveys. Evidence shows a positive effect of organizational innovation on technological innovation persistence, according to various measures of organizational innovation. Moreover, this impact is more significant for complex innovators (i.e., those who innovate in both products and processes). These results highlight the complexity of managing organizational practices with regard to the firm's technological innovation. They also add to comprehension of the drivers of innovation persistence, through a focus on an often forgotten dimension of innovation in a broader sense.

Keywords: Organizational Innovation, Technological Innovation, Persistence

2229 Backplane Serial Signaling and Protocol for Telecom Systems

Authors: Ali Poureslami, Hossein Borhanifar, Seyed Ali Alavian

Abstract:

In this paper, we implement a modern serial backplane platform for telecommunication inter-rack systems. To combine high reliability with low cost, we applied the high-level data link control (HDLC) protocol over a low-voltage differential signaling (LVDS) bus for card-to-card communication over the backplane. The HDLC protocol offers high performance with several operation modes and is widely used in telecommunication systems. The LVDS bus offers high reliability, with strong immunity against electromagnetic interference (EMI) and noise.
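
HDLC framing delimits frames with the 0x7E flag and relies on bit stuffing (a 0 inserted after five consecutive 1s) so the flag pattern cannot appear inside the payload. A minimal sketch of that stuffing step is shown below; the full HDLC implementation used on the backplane is not described in the abstract.

```python
# Minimal sketch of HDLC-style bit stuffing: insert a 0 after every run of
# five consecutive 1 bits so the 0x7E flag never appears inside the payload.
def bit_stuff(bits):
    out, ones = [], 0
    for b in bits:
        out.append(b)
        ones = ones + 1 if b == 1 else 0
        if ones == 5:
            out.append(0)       # stuffed zero
            ones = 0
    return out

payload = [1, 1, 1, 1, 1, 1, 0, 1]        # six 1s in a row
print(bit_stuff(payload))                 # -> [1, 1, 1, 1, 1, 0, 1, 0, 1]
```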

Keywords: Backplane, BLVDS, HDLC, EMI, I2C, LCT, OSC, SFP, SNMP.

2228 Measuring Heterogeneous Traffic Density

Authors: V. Thamizh Arasan, G. Dhivya

Abstract:

Traffic density provides an indication of the level of service being provided to road users. Hence, there is a need to study traffic flow characteristics, with specific reference to density, in detail. When the length and speed of the vehicles in a traffic stream vary significantly, the concept of occupancy, rather than density, is more appropriate to describe traffic concentration. When the concept of occupancy is applied to heterogeneous traffic conditions, it is necessary to consider the area of the road space and the area of the vehicles as the bases. Hence, a new concept, named 'area-occupancy', is proposed here. It has been found that the estimated area-occupancy gives consistent values irrespective of changes in traffic composition.
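
The area-occupancy concept can be read as the sum over vehicles of (vehicle area x time spent over the detection zone) divided by (area of the road section x observation period). A small sketch under that reading follows; the exact formulation in the paper may differ.

```python
# Sketch of an 'area-occupancy' style measure: sum of (vehicle area x dwell
# time) over all vehicles, normalised by road-section area x observation time.
# The exact formulation in the paper may differ; this follows the description.
def area_occupancy(vehicles, road_area_m2, observation_s):
    occupied = sum(v["area_m2"] * v["dwell_s"] for v in vehicles)
    return occupied / (road_area_m2 * observation_s)

vehicles = [
    {"area_m2": 5.4, "dwell_s": 1.2},    # car
    {"area_m2": 1.2, "dwell_s": 0.8},    # motorcycle
    {"area_m2": 21.0, "dwell_s": 2.5},   # bus
]
print(area_occupancy(vehicles, road_area_m2=3.5 * 30.0, observation_s=60.0))
```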

Keywords: Density Measurement, Heterogeneity, Occupancy, Traffic Flow.

2227 An Overview of Some High Order and Multi-Level Finite Difference Schemes in Computational Aeroacoustics

Authors: Appanah Rao Appadu, Muhammad Zaid Dauhoo

Abstract:

In this paper, we have combined some spatial derivatives with the optimised time derivative proposed by Tam and Webb in order to approximate the linear advection equation, which is given by ∂u/∂t + ∂f/∂x = 0. These spatial derivatives are as follows: a standard 7-point 6th-order central difference scheme (ST7), a standard 9-point 8th-order central difference scheme (ST9), and optimised schemes designed by Tam and Webb, Lockard et al., Zingg et al., Zhuang and Chen, and Bogey and Bailly. Thus, these seven different spatial derivatives have been coupled with the optimised time derivative to obtain seven different finite-difference schemes to approximate the linear advection equation. We have analysed the variation of the modified wavenumber and group velocity, both with respect to the exact wavenumber, for each spatial derivative. The problems considered are the 1-D propagation of a boxcar function, the propagation of an initial disturbance consisting of a sine and a Gaussian function, and the propagation of a Gaussian profile. It is known that the choice of the cfl number affects the quality of results in terms of dissipation and dispersion characteristics. Based on the numerical experiments solved and the numerical methods used to approximate the linear advection equation, it is observed in this work that the quality of results is dependent on the choice of the cfl number, even for optimised numerical methods. The errors from the numerical results have been quantified into dispersion and dissipation using a technique devised by Takacs. Also, the quantity Exponential Error for Low Dispersion and Low Dissipation (eeldld) has been computed from the numerical results. Moreover, based on this work, it has been found that the quantity eeldld can be used as a measure of the total error. In particular, the total error is a minimum when eeldld is a minimum.
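
For reference, the standard 7-point 6th-order central difference (ST7) applied to ∂u/∂t + β ∂u/∂x = 0 is sketched below; classical RK4 is used for time stepping here in place of the Tam-Webb optimised time derivative, and the wave speed, grid and cfl number are illustrative choices.

```python
# Sketch: 1-D linear advection u_t + beta*u_x = 0 with the standard 7-point
# 6th-order central difference in space (ST7). Classical RK4 replaces the
# Tam-Webb optimised time derivative used in the paper; cfl and grid assumed.
import numpy as np

beta, nx, cfl = 1.0, 200, 0.5
h = 1.0 / nx
k = cfl * h / beta
x = np.arange(nx) * h
u = np.exp(-200.0 * (x - 0.5) ** 2)          # Gaussian initial profile

def dudx(u):
    # 6th-order central difference: weights (3/4, -3/20, 1/60) on offsets 1, 2, 3
    return (0.75 * (np.roll(u, -1) - np.roll(u, 1))
            - 0.15 * (np.roll(u, -2) - np.roll(u, 2))
            + (1.0 / 60.0) * (np.roll(u, -3) - np.roll(u, 3))) / h

def rhs(u):
    return -beta * dudx(u)

for _ in range(200):                          # classical RK4 time stepping
    k1 = rhs(u); k2 = rhs(u + 0.5 * k * k1)
    k3 = rhs(u + 0.5 * k * k2); k4 = rhs(u + k * k3)
    u = u + (k / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
```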

Keywords: Optimised time derivative, dissipation, dispersion, cfl number. Nomenclature: k: time step, h: spatial step, β: advection velocity, r: cfl/Courant number, r = βk/h, w: exact wave number, θ = wh, n: time level, RPE: relative phase error per unit time step, AFM: modulus of amplification factor.

2226 A Novel Steganographic Method for Gray-Level Images

Authors: Ahmad T. Al-Taani, Abdullah M. AL-Issa

Abstract:

In this work we propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach works by dividing the cover into blocks of equal sizes and then embedding the message in the edge of each block, depending on the number of ones in the left four bits of the pixel. The proposed approach is tested on a database consisting of 100 different images. Experimental results, compared with other methods, showed that the proposed approach can hide a larger amount of information and gives stego-images of good visual quality to the human eye.
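
As a generic illustration of the spatial-domain idea, a plain least-significant-bit embedding sketch is shown below; the authors' block/edge selection rule based on the upper four bits of a pixel is their refinement and is not reproduced here.

```python
# Generic sketch of spatial-domain LSB embedding in a grayscale image.
# The paper's block/edge selection rule (based on the upper four bits of a
# pixel) is not reproduced; this shows only the basic embedding idea.
import numpy as np

def embed_lsb(cover, bits):
    flat = cover.flatten().astype(np.uint8)
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | b       # overwrite least significant bit
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return [int(p & 1) for p in stego.flatten()[:n_bits]]

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, message)
assert extract_lsb(stego, len(message)) == message
```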

Keywords: Data Embedding, Cryptography, Watermarking, Steganography, Least Significant Bit, Information Hiding.

2225 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college level art history course (n=134) and counterbalancing design to distribute both forms on the pre- and posttests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between group differences from post-test scores on test Form Q and Form R by full-factorial Two-Way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but resultant skewness and kurtosis worsened compared to raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest Form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between group effect size for the Control Group versus Experimental Group participants who completed the game was 14.39% with a 4.77% effect size attributed to pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to post test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective  assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
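
The linear equating step referred to is the mean-slope adjustment y* = μ_Y + (σ_Y/σ_X)(x − μ_X), mapping new-form scores onto the reference-form scale. A small sketch follows, using synthetic score vectors rather than the study's data.

```python
# Sketch of linear equating: map new-form (Form R) raw scores onto the
# reference-form (Form Q) scale via a mean-slope adjustment.
# Scores below are synthetic, not the study's data.
import numpy as np

def linear_equate(x, form_r_scores, form_q_scores):
    mu_r, sd_r = form_r_scores.mean(), form_r_scores.std(ddof=1)
    mu_q, sd_q = form_q_scores.mean(), form_q_scores.std(ddof=1)
    return mu_q + (sd_q / sd_r) * (x - mu_r)

rng = np.random.default_rng(2)
form_q = rng.normal(28, 6, 70)               # reference form raw scores
form_r = rng.normal(25, 5, 64)               # new form raw scores
print(linear_equate(np.array([20, 25, 30]), form_r, form_q))
```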

Keywords: Effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating.

2224 Evolved Strokes in Non Photo–Realistic Rendering

Authors: Ashkan Izadi, Vic Ciesielski

Abstract:

We describe work with an evolutionary computing algorithm for the non photo-realistic rendering of a target image. The renderings are produced by genetic programming. We have used two different types of strokes in color: "empty triangle" and "filled triangle". We compare both empty and filled triangular strokes to find which one generates more aesthetically pleasing images. We found that the filled triangular strokes achieve better fitness and generate more aesthetic images than empty triangular strokes.

Keywords: Artificial intelligence, Evolutionary programming, Genetic programming, Non photo-realistic rendering.

2223 Factors Affecting Media Literacy of Early Teenagers

Authors: Khajornjit Bunnag

Abstract:

The purposes of this research are: 1) to study the media literacy of early teenagers, and 2) to study the interaction between gender and timing of media exposure that affects the media literacy of teenagers. The sample of the study included 400 young people aged between 11 and 17 who were living in Bangkok. The data were collected using questionnaires and analyzed by two-way ANOVA. The results revealed that gender and timing of media exposure affected the media literacy of early teenagers, with statistical significance at the 0.05 level.
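
A gender x exposure-timing two-way ANOVA of the kind described can be sketched with statsmodels as follows; the data, column names and effect sizes are placeholders, not the study's results.

```python
# Sketch of a two-way ANOVA (gender x timing of media exposure) on synthetic
# data with statsmodels; column names and effect sizes are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], n),
    "timing": rng.choice(["morning", "evening", "night"], n),
})
df["literacy"] = 50 + 3 * (df["gender"] == "female") + rng.normal(0, 8, n)

model = smf.ols("literacy ~ C(gender) * C(timing)", data=df).fit()
print(anova_lm(model, typ=2))        # main effects and interaction, alpha = 0.05
```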

Keywords: Gender, Media Literacy, Teenager, Timing of Media Exposure.

2222 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control

Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak

Abstract:

With 40% of total world energy consumption, building systems are developing into technically complex large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make buildings active energy market participants. A centralized control system for building heating and cooling managed by economically optimal model predictive control shows promising results, with an estimated 30% increase in energy efficiency. The research is focused on the implementation of such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units and a central climate controller. Building walls are mathematically modeled with their corresponding material types, surface shapes and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, occupant behavior and comfort demands are all taken into account in deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
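
A heavily simplified, single-zone sketch of price-optimal model predictive heating control is given below (first-order thermal model, cvxpy). Every parameter is an assumption; the actual system is multi-zone and couples this with microgrid power-flow optimisation.

```python
# Heavily simplified single-zone sketch of price-optimal MPC for heating:
# first-order thermal model, comfort band constraints, time-varying prices.
# All parameters are assumptions; the real system is far more detailed.
import numpy as np
import cvxpy as cp

N = 24                                        # horizon [hours]
a, b, c = 0.9, 0.5, 0.1                       # assumed discrete thermal model
T_out = 5.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, N))
price = 0.10 + 0.05 * (np.arange(N) % 24 >= 8)   # cheaper at night

u = cp.Variable(N, nonneg=True)               # heating power [kW]
T = cp.Variable(N + 1)                        # zone temperature [°C]

constraints = [T[0] == 20.0, u <= 6.0]
for k in range(N):
    constraints += [T[k + 1] == a * T[k] + b * u[k] + c * T_out[k]]
constraints += [T[1:] >= 20.0, T[1:] <= 24.0]  # comfort band

problem = cp.Problem(cp.Minimize(price @ u), constraints)
problem.solve()
print("optimal cost:", problem.value)
```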

Keywords: Energy-efficient buildings, Hierarchical model predictive control, Microgrid power flow optimization, Price-optimal building climate control.

2221 Assessment Methods for Surgical Skill

Authors: Siti Nor Zawani Ahmmad, Eileen Su Lee Ming, Yeong Che Fai, Fauzan Khairi bin Che Harun

Abstract:

Increasingly sophisticated technologies are now able to provide assistance for surgeons to improve surgical performance through various training programs. Equally important to learning skills is the assessment method, as it determines the learning and technical proficiency of a trainee. A consistent and rigorous assessment system will ensure that trainees acquire the specified level of competency prior to certification. This paper reviews the methods currently in use for the assessment of surgical skill, including some modern techniques using computer-based measurements and virtual reality systems for more quantitative measurement.

Keywords: assessment, surgical skill, checklist, global rating, virtual reality

2220 Identification of Printed Punjabi Words and English Numerals Using Gabor Features

Authors: Rajneesh Rani, Renu Dhir, G. S. Lehal

Abstract:

Script identification is one of the challenging steps in the development of optical character recognition systems for bilingual or multilingual documents. In this paper, an attempt is made to identify English numerals at the word level in Punjabi documents by using Gabor features. A support vector machine (SVM) classifier with five-fold cross-validation is used to classify the word images. The results obtained are quite encouraging: the average accuracy with RBF, polynomial and linear kernel functions comes out to be greater than 99%.
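
The pipeline described (Gabor filter responses per word image, SVM with five-fold cross-validation over several kernels) can be sketched with scikit-image and scikit-learn as follows; the filter frequencies and orientations, the feature statistics and the synthetic data are assumptions, not the authors' settings.

```python
# Sketch of Gabor-feature extraction per word image plus SVM with five-fold
# cross-validation; frequencies, orientations and the data are placeholders.
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def gabor_features(img, frequencies=(0.1, 0.2), thetas=(0, np.pi / 4, np.pi / 2)):
    feats = []
    for f in frequencies:
        for t in thetas:
            real, _ = gabor(img, frequency=f, theta=t)
            feats += [real.mean(), real.var()]        # simple statistics per response
    return feats

rng = np.random.default_rng(4)
images = rng.random((60, 32, 64))          # placeholder word images
labels = rng.integers(0, 2, 60)            # 0 = Punjabi word, 1 = English numeral
X = np.array([gabor_features(im) for im in images])

for kernel in ("rbf", "poly", "linear"):
    scores = cross_val_score(SVC(kernel=kernel), X, labels, cv=5)
    print(kernel, scores.mean())
```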

Keywords: Script identification, Gabor features, support vector machines.

2219 On the Way to the European Research Area: Programmes of the European Union as a Factor of Innovation Development of a Scientific Organization in Ukraine

Authors: Yuri Nikitin, Veronika Rukas

Abstract:

Within the framework of the FP7 project "START", cooperation with European research centres has had a positive impact on raising the level of innovation research at the Institute for Superhard Materials of the National Academy of Sciences (ISM NAS) of Ukraine and on introducing its innovations into the economies of Europe and Ukraine, which in turn speeds up the path of Ukrainian science into the European Research Area through the creation in Ukraine of scientific organizations of an innovative type.

Keywords: Programs of the EU, innovative scientific results, innovation competence of the staff, commercialization in industry in Europe and Ukraine.

2218 Accreditation and Quality Assurance of Nigerian Universities: The Management Imperative

Authors: F. O. Anugom

Abstract:

The general functions of the university include, among other things, teaching, research and community service. Universities are recognized as the apex of learning, accumulating and imparting knowledge and skills of all kinds to students to enable them to be productive, earn their living and make optimum contributions to national development. This is equivalent to the production of human capital in the form of the high-level manpower needed to administer the educational system, be useful to society and manage the economy. Quality has become a matter of major importance for university education in Nigeria. Accreditation is the systematic review of educational programmes to ensure that acceptable standards of education, scholarship and infrastructure are being maintained, and it ensures that institutions maintain quality. The process is designed to determine whether or not an institution has met or exceeded the published standards for accreditation, and whether it is achieving its mission and stated purposes. Ensuring quality assurance in the accreditation process falls to university management, which justified the need for this study. This study examined accreditation and quality assurance as a management imperative. Three research questions and three hypotheses guided the study. The design was a correlational survey with a population of 2,893 university administrators, out of which 578 heads of department and deans of faculties were sampled. The instrument for data collection was the Programme Accreditation Exercise Scale, which had a high level of reliability. The research questions were answered with the Pearson 'r' statistic, and the t-test statistic was used to test the hypotheses. It was found, among other things, that the quality of accredited programmes depends on the level of funding of universities in Nigeria. It was also indicated that the quality of programme accreditation and the physical facilities of universities in Nigeria are highly related, and that programme accreditation is positively related to staffing in Nigerian universities. Based on the findings of the study, the researcher recommends that academic administrators should be included in the teams that ensure quality programmes in the universities, that private sector partnership should be encouraged to fund programmes and ensure programme quality in the universities, and that independent agencies should be engaged to monitor the activities of accreditation teams to avoid bias.

Keywords: Accreditation, quality assurance, NUC, physical facilities, staffing.

2217 The Characterisation of TLC NAND Flash Memory, Leading to a Definable Endurance/Retention Trade-Off

Authors: Sorcha Bennett, Joe Sullivan

Abstract:

Triple-Level Cell (TLC) NAND Flash memory at and below 20 nm is still largely unexplored by researchers, and with the ever more commonplace use of Flash in consumer and enterprise applications there is a need for such gaps in knowledge to be filled. At the time of writing, there was little published data or literature on TLC, and more specifically on reliability testing, with further emphasis on both endurance and retention. This paper gives an introduction to NAND Flash memory, followed by an overview of the relevant current research on the reliability of Flash memory, along with the planned future work, which will provide results to help characterise the reliability of TLC memory.

Keywords: TLC NAND flash memory, reliability, endurance, retention, trade-off, raw flash, patterns.

2216 A Virtual Simulation Environment for a Design and Verification of a GPGPU

Authors: Kwang Y. Lee, Tae R. Park, Jae C. Kwak, Yong S. Koo

Abstract:

When a small hardware IP is designed, we can develop an appropriate verification environment by observing the simulated signal waves or by using serial test vectors for a fixed output. In the case of the design and verification of a massively parallel processor with multiple IPs, it is difficult to build a verification system with an existing common verification environment and to verify each partial IP. A TestDrive verification environment can build an easy and reliable verification system that produces highly intuitive results by applying ModelSim and SystemVerilog's DPI. It shows many advantages; for example, a high-level GPGPU processor design can be migrated to an FPGA board immediately.

Keywords: Virtual Simulation, Verification, IP Design, GPGPU

2215 Stereotype Student Model for an Adaptive e-Learning System

Authors: Ani Grubišić, Slavomir Stankov, Branko Žitko

Abstract:

This paper describes the concept of a stereotype student model in an adaptive knowledge acquisition e-learning system. The defined knowledge stereotypes are based on the student's proficiency level and on Bloom's knowledge taxonomy. The teacher module is responsible for the whole adaptivity process: the automatic generation of courseware elements, their dynamic selection and sorting, as well as their adaptive presentation using templates for statements and questions. The adaptation of courseware is realized according to the student's knowledge stereotype.

Keywords: Adaptive e-learning systems, adaptive courseware, stereotypes, Bloom's knowledge taxonomy.

2214 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand

Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo

Abstract:

An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project’s likely construction costs (initial and final), and subsequent cost control activities should prevent unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying risk factors that are responsible for the observed variance. Data were sourced through interviews, and risk factors were identified by using thematic analysis. Access was obtained to project files from the records of study participants (consultant quantity surveyors), and document analysis was employed in complementing the responses from the interviews. Study findings revealed the discrepancies between ECPs and FTS in the region of -14% and +16%. It is opined in this study that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by Quantity Surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and successful delivery of construction projects. The findings contribute significantly to the study by providing quantitative confirmation to justify the theoretical conclusions generated in the literature from around the world. This therefore adds to and consolidates existing knowledge.

Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.

2213 Real Time Control Learning Game - Speed Race by Learning at the Wheel - Development of Data Acquisition System

Authors: Konstantinos Kalovrektis, Chryssanthi Palazi

Abstract:

Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning and the implementation of various experiments and learning games, and that it is especially useful in developing the higher-order skills of critical thinking, observation, comprehension, implementation, comparison, analysis and active attention to activities such as research, field work, simulations and scientific inquiry. ICT in education supports the learning procedure by making it more flexible and effective, creating a rich and attractive training environment, and equipping students with knowledge and potential useful for the competitive social environment in which they live. This paper presents the design, development, and evaluation results of an interactive educational game which uses real electric toy vehicles on a toy race track. When the game starts, each student selects a specific toy vehicle. The students then answer questionnaires on the computer, and the vehicles' speed is related to the percentage of right answers in a multiple-choice questionnaire. Every question has its own weight, depending on the difficulty level of the questionnaire. Via the developed software, each right or wrong answer in the questionnaire increases or decreases the real-time speed of the student's toy vehicle, and the rate of increase or decrease depends on the difficulty level of each question. The aim of the work is to attract the students' interest in the learning process and also to improve their scores. The developed real-time game was tested using independent populations of students in the age groups 8-10, 11-14 and 15-18 years. Standard educational and statistical analysis tools were used for the evaluation analysis of the game. Results reveal that students using the developed real-time control game scored much higher (60%) than students using a traditional simulation game on the same questionnaire. Results further indicate that the students' interest in repeating the developed real-time control game was far higher (70%) than the interest of students using a traditional simulation game.

Keywords: Real time game, sensor, learning games, LabVIEW

2212 C-LNRD: A Cross-Layered Neighbor Route Discovery for Effective Packet Communication in Wireless Sensor Network

Authors: K. Kalaikumar, E. Baburaj

Abstract:

One of the problems to be addressed in wireless sensor networks is the issue of cross-layer communication. A cross-layer architecture shares information across layers, ensuring Quality of Service (QoS). With this shared information, the MAC protocol adapts its functionality, such as route selection, to a changing sensor network environment. However, time slot assignment and the neighbour route selection time duration for the cross layer have not been addressed. Time-varying physical layer communication across layers causes a high traffic load in the sensor network. Although the traffic load can be reduced using a cross-layer optimization procedure, the computational cost is high. To improve communication efficacy in the sensor network, a self-determined time slot based Cross-Layered Neighbour Route Discovery (C-LNRD) method is presented in this paper. In the presented work, the initial process is to discover the route in the sensor network using Dynamic Source Routing based Medium Access Control (MAC) sub layers. This process considers MAC layer operation with dynamic route neighbour table discovery. Then, the discovered route path for packet communication employs a Broad Route Distributed Time Slot Assignment method on the Cross-Layered Sensor Network system. Broad Route means time slotting over route paths of varying length. During packet communication in this sensor network, the transmission of packets is adjusted over different times with varying ranges to control the traffic rate. Finally, a Rayleigh fading model is developed in C-LNRD to identify the performance of the sensor network communication structure. The main task of Rayleigh fading is to measure the power level of each communication under the MAC sub layer. The minimized power level helps to reduce the computational cost of packet communication in the sensor network. Experiments are conducted on factors such as power level in packet communication, neighbour route discovery time, and information (i.e., packet) propagation speed.
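
The Rayleigh fading component mentioned for measuring per-link power levels can be sketched as follows, with complex Gaussian channel gains; the path-loss model and all parameter values are assumptions, not the paper's simulation setup.

```python
# Sketch of per-link received power under Rayleigh fading: the channel gain
# is complex Gaussian, so |h|^2 is exponentially distributed. Path-loss
# exponent and transmit power are assumed values.
import numpy as np

rng = np.random.default_rng(5)

def received_power_dbm(tx_dbm, distance_m, n_samples=1000, path_loss_exp=3.0):
    h = (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)) / np.sqrt(2)
    fading_gain = np.abs(h) ** 2                      # unit-mean Rayleigh power gain
    path_loss_db = 10 * path_loss_exp * np.log10(distance_m)
    return tx_dbm - path_loss_db + 10 * np.log10(fading_gain)

samples = received_power_dbm(tx_dbm=0.0, distance_m=25.0)
print("mean received power: %.1f dBm" % samples.mean())
```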

Keywords: Medium access control, neighbour route discovery, wireless sensor network, Rayleigh fading, distributed time slot assignment

2211 Use of Nanoclay in Various Modified Polyolefins

Authors: Michael Tupý, Alice Tesaříková-Svobodová, Dagmar Měřínská, Vít Petránek

Abstract:

Polyethylene (PE), polypropylene (PP), poly(ethylene-vinyl acetate) (EVA) and PE-ionomer nanocomposite samples were prepared by mixing the polymer with the organically modified montmorillonite fillers Cloisite 93A and Dellite 67G. The amount of each modified montmorillonite (MMT) was fixed at 5% (w/w). A twin-screw kneader was used for compounding the polymer matrix with the chosen nanofillers. The level of MMT exfoliation was studied by transmission electron microscopy (TEM) observations. The mechanical properties of the prepared materials were evaluated by dynamic mechanical analysis at 30°C and by the measurement of tensile properties (stress and strain at break).

Keywords: Polyethylene, Polypropylene, Poly(ethylene-vinyl acetate), Clay, Nanocomposite, Montmorillonite.
