Search results for: quality loss function.

1969 Types of Epilepsies and Findings EEG- LORETA about Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only the brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electro-Magnetic Tomography (LORETA) is solved for the realistic geometry based on both forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review EEG-LORETA findings on epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.

PDF Downloads: 3086
1968 Coding based Synchronization Algorithm for Secondary Synchronization Channel in WCDMA

Authors: Deng Liao, Dongyu Qiu, Ahmed K. Elhakeem

Abstract:

A new code synchronization algorithm is proposed in this paper for the secondary cell-search stage in wideband CDMA systems. Rather than using the Cyclically Permutable (CP) code in the Secondary Synchronization Channel (S-SCH) to simultaneously determine the frame boundary and scrambling code group, the new synchronization algorithm implements the same function with less system complexity and lower Mean Acquisition Time (MAT). The Secondary Synchronization Code (SSC) is redesigned by splitting it into two sub-sequences. We treat the scrambling code group information as data bits and use simple time-diversity BCH coding for further reliability, which avoids involved and time-costly Reed-Solomon (RS) code computations and comparisons. Analysis and simulation results show that the Synchronization Error Rate (SER) yielded by the new algorithm in Rayleigh fading channels is close to that of the conventional algorithm in the standard. The new synchronization algorithm reduces system complexity, shortens the average cell-search time and can be implemented in the slot-based cell-search pipeline. By exploiting antenna diversity and pipelining the correlation processes, the new algorithm also shows its flexibility for application in multiple-antenna systems.

Keywords: WCDMA cell-search, synchronization algorithm, secondary synchronization channel, antenna diversity.

PDF Downloads: 2379
1967 Physical Activity and Cognitive Functioning Relationship in Children

Authors: Comfort Mokgothu

Abstract:

This study investigated the relation between information processing and fitness level in active (fit) and sedentary (unfit) children drawn from rural and urban areas in Botswana. It was hypothesized that fit children would display faster simple reaction times (SRT), choice reaction times (CRT) and simple movement times (SMT). Sixty third-grade children (7.0-9.0 years) were initially selected and, based upon fitness testing, 45 participated in the study (15 each of fit urban, unfit urban, fit rural). All children completed anthropometric measures, skinfold testing and submaximal cycle ergometer testing. The cognitive testing included SRT, CRT, SMT, Choice Movement Time (CMT) and memory sequence length. Results indicated that the rural fit group exhibited faster SMT than the urban fit and unfit groups. For CRT, both fit groups were faster than the unfit group. Collectively, the study shows that the relationship between physical fitness and cognitive function observed among the elderly can tentatively be extended to the pediatric population. Physical fitness could be a factor in the speed at which we process information, including decision making, even in children.

Keywords: Decision making, fitness, information processing, reaction time, cognition movement time.

PDF Downloads: 784
1966 Group Learning for the Design of Human Resource Development for Enterprise

Authors: Hao-Hsi Tseng, Hsin-Yun Lee, Yu-Cheng Kuo

Abstract:

In order to understand which learning method better supports the learning of CAD software, and to improve CAD courses for enterprise design human resource development, this research applied two modes of learning to a practical course on computer graphics software. The Revit building information model was used as the learning content, and two different curricula were designed: function-based learning and project-based learning. The effectiveness of the two modes was compared through a post-test, questionnaires and student interviews. Students participated in a three-week, nine-hour course and finally took written and hands-on tests. In addition, students filled in a learning-response questionnaire of fifteen items, grouped into three categories: basic software operation, software application, and software concepts and features. Beyond the questionnaire, participants in both learning modes were interviewed to learn more about their views of the two approaches. The study found that the short-term project-based (ad hoc) course produced better learning outcomes. On the other hand, students were more satisfied with the function-based style over the whole course, and students found the ad hoc style of learning harder to accept.

Keywords: Development, education, human resource, learning.

PDF Downloads: 1726
1965 EDULOGIC+ - Knowledge Management through Data Analysis in Education

Authors: Alok Sharma, Dr. Harvinder S. Saini, Raviteja Tiruvury

Abstract:

This paper outlines the application of Knowledge Management (KM) principles in the context of Educational institutions. The paper caters to the needs of the engineering institutions for imparting quality education by delineating the instruction delivery process in a highly structured, controlled and quantified manner. This is done using a software tool EDULOGIC+. The central idea has been based on the engineering education pattern in Indian Universities/ Institutions. The data, contents and results produced over contiguous years build the necessary ground for managing the related accumulated knowledge. Application of KM has been explained using certain examples of data analysis and knowledge extraction.

Keywords: Education software system, information system, knowledge management.

PDF Downloads: 1743
1964 Uncertainty Propagation and Sensitivity Analysis During Calibration of an Integrated Land Use and Transport Model

Authors: Parikshit Dutta, Mathieu Saujot, Elise Arnaud, Benoit Lefevre, Emmanuel Prados

Abstract:

In this work, the propagation of uncertainty during the calibration process of TRANUS, an integrated land use and transport model (ILUTM), has been investigated. It has also been examined, through a sensitivity analysis, which input parameters affect the variation of the outputs the most. Moreover, a probabilistic verification methodology for the calibration process, which equates the observed and calculated production, has been proposed. The model chosen as an application is the model of the city of Grenoble, France. For sensitivity analysis and uncertainty propagation, the Monte Carlo method was employed, and a statistical hypothesis test was used for verification. The parameters of the induced demand function in TRANUS were assumed to be uncertain in the present case. It was found that, if TRANUS converges during calibration, then with a high probability the calibration process is verified. Moreover, a weak correlation was found between the inputs and the outputs of the calibration process. The total effect of the inputs on the outputs was investigated, and the output variation was found to be dictated by only a few input parameters.
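
As a rough, self-contained illustration of the Monte Carlo uncertainty-propagation and sensitivity step described above (not the TRANUS model itself), the sketch below pushes samples of three hypothetical induced-demand parameters through a placeholder model and reports the output spread and rank correlations; all names and ranges are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def calibrated_output(elasticity, dispersion, base_demand):
    """Placeholder for one calibration/assignment run. In the real study this
    would be a TRANUS run; here it is a toy induced-demand function so the
    sketch stays self-contained."""
    return base_demand * elasticity ** 0.8 * np.exp(-dispersion)

# Assumed uncertain inputs of the induced demand function (illustrative ranges).
n = 5000
elasticity  = rng.uniform(0.5, 1.5, n)
dispersion  = rng.uniform(0.1, 0.5, n)
base_demand = rng.normal(100.0, 5.0, n)

outputs = calibrated_output(elasticity, dispersion, base_demand)

# Uncertainty propagation: empirical distribution of the output.
print(f"mean = {outputs.mean():.2f}, 95% interval = "
      f"({np.percentile(outputs, 2.5):.2f}, {np.percentile(outputs, 97.5):.2f})")

# Crude sensitivity measure: rank correlation between each input and the output.
for name, x in [("elasticity", elasticity), ("dispersion", dispersion),
                ("base_demand", base_demand)]:
    rho, _ = spearmanr(x, outputs)
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```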

Keywords: Uncertainty propagation, sensitivity analysis, calibration under uncertainty, hypothesis testing, integrated land use and transport models, TRANUS, Grenoble.

PDF Downloads: 1516
1963 Impact of Foreign Aid and Levels of Education on Democracy in Pakistan

Authors: H. Mahmood, M. W. Siddiqi, A. Iqbal, M. A. Tabassum

Abstract:

This study examines the relationships between foreign aid, levels of schooling and democracy in Pakistan using the ARDL cointegration approach. The results provide strong evidence of fairly robust long-run as well as short-run relationships among these variables for the period 1973-2008. Foreign aid and primary school enrollments have a negative impact on the democracy index, while high school enrollments have a positive impact on the democracy index in Pakistan. The study suggests promoting education and relying on local resources instead of foreign aid in order to achieve good-quality political institutions in Pakistan.
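
A minimal sketch of an ARDL-style regression in the spirit of the study, using synthetic annual series as placeholders for the 1973-2008 data; the variable names and the ARDL(1,1) lag order are illustrative assumptions, and the bounds-testing step of the ARDL cointegration approach is not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-ins for the annual series used in the paper.
rng = np.random.default_rng(1)
T = 36
aid    = np.cumsum(rng.normal(size=T))                     # foreign aid (placeholder)
enroll = np.cumsum(rng.normal(size=T))                     # school enrollment (placeholder)
democ  = 0.3 * aid + 0.5 * enroll + rng.normal(size=T)     # democracy index (placeholder)

df = pd.DataFrame({"democ": democ, "aid": aid, "enroll": enroll})

# ARDL(1,1) in levels: democ_t on its own lag and current/lagged regressors.
X = pd.concat([df["democ"].shift(1), df["aid"], df["aid"].shift(1),
               df["enroll"], df["enroll"].shift(1)], axis=1).dropna()
X.columns = ["democ_l1", "aid", "aid_l1", "enroll", "enroll_l1"]
y = df["democ"].loc[X.index]

fit = sm.OLS(y, sm.add_constant(X)).fit()
print(fit.params)

# Implied long-run effect of aid: (b_aid + b_aid_l1) / (1 - b_democ_l1).
b = fit.params
print("long-run aid effect:", (b["aid"] + b["aid_l1"]) / (1 - b["democ_l1"]))
```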

Keywords: Cointegration, Democracy, Education, Foreign Aid

PDF Downloads: 2073
1962 Thermal Effect on Wave Interaction in Composite Structures

Authors: R. K. Apalowo, D. Chronopoulos, V. Thierry

Abstract:

There exists a wide range of failure modes in composite structures due to the increased usage of these structures, especially in the aerospace industry. Moreover, the temperature-dependent wave response of composite and layered structures has been continuously, though still to a limited extent, studied in the last decade, mainly due to the broad operating temperature range of aerospace structures. A wave finite element (WFE) and finite element (FE) based computational method is presented by which the temperature-dependent wave dispersion characteristics and interaction phenomena in composite structures can be predicted. Initially, the temperature-dependent mechanical properties of the panel in the range of -100 °C to 150 °C are measured experimentally using Thermal Mechanical Analysis (TMA). The temperature-dependent wave dispersion characteristics of each waveguide of the structural system, which is discretized as a number of waveguides coupled by a coupling element, are calculated using the WFE approach. The wave scattering properties, as a function of temperature, are determined by coupling the WFE wave characteristics models of the waveguides with a full FE model of the coupling element in which the defect is included. Numerical case studies are exhibited for two waveguides coupled through a coupling element.

Keywords: Temperature dependent mechanical characteristics, wave propagation properties, damage detection, wave finite element, composite structure.

PDF Downloads: 1204
1961 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures

Authors: Adriano Z. Zambom, Preethi Ravikumar

Abstract:

One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effects of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators such as the Loess is compared to estimators that assume additivity in several situations, including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with regard to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, which is computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure and the selected variables are identified.
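
The backward-elimination idea can be sketched as follows; note that this minimal version uses an ordinary least-squares fit as a stand-in for the paper's additive/nonparametric estimators, with the standard AIC form n·log(RSS/n) + 2k, and all data are synthetic.

```python
import numpy as np

def aic_ols(X, y):
    """AIC of a least-squares fit with an intercept: n*log(RSS/n) + 2*(p+1)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def backward_elimination(X, y, names):
    """Repeatedly drop the variable whose removal lowers AIC the most."""
    keep = list(range(X.shape[1]))
    best = aic_ols(X[:, keep], y)
    while len(keep) > 1:
        trials = [(aic_ols(X[:, [k for k in keep if k != j]], y), j) for j in keep]
        a, j = min(trials)
        if a >= best:            # no removal improves AIC -> stop
            break
        best, keep = a, [k for k in keep if k != j]
    return [names[k] for k in keep]

# Toy data: x0 and x2 matter, x1 and x3 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 2 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=200)
print(backward_elimination(X, y, ["x0", "x1", "x2", "x3"]))
```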

Keywords: Additive models, local polynomial regression, residuals, mean square error, variable selection.

PDF Downloads: 1008
1960 Defect Cause Modeling with Decision Tree and Regression Analysis

Authors: B. Bakır, İ. Batmaz, F. A. Güntürkün, İ. A. İpekçi, G. Köksal, N. E. Özdemirel

Abstract:

The main aim of this study is to identify the most influential variables that cause defects in the items produced by a casting company located in Turkey. To this end, one of the items produced by the company with a high defective percentage rate is selected. Two approaches, regression analysis and decision trees, are used to model the relationship between process parameters and defect types. Although the logistic regression models failed, the decision tree model gives meaningful results. Based on these results, it can be claimed that the decision tree approach is a promising technique for determining the most important process variables.
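
A minimal sketch of the decision-tree modelling step, using scikit-learn's CART implementation rather than the C5.0 algorithm named in the keywords; the process parameters and defect labels are synthetic placeholders.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in for casting process records: two parameters drive the defect.
rng = np.random.default_rng(3)
n = 400
data = pd.DataFrame({
    "pour_temp":  rng.normal(1450, 25, n),   # degC (placeholder parameter)
    "mold_humid": rng.uniform(2, 8, n),      # %    (placeholder parameter)
    "line_speed": rng.uniform(0.5, 2.0, n),  #      (placeholder parameter)
})
defect = ((data["pour_temp"] < 1430) | (data["mold_humid"] > 6.5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
tree.fit(data, defect)

# Human-readable rules and relative importance of each process parameter.
print(export_text(tree, feature_names=list(data.columns)))
print(dict(zip(data.columns, tree.feature_importances_.round(2))))
```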

Keywords: Casting industry, decision tree algorithm C5.0, logistic regression, quality improvement.

PDF Downloads: 2513
1959 A Novel Approach to Asynchronous State Machine Modeling on Multisim for Avoiding Function Hazards

Authors: L. Parisi, D. Hamili, N. Azlan

Abstract:

The aim of this study was to design and simulate a particular type of Asynchronous State Machine (ASM), namely a ‘traffic light controller’ (TLC), operated at a frequency of 0.5 Hz. The design task involved two main stages: firstly, designing a 4-bit binary counter using J-K flip-flops as the timing signal and, subsequently, obtaining the digital logic by deploying the ASM design process. The TLC was designed such that it showed a sequence of three different colours, i.e. red, yellow and green, corresponding to set thresholds, by deploying the least possible number of AND, OR and NOT gates. The software Multisim was used to design the circuit and simulate it for troubleshooting, so that it displays the output sequence of the three colours on the traffic light in the correct order. A clock signal and an asynchronous 4-bit binary counter designed with J-K flip-flops, along with the ASM, were used to complete this sequence, which was programmed to repeat indefinitely. Eventually, the circuit was debugged and optimized, displaying the correct waveforms of the three outputs through the logic analyser. However, hazards occurred when the frequency was increased to 10 MHz. This was attributed to delays in the feedback being too high.
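
A minimal behavioural sketch (in Python, not Multisim) of the counter-driven light sequence at the 0.5 Hz clock; the dwell times per colour are assumptions, since the actual thresholds are not given in the abstract.

```python
from itertools import cycle

# Assumed green/yellow/red dwell times in 0.5 Hz clock ticks (2 s per tick);
# the thresholds used in the paper are not stated in the abstract.
SEQUENCE = [("green", 4), ("yellow", 1), ("red", 4)]

def traffic_light(n_ticks):
    """Yield (tick index, lamp) for each tick of the 0.5 Hz clock."""
    phases = cycle(SEQUENCE)
    colour, dwell = next(phases)
    for tick in range(n_ticks):
        yield tick, colour
        dwell -= 1
        if dwell == 0:
            colour, dwell = next(phases)

for tick, lamp in traffic_light(12):
    # tick % 16 mimics the 4-bit binary counter state
    print(f"t = {tick * 2:2d} s  counter = {tick % 16:04b}  lamp = {lamp}")
```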

Keywords: Asynchronous State Machine, Traffic Light Controller, Circuit Design, Digital Electronics.

PDF Downloads: 3236
1958 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem based on nonlinear modeling of spikes using an exponential autoregressive model. The genetic algorithm is utilized for model parameter estimation. In this regard, selected model coefficients are used as features for sorting purposes. For optimal selection of model coefficients, a self-organizing feature map is used. The results show that modeling spikes with the nonlinear autoregressive model outperforms its linear counterpart. Also, the features extracted from the coefficients of the exponential autoregressive model are better than wavelet-based features and give more compact and well-separated clusters. In the case of spikes that differ in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method can obtain well-separated clusters, which removes the necessity of applying complex classifiers.
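
A minimal sketch of fitting an exponential autoregressive (ExpAR) model to a waveform: the nonlinearity parameter is found by grid search and the remaining coefficients by least squares, standing in for the genetic-algorithm estimation of the paper; the ExpAR form and the toy spike are assumptions, and the self-organizing-map feature selection is not reproduced.

```python
import numpy as np

def expar_design(x, p, gamma):
    """Regressor matrix for an ExpAR(p) model:
    x_t = sum_i (phi_i + pi_i * exp(-gamma * x_{t-1}^2)) * x_{t-i} + e_t."""
    rows = []
    for t in range(p, len(x)):
        wgt = np.exp(-gamma * x[t - 1] ** 2)
        lags = x[t - p:t][::-1]            # x_{t-1}, ..., x_{t-p}
        rows.append(np.concatenate([lags, wgt * lags]))
    return np.array(rows), x[p:]

def fit_expar(x, p=2, gammas=np.logspace(-2, 2, 40)):
    """Least squares for (phi, pi) at each gamma; keep the gamma with lowest RSS."""
    best = None
    for g in gammas:
        A, y = expar_design(x, p, g)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = np.sum((y - A @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, g, coef)
    return best  # (rss, gamma, [phi_1..phi_p, pi_1..pi_p])

# Toy waveform standing in for an aligned action potential.
t = np.linspace(0, 1, 200)
spike = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.4 * np.exp(-((t - 0.45) / 0.1) ** 2)
rss, gamma, coef = fit_expar(spike)
print(f"gamma = {gamma:.3g}, coefficients = {np.round(coef, 3)}")
```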

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.

PDF Downloads: 1762
1957 In Vitro Study of Coded Transmission in Synthetic Aperture Ultrasound Imaging Systems

Authors: Ihor Trots, Yuriy Tasinkevych, Andrzej Nowicki, Marcin Lewandowski

Abstract:

In the paper, a study of the synthetic transmit aperture method applying Golay coded transmission for medical ultrasound imaging is presented. Longer coded excitation allows the total energy of the transmitted signal to be increased without increasing the peak pressure. Moreover, the signal-to-noise ratio and penetration depth are improved while maintaining high ultrasound image resolution. In the work, a 128-element linear transducer array with 0.3 mm inter-element spacing, excited by one cycle and by 8- and 16-bit Golay coded sequences at a nominal frequency of 4 MHz, was used. To generate a spherical wave covering the full image region, a single-element transmission aperture was used and all the elements received the echo signals. A comparison of 2D ultrasound images of a tissue-mimicking phantom and in vitro measurements of beef liver is presented to illustrate the benefits of the coded transmission. The results were obtained using the synthetic aperture algorithm with transmit and receive signal correction based on a single-element directivity function.
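
A minimal sketch of the coded-excitation ingredient: generating a 16-bit complementary Golay pair by the standard append/negate recursion and verifying that the summed autocorrelations cancel all sidelobes, which is the property exploited for increasing transmitted energy without raising the peak pressure.

```python
import numpy as np

def golay_pair(n_doublings):
    """Build a complementary Golay pair of length 2**n_doublings by the
    standard recursion a' = [a, b], b' = [a, -b]."""
    a, b = np.array([1]), np.array([1])
    for _ in range(n_doublings):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(4)                       # 16-bit pair, as used in the paper
ra = np.correlate(a, a, mode="full")
rb = np.correlate(b, b, mode="full")

# The sum of the two autocorrelations is 2N at zero lag and zero elsewhere,
# i.e. range sidelobes cancel after combining the two coded transmissions.
print(ra + rb)
```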

Keywords: Golay coded sequences, radiation pattern, signal processing, synthetic aperture, ultrasound imaging.

PDF Downloads: 1667
1956 Development of a Vegetation Searching System

Authors: Rattanathip Rattanachai, Kunyanuth Kularbphettong

Abstract:

This paper describes the development of a vegetation searching system based on a web application, in the case of Suan Sunandha Rajabhat University. The model was developed with PHP, JavaScript and the MySQL database system, and it was designed to support searching for endemic and rare species of trees on a Web site. We describe the design methods and functional components of this prototype. To evaluate system performance, questionnaires on system usability and Black Box Testing were used to measure expert and user satisfaction. The results were satisfactory, as follows: means for experts and users were 4.30 and 4.50, and standard deviations for experts and users were 0.61 and 0.73, respectively. Further analysis showed that the quality of the plant searching Web site was also at a good level.

Keywords: Endemic species, Vegetation, Web based System, and Black Box Testing.

PDF Downloads: 1738
1955 On Figuring the City Characteristics and Landscape in Overall Urban Design: A Case Study in Xiangyang Central City, China

Authors: Guyue Zhu, Liangping Hong

Abstract:

Chinese overall urban design faces a large number of problems, such as the neglect of urban characteristics, generalization of content, and difficulty in implementation. Focusing on these issues, this paper proposes the main points of shaping urban characteristics in overall urban design: it focuses on core problems in city function and scale, landscape pattern, historical culture, social resources and modern city style, and identifies the urban characteristic genes. We then put forward “core problem location and characteristic gene enhancement” as an overall urban design technique. Firstly, based on the main problems in urban space as a whole and with operability as the goal, the method extracts the key genes and integrates them into a multi-dimensional system in a targeted manner. Secondly, a hierarchical management and guidance system is established that is in line with administrative management. Finally, by converting the results, an action plan is drawn up that can be implemented dynamically. Based on the above idea and method, a practical exploration has been performed in the case of the Xiangyang central city.

Keywords: City characteristics, overall urban design, planning implementation, Xiangyang central city.

PDF Downloads: 936
1954 Eclectic Rule-Extraction from Support Vector Machines

Authors: Nahla Barakat, Joachim Diederich

Abstract:

Support vector machines (SVMs) have shown superior performance compared to other machine learning techniques, especially in classification problems. Yet one limitation of SVMs is the lack of an explanation capability, which is crucial in some applications, e.g. in the medical and security domains. In this paper, a novel approach for eclectic rule-extraction from support vector machines is presented. This approach utilizes the knowledge acquired by the SVM and represented in its support vectors as well as the parameters associated with them. The approach includes three stages: training, propositional rule-extraction and rule quality evaluation. Results from four different experiments have demonstrated the value of the approach for extracting comprehensible rules of high accuracy and fidelity.
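
A minimal surrogate-style sketch in the spirit of the approach (not the paper's exact algorithm): an SVM is trained, its support vectors are relabelled by the SVM itself, propositional rules are read off a shallow decision tree fitted to them, and fidelity to the SVM is reported; the dataset and depth limit are assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_breast_cancer()
X, y = data.data, data.target

svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Rule-learning set: the support vectors, labelled by the SVM itself.
sv = svm.support_vectors_
sv_labels = svm.predict(sv)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(sv, sv_labels)
print(export_text(tree, feature_names=list(data.feature_names)))

# Fidelity: how often the extracted rules agree with the SVM on the full data.
print("fidelity to the SVM:", np.mean(tree.predict(X) == svm.predict(X)))
```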

Keywords: Data mining, hybrid rule-extraction algorithms, medical diagnosis, SVMs

PDF Downloads: 1697
1953 Evaluation of Fitts’ Law Index of Difficulty Formulation for Screen Size Variations

Authors: Hidehiko Okada, Takayuki Akiba

Abstract:

It is well known, as Fitts’ law, that the time for a user to point at a target on a GUI screen can be modeled as a linear function of the “index of difficulty” (ID). In this paper, the authors investigate whether the traditional ID formulation is appropriate independently of device screen sizes. The results of our experiment reveal that the ID formulation may not consistently capture actual difficulty: users’ pointing performances are not consistent among pointing target variations whose indices of difficulty are consistent. The term A/W may not be appropriate because it causes the observed inconsistency. Based on this finding, the authors then evaluate the applicability of possible models other than Fitts’ one. Multiple regression models are found to be able to appropriately represent the effects of target design variations. The authors next attempt to improve the definition of ID in Fitts’ model. Our idea is to raise the size or the distance values depending on the screen size. The modified model is found to fit well to the users’ pointing data, which supports the idea.
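
A minimal sketch of the underlying Fitts’-law fit, MT = a + b·ID with the Shannon formulation ID = log2(A/W + 1); the pointing data below are placeholders, not the experiment's measurements.

```python
import numpy as np

# Placeholder pointing data: target distance A and width W in pixels,
# movement time MT in milliseconds.
A  = np.array([128, 256, 512, 128, 256, 512, 1024, 1024], dtype=float)
W  = np.array([ 16,  16,  16,  64,  64,  64,   16,   64], dtype=float)
MT = np.array([420, 530, 650, 300, 380, 470,  760,  560], dtype=float)

ID = np.log2(A / W + 1)                   # Shannon formulation of Fitts' ID
b, a = np.polyfit(ID, MT, 1)              # MT = a + b * ID
pred = a + b * ID
r2 = 1 - np.sum((MT - pred) ** 2) / np.sum((MT - MT.mean()) ** 2)
print(f"MT = {a:.0f} + {b:.0f} * ID   (R^2 = {r2:.2f})")
```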

Keywords: Fitts’ law, pointing device, small screen, touch user interface, usability.

PDF Downloads: 1622
1952 Study on Construction of 3D Topography by UAV-Based Images

Authors: Yun-Yao Chi, Chieh-Kai Tsai, Dai-Ling Li

Abstract:

In this paper, a method for fast 3D topography modeling using high-resolution camera images is studied, based on the characteristics of Unmanned Aerial Vehicle (UAV) systems for low-altitude aerial photogrammetry and the need for three-dimensional (3D) urban landscape modeling. Firstly, a special overlapping-image scheme for the existing high-resolution digital camera is designed by reconstructing and analyzing the auto-flying paths of the UAV, which improves the self-calibration function to achieve high-precision imaging in software and further increases the resolution of the imaging system. Secondly, multi-angle images, including vertical and oblique images acquired by the UAV system, are used for detailed measurement of urban land surfaces and texture extraction. Finally, aerial photography and 3D topography construction are carried out on the campus of Chang-Jung University and in the Guerin district of Tainan, Taiwan, providing a validation model for the construction of 3D topography from combined UAV-based camera images. The results demonstrate that a UAV system for low-altitude aerial photogrammetry can be used for 3D topography production, and the technical solution in this paper offers a new and fast plan for the 3D expression of the city landscape, fine modeling and visualization.

Keywords: 3D, topography, UAV, images.

PDF Downloads: 797
1951 Thermal Characterization of Graphene Oxide-Epoxy Nanocomposites Produced by Aqueous Emulsion

Authors: H. A. Brandão Cordeiro, M. G. Bocardo, N. C. Penteado, V. T. de Moraes, S. M. Giampietri Lebrão, G. W. Lebrão

Abstract:

The present study aimed to obtain a nanocomposite of epoxy resin reinforced with graphene oxide (GO), for aerospace application, produced by aqueous emulsion. Test specimens were produced with 0.00 wt%, 0.10 wt%, 0.25 wt% and 0.50 wt% of nanoparticles by weight, to check the influence of the nanoparticle content on the final quality of the obtained product. The results were validated by thermal characterization using differential scanning calorimetry (DSC). The nanocomposite reinforced with 0.10 wt% of GO showed the best results, with an average glass transition temperature differing by 2 °C from that of the pure resin.

Keywords: Aqueous emulsion, graphene, nanocomposites, thermal characterization.

PDF Downloads: 841
1950 On Adaptive Optimization of Filter Performance Based on Markov Representation for Output Prediction Error

Authors: Hong Son Hoang, Remy Baraille

Abstract:

This paper addresses the problem of how one can improve the performance of a non-optimal filter. First, the theoretical question of a dynamical representation for a given time-correlated random process is studied. It is demonstrated that for a wide class of random processes having a canonical form, there exists an equivalent dynamical system, in the sense that its output has the same covariance function. It is shown that the dynamical approach is more effective for simulating and estimating Markov and non-Markovian random processes and is computationally less demanding, especially as the dimension of the simulated processes increases. Numerical examples and estimation problems in low-dimensional systems are given to illustrate the advantages of the approach. A very useful application of the proposed approach is shown for the problem of state estimation in very high dimensional systems. Here, a modified filter for data assimilation in an oceanic numerical model is presented, which proves to be very efficient due to the introduction of a simple Markovian structure for the output prediction error process and adaptive tuning of some parameters of the Markov equation.
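
A minimal scalar sketch of the idea: a process with exponential covariance σ²·exp(−|τ|/T) admits the first-order Markov (AR(1)) representation x_{k+1} = a·x_k + w_k with a = exp(−Δt/T), and the simulated output reproduces the target covariance; the parameter values are illustrative.

```python
import numpy as np

sigma2, T_corr, dt, n = 1.0, 5.0, 1.0, 50_000
a = np.exp(-dt / T_corr)          # Markov (AR(1)) coefficient
q = sigma2 * (1.0 - a**2)         # driving-noise variance giving stationary variance sigma2

rng = np.random.default_rng(0)
x = np.empty(n)
x[0] = rng.normal(scale=np.sqrt(sigma2))      # start from the stationary distribution
for k in range(n - 1):                        # x_{k+1} = a x_k + w_k
    x[k + 1] = a * x[k] + rng.normal(scale=np.sqrt(q))

# Empirical covariance versus the target sigma2 * exp(-lag*dt/T).
for lag in (0, 1, 2, 5, 10):
    emp = np.mean(x[: n - lag] * x[lag:])
    print(f"lag {lag:2d}: empirical {emp:.3f}   target {sigma2 * np.exp(-lag * dt / T_corr):.3f}")
```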

Keywords: Statistical simulation, canonical form, dynamical system, Markov and non-Markovian processes, data assimilation.

PDF Downloads: 1292
1949 Multi-Objective Multi-Mode Resource-Constrained Project Scheduling Problem by Preemptive Fuzzy Goal Programming

Authors: Phruksaphanrat B.

Abstract:

This research proposes a preemptive fuzzy goal programming model for the multi-objective multi-mode resource-constrained project scheduling problem. The objectives of the problem are minimization of the total time and the total cost of the project. The objective in a multi-mode resource-constrained project scheduling problem is often the minimization of makespan. However, both time and cost should be considered at the same time, with different levels of priority. Moreover, not all elements of the cost function of a project are included in the conventional cost objective function. An incomplete total project cost causes an error in finding the project scheduling time. In this research, preemptive fuzzy goal programming is presented to solve the multi-objective multi-mode resource-constrained project scheduling problem. It can find the compromise solution of the problem and is also flexible in adjusting to find a variety of alternative solutions.
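
A toy sketch of the preemptive idea only (resource constraints and the full scheduling problem are omitted): every mode assignment of a two-activity serial project is enumerated, time and cost are mapped to linear fuzzy memberships, and the time goal is satisfied with strict priority over the cost goal; all numbers are illustrative assumptions.

```python
from itertools import product

# (duration, cost) of each execution mode for two serial activities (illustrative).
modes = {
    "A": [(4, 100), (2, 180)],
    "B": [(6, 120), (3, 220)],
}

def membership(value, aspiration, upper):
    """Linear fuzzy membership: 1 at/below the aspiration level, 0 at/above 'upper'."""
    return max(0.0, min(1.0, (upper - value) / (upper - aspiration)))

best = None
for mA, mB in product(range(2), range(2)):
    time = modes["A"][mA][0] + modes["B"][mB][0]
    cost = modes["A"][mA][1] + modes["B"][mB][1]
    mu_time = membership(time, 5, 10)      # time goal: fully satisfied at <= 5
    mu_cost = membership(cost, 220, 400)   # cost goal: fully satisfied at <= 220
    key = (mu_time, mu_cost)               # preemptive ordering: time first, then cost
    if best is None or key > best[0]:
        best = (key, (mA, mB), time, cost)

(mu_t, mu_c), assignment, time, cost = best
print(f"modes = {assignment}, time = {time}, cost = {cost}, "
      f"memberships = ({mu_t:.2f}, {mu_c:.2f})")
```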

Keywords: Multi-mode resource constrained project scheduling problem, Fuzzy set, Goal programming, Preemptive fuzzy goal programming.

PDF Downloads: 2753
1948 Rational Chebyshev Tau Method for Solving Natural Convection of Darcian Fluid About a Vertical Full Cone Embedded in Porous Media With a Prescribed Wall Temperature

Authors: Kourosh Parand, Zahra Delafkar, Fatemeh Baharifard

Abstract:

The problem of natural convection about a cone embedded in a porous medium at local Rayleigh numbers, based on the boundary layer approximation and Darcy's law, has been studied before. Similarity solutions for a full cone with prescribed wall temperature or surface heat flux boundary conditions, which are power functions of the distance from the vertex of the inverted cone, lead to a third-order nonlinear differential equation. In this paper, an approximate method for solving higher-order ordinary differential equations is proposed. The approach is based on a rational Chebyshev Tau (RCT) method. The operational matrices of the derivative and product of rational Chebyshev (RC) functions are presented. These matrices, together with the Tau method, are utilized to reduce the solution of the higher-order ordinary differential equations to the solution of a system of algebraic equations. We also compare this work with others and show that the present method is applicable.
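
For reference, the commonly used form of the rational Chebyshev basis on the semi-infinite domain (Boyd's algebraic mapping) is sketched below; the map parameter L and the expansion are standard notation and are assumptions here, not taken from the paper.

```latex
% Rational Chebyshev functions on the semi-infinite domain, map parameter L > 0:
\[
  R_n(y) \;=\; T_n\!\left(\frac{y-L}{y+L}\right), \qquad y \in [0,\infty),
  \qquad
  u(y) \;\approx\; \sum_{n=0}^{N} a_n\, R_n(y),
\]
% where T_n is the Chebyshev polynomial of the first kind and the algebraic map
% y = L(1+x)/(1-x) carries x in [-1, 1] onto [0, infinity).
```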

Keywords: Tau method, semi-infinite, nonlinear ODE, rational Chebyshev, porous media.

PDF Downloads: 1929
1947 Analysis of Knowledge Management Trend by Bibliometric Approach

Authors: Hsu-Hao Tsai, Jiann-Min Yang

Abstract:

The analysis concentrates on the productivity trend of the knowledge management literature, retrieved with the subject “knowledge management” from the SSCI database. The purpose of the analysis is to summarize the trend information for knowledge management researchers, since core knowledge will be concentrated in core categories. The results indicate that the productivity of literature on “knowledge management” is still increasing strongly, and the trend is demonstrated by different categories, including author, country/territory, institution name, document type, language, publication year, and subject area. By focusing on the right categories, one can capture the core research information. This implies that the phenomenon “success breeds success” is more common in higher quality publications.
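
Since the keywords cite Lotka's law, a minimal sketch of the corresponding productivity count is given below; the author counts are toy placeholders for the retrieved SSCI records.

```python
from collections import Counter

# Toy productivity counts standing in for authors of SSCI "knowledge management"
# papers: {author: number of papers}. Real counts would come from the records.
papers_per_author = {**{f"A{i}": 1 for i in range(8)}, "B1": 2, "B2": 2, "C1": 3}

authors_with_n = Counter(papers_per_author.values())

# Lotka's law predicts: (#authors with n papers) ~ C / n^2.
c = authors_with_n[1]          # calibrate C from the single-paper authors
for n in sorted(authors_with_n):
    print(f"{n} paper(s): observed {authors_with_n[n]:2d}   Lotka (C/n^2) predicts {c / n**2:.1f}")
```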

Keywords: Knowledge Management, SSCI, Bibliometric, Lotka's Law

PDF Downloads: 1231
1946 A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)

Authors: Akram M. Zeki, Azizah A. Manaf

Abstract:

The Least Significant Bit (LSB) technique is the earliest developed technique in watermarking, and it is also the simplest, most direct and most common technique. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, an intermediate significant bit (ISB) has been used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels by new pixels that can protect the watermark data against attacks while keeping the new pixels very close to the original pixels, in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermark pixel according to the range of each bit-plane.
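
A minimal sketch contrasting LSB embedding with embedding in an intermediate bit-plane; plain bit replacement is used here, whereas the paper additionally adjusts the new pixel within the bit-plane's range to stay close to the original, a step this sketch omits.

```python
import numpy as np

def embed_bitplane(pixels, bits, plane):
    """Replace bit `plane` (0 = LSB) of each 8-bit pixel with a watermark bit."""
    mask = 1 << plane
    cleared = pixels.astype(np.uint8) & np.uint8(255 - mask)   # clear the chosen bit
    return cleared | (bits.astype(np.uint8) << np.uint8(plane))

def extract_bitplane(pixels, plane):
    return (pixels >> plane) & 1

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=16, dtype=np.uint8)   # toy cover pixels
mark  = rng.integers(0, 2,   size=16, dtype=np.uint8)   # toy watermark bits

lsb = embed_bitplane(cover, mark, plane=0)   # classic LSB embedding
isb = embed_bitplane(cover, mark, plane=3)   # an intermediate significant bit

print("max pixel change, LSB:", int(np.max(np.abs(lsb.astype(int) - cover.astype(int)))))
print("max pixel change, ISB:", int(np.max(np.abs(isb.astype(int) - cover.astype(int)))))
print("watermark recovered  :", bool(np.all(extract_bitplane(isb, 3) == mark)))
```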

Keywords: Watermarking, LSB, ISB, Robustness.

PDF Downloads: 1700
1945 Accounting Policies in Polish and International Legal Regulations

Authors: Piotr Prewysz Kwinto, Grażyna Voss

Abstract:

Accounting policies are a set of solutions compliant with legal regulations that an entity selects and adopts, and which guarantee a proper quality of financial statements. Those solutions may differ depending on whether the entity adopts national or international accounting standards. The aim of this article is to present accounting principles (policies) in Polish and international legal regulations and their adoption in selected Polish companies listed on the Warsaw Stock Exchange. The research method adopted in this work is the analysis and evaluation of legal conditions in Polish companies.

Keywords: Accounting policies, International Financial Reporting Standards, Financial statement, Method of measuring.

PDF Downloads: 3113
1944 Non-destructive Watermelon Ripeness Determination Using Image Processing and Artificial Neural Network (ANN)

Authors: Shah Rizam M. S. B., Farah Yasmin A.R., Ahmad Ihsan M. Y., Shazana K.

Abstract:

Agricultural products are in increasing demand in the market today. To increase productivity, automation of their production will be very helpful. The purpose of this work is to measure and determine the ripeness and quality of watermelons. The textures on the watermelon skin are captured using a digital camera, and these images are filtered using image processing techniques. The gathered information is then used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced a percentage accuracy of 86.51%, measured at 32 hidden units with a balanced percentage rate of the training dataset.
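
A minimal sketch of the pipeline, assuming BT.601 YCbCr conversion and simple per-channel mean features; the skin-texture patches are synthetic placeholders and scikit-learn's MLP stands in for the paper's ANN (the 32 hidden units echo the reported best model).

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def rgb_to_ycbcr(rgb):
    """ITU-R BT.601 full-range conversion; rgb is an (..., 3) array in [0, 255]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

# Synthetic stand-ins for skin-texture patches: "ripe" ones drawn slightly darker.
rng = np.random.default_rng(0)
def make_patch(ripe):
    base = np.array([70, 130, 60]) if ripe else np.array([110, 160, 90])
    return np.clip(base + rng.normal(0, 15, size=(32, 32, 3)), 0, 255)

labels = np.array([1] * 40 + [0] * 40)
patches = [make_patch(r) for r in labels]
feats = np.array([rgb_to_ycbcr(p).reshape(-1, 3).mean(axis=0) for p in patches])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(feats, labels)
print("training accuracy:", clf.score(feats, labels))
```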

Keywords: Artificial Neural Network (ANN), Digital Image Processing, YCbCr Colour Space, Watermelon Ripeness.

PDF Downloads: 2939
1943 A Fuzzy MCDM Approach for Health-Care Waste Management

Authors: Mehtap Dursun, E. Ertugrul Karsak, Melis Almula Karadayi

Abstract:

The management of health-care wastes is one of the most important problems in Istanbul, a city with more than 12 million inhabitants, as it is in most developing countries. Negligence in appropriate treatment and final disposal of health-care wastes can lead to adverse impacts on public health and the environment. This paper employs a fuzzy multi-criteria group decision making approach, based on the principles of fusion of fuzzy information, the 2-tuple linguistic representation model, and the technique for order preference by similarity to ideal solution (TOPSIS), to evaluate health-care waste (HCW) treatment alternatives for Istanbul. The evaluation criteria are determined employing the nominal group technique (NGT), a method of systematically developing a consensus of group opinion. The employed method is apt to manage information assessed using multigranular linguistic information in a decision making problem with multiple information sources. The decision making framework employs the ordered weighted averaging (OWA) operator, which encompasses several operators, as the aggregation operator, since it can implement different aggregation rules by changing the order weights. The aggregation process is based on the unification of information by means of fuzzy sets on a basic linguistic term set (BLTS). Then, the unified information is transformed into linguistic 2-tuples in a way that rectifies the loss-of-information problem of other fuzzy linguistic approaches.
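
A minimal sketch of the final TOPSIS ranking step only, on an illustrative crisp decision matrix; the fusion of multigranular linguistic assessments, the OWA aggregation and the 2-tuple transformation described above are not reproduced, and the alternatives, weights and scores are assumptions.

```python
import numpy as np

# Illustrative decision matrix: rows = HCW treatment alternatives, columns = criteria.
alternatives = ["incineration", "steam sterilisation", "microwave", "landfill"]
X = np.array([
    [7.0, 4.0, 8.0, 3.0],
    [8.0, 7.0, 6.0, 6.0],
    [6.0, 8.0, 5.0, 7.0],
    [3.0, 6.0, 2.0, 8.0],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])
benefit = np.array([True, True, True, False])   # last criterion is a cost criterion

# 1) vector-normalise and weight the matrix
V = weights * X / np.linalg.norm(X, axis=0)
# 2) ideal and anti-ideal solutions
ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
# 3) closeness coefficient to the ideal solution
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name:20s} {c:.3f}")
```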

Keywords: Group decision making, health care waste management, multi-criteria decision making, OWA, TOPSIS, 2-tuple linguistic representation

PDF Downloads: 2394
1942 Simulation of Fluid Flow and Heat Transfer in Inclined Cavity using Lattice Boltzmann Method

Authors: Arash Karimipour, A. Hossein Nezhad, E. Shirani, A. Safaei

Abstract:

In this paper, the Lattice Boltzmann Method (LBM) is used to study laminar flow with mixed convection heat transfer inside a two-dimensional inclined lid-driven rectangular cavity with aspect ratio AR = 3. The bottom wall of the cavity is maintained at a lower temperature than the top lid, and its vertical walls are assumed insulated. The motion of the top lid drives the fluid motion inside the cavity. Inclination of the cavity causes the horizontal and vertical components of velocity to be affected by the buoyancy force. To include this effect, the calculation procedure of macroscopic properties by LBM is changed and the collision term of the Boltzmann equation is modified. A computer program is developed to simulate this problem using the BGK model of the lattice Boltzmann method. The effects of variations of the Richardson number and inclination angle on the thermal and flow behavior of the fluid inside the cavity are investigated. The results are presented as velocity and temperature profiles, stream function contours and isotherms. It is concluded that LBM has good potential to simulate mixed convection heat transfer problems.
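
A minimal sketch of the two pieces the study modifies: the D2Q9 equilibrium distribution and a BGK collision step with a simple first-order Boussinesq buoyancy forcing split by the inclination angle; streaming, boundary conditions and the temperature equation are omitted, and all parameter values are illustrative.

```python
import numpy as np

# D2Q9 lattice velocities and weights
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """f_i^eq = w_i rho (1 + 3 e_i.u + 4.5 (e_i.u)^2 - 1.5 u.u); u has shape (ny, nx, 2)."""
    eu = np.einsum("id,yxd->yxi", e, u)
    usq = np.sum(u ** 2, axis=-1, keepdims=True)
    return w * rho[..., None] * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * usq)

def bgk_collide(f, rho, u, T, tau, g_beta, phi):
    """One BGK collision with a first-order Boussinesq forcing term; the buoyancy
    g*beta*(T - T_ref) is projected on the cavity axes through the inclination angle phi."""
    buoy = g_beta * (T - T.mean())
    F = np.stack([buoy * np.sin(phi), buoy * np.cos(phi)], axis=-1)   # (ny, nx, 2)
    forcing = 3.0 * w * rho[..., None] * np.einsum("id,yxd->yxi", e, F)
    return f - (f - equilibrium(rho, u)) / tau + forcing

# Tiny demonstration: a 4x4 lattice at rest with a vertical temperature gradient.
ny = nx = 4
rho = np.ones((ny, nx))
u = np.zeros((ny, nx, 2))
T = np.linspace(1.0, 0.0, ny)[:, None] * np.ones((ny, nx))
f = equilibrium(rho, u)
f = bgk_collide(f, rho, u, T, tau=0.8, g_beta=1e-3, phi=np.radians(30))
print(f.shape, f.sum())   # total mass is unchanged: the forcing sums to zero over directions
```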

Keywords: Gravity, inclined lid-driven cavity, lattice Boltzmann method, mixed convection.

PDF Downloads: 1942
1941 MR-Implantology: Exploring the Use for Mixed Reality in Dentistry Education

Authors: Areej R. Banjar, Abraham G. Campbell

Abstract:

The use of Mixed Reality (MR) in teaching and training is growing in popularity and can improve students' ability to perform technical procedures. This paper outlines the creation of an interactive educational MR 3D application that aims to improve the quality of instruction for dentistry students. The application, called "MR-Implantology", aims to teach and train dentistry students in single dental implant placement. MR-Implantology uses cone-beam computed tomography (CBCT) images as the source for 3D dental models that dentistry students can freely manipulate within a 3D MR world to aid their learning process.

Keywords: Cone-Beam Computed Tomography, dentistry education, implantology, Mixed Reality, MR.

PDF Downloads: 486
1940 Economic Evaluation of an Offshore Wind Project under Uncertainty and Risk Circumstances

Authors: Sayed Amir Hamzeh Mirkheshti

Abstract:

Offshore wind energy, as a strategic renewable energy, has been growing rapidly due to its availability, abundance and clean nature. On the other hand, the budget of such a project is considerably higher than for other renewable energies, and it takes longer to complete. Accordingly, precise estimation of time and cost is needed in order to promote awareness among developers and society and to convince them to develop this kind of energy despite its difficulties. Risks occurring during the project cause its duration and cost to change constantly. Therefore, to develop offshore wind power, it is critical to consider all potential risks that impact the project and to simulate their effect. Knowing these risks is useful for selecting the most effective response strategies, such as avoidance, transfer and acceptance, in order to decrease their probability and impact. This paper presents an evaluation of the feasibility of a 500 MW offshore wind project in the Persian Gulf and examines its situation with respect to uncertain resources and risk. The purpose of this study is to evaluate the time and cost of an offshore wind project under risk circumstances and uncertain resources by using Monte Carlo simulation. We analyzed each risk and activity along with their distribution functions and their effect on the project.
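
A minimal sketch of the Monte Carlo evaluation of project duration and cost with triangular activity distributions and Bernoulli risk events; every figure below is an illustrative placeholder, not data from the 500 MW Persian Gulf project.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Serial activity durations (months) and costs (M$) as (low, mode, high) triangular
# distributions -- purely illustrative placeholders.
activities = [
    ((6, 9, 14), (40, 55, 80)),     # foundations
    ((4, 6, 10), (90, 110, 150)),   # turbine supply and installation
    ((3, 5, 9),  (25, 35, 55)),     # grid connection
]
# Risk events: (probability, extra months, extra M$) -- illustrative.
risks = [(0.30, 3, 20), (0.15, 6, 45)]

duration = np.zeros(n)
cost = np.zeros(n)
for (dl, dm, dh), (cl, cm, ch) in activities:
    duration += rng.triangular(dl, dm, dh, n)
    cost     += rng.triangular(cl, cm, ch, n)
for p, d_extra, c_extra in risks:
    hit = rng.random(n) < p            # does the risk occur in this scenario?
    duration += hit * d_extra
    cost     += hit * c_extra

for name, x, unit in [("duration", duration, "months"), ("cost", cost, "M$")]:
    print(f"{name}: P50 = {np.percentile(x, 50):.1f} {unit}, "
          f"P90 = {np.percentile(x, 90):.1f} {unit}")
```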

Keywords: Wind energy project; uncertain resources; risks; Monte Carlo simulation.

PDF Downloads: 789