Search results for: displacement based design
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14336

896 A Software Framework for Predicting Oil-Palm Yield from Climate Data

Authors: Mohd. Noor Md. Sap, A. Majid Awan

Abstract:

Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data to find spatio-temporal patterns in climate data using kernel methods, which are well suited to complex data. The work is inspired by the notion that a non-linear transformation of the data into a high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space and therefore simplifies exploration of the associated structure in the data. Kernel methods implicitly perform such a non-linear mapping by replacing inner products with an appropriate positive definite function. We present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers, and auto-correlation in the spatial data, supporting effective and efficient analysis of the patterns and structures in the data, and can thus be used to predict oil-palm yield by analyzing the various factors affecting it.
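
As a rough illustration of the clustering core described above, the sketch below implements a weighted kernel k-means step in plain NumPy; the RBF kernel, the uniform weights, and the toy data are stand-ins, and the paper's spatial-constraint term is only hinted at in a comment rather than reproduced.

import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Pairwise RBF (Gaussian) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def weighted_kernel_kmeans(K, k, w, n_iter=50, seed=0):
    # Weighted kernel k-means on a precomputed kernel matrix K.
    # w down-weights noisy/outlying points; a spatial constraint could be
    # folded into K itself (e.g. by combining it with a neighbourhood kernel).
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(0, k, size=n)
    for _ in range(n_iter):
        dist = np.zeros((n, k))
        for c in range(k):
            mask = labels == c
            wc = w[mask]
            sw = wc.sum()
            if sw == 0:
                dist[:, c] = np.inf
                continue
            # Distance to the cluster mean in feature space, up to the
            # constant K_ii term that does not affect the argmin.
            dist[:, c] = (-2.0 * (K[:, mask] @ wc) / sw
                          + (wc @ K[np.ix_(mask, mask)] @ wc) / sw**2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Toy usage: cluster two random 2-D "climate" blobs with uniform weights.
X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 4])
labels = weighted_kernel_kmeans(rbf_kernel(X), k=2, w=np.ones(len(X)))
print(np.bincount(labels))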

Keywords: Pattern analysis, clustering, kernel methods, spatial data, crop yield

895 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Mathematical and statistical applications are being developed with ever greater complexity and accuracy, and this in turn demands more computational power to execute them quickly. Multicore environments play an important role in improving and optimizing the execution time of such applications, as they allow more parallelism inside a node. However, taking advantage of this parallelism is not easy, because problems such as core communication, data locality, memory sizes (cache and RAM), synchronization, and data dependencies in the model must be handled. These issues become more important when the goal is to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore architecture. All improvements are applied in an automatic and transparent manner with the aim of improving the tool's performance metrics. Experimental evaluations show the effectiveness of the optimized version: execution time was reduced by around 96% in the best case tested, comparing the original serial version with the automatic parallel version.
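
SYMBOL itself is parallelized with OpenMP on multicore nodes; as a language-neutral illustration of the same idea of spreading independent scenario blocks across cores, here is a minimal Python multiprocessing sketch with an invented toy loss model (the failure probability and severity are made up, not SYMBOL's model).

import multiprocessing as mp
import random

def simulate_losses(args):
    # One independent block of Monte Carlo scenarios (toy stand-in for a
    # SYMBOL-style simulation of bank failures and originated losses).
    n_scenarios, seed = args
    rng = random.Random(seed)
    losses = 0.0
    for _ in range(n_scenarios):
        if rng.random() < 0.02:          # a bank "fails" with small probability
            losses += rng.expovariate(1.0)  # random loss severity
    return losses

if __name__ == "__main__":
    total, workers = 400_000, mp.cpu_count()
    chunks = [(total // workers, seed) for seed in range(workers)]
    with mp.Pool(workers) as pool:
        partial = pool.map(simulate_losses, chunks)   # blocks run in parallel
    print("expected loss estimate:", sum(partial) / total)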

Keywords: Algorithm optimization, Bank Failures, OpenMP, Parallel Techniques, Statistical tool.

894 A Review of the Characteristics and Optimization of Optical Properties of Zirconia Ceramics for Aesthetic Dental Restorations

Authors: R. A. Shahmiri, O. C. Standard, J. N. Hart, C. C. Sorrell

Abstract:

The ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) has been used as a dental biomaterial for several decades. The strength and toughness of this material can be accounted for by its toughening mechanisms, which include transformation toughening, crack deflection, zone shielding, contact shielding, and crack bridging. Prevention of crack propagation is of critical importance in high-fatigue situations, such as those encountered in mastication and para-function. However, the poor translucence of Y-TZP in polycrystalline form is such that it may not meet the aesthetic requirements due to its white/grey appearance. To improve the optical properties of Y-TZP, more detailed study of the optical properties is required; in particular, precise evaluation of the refractive index, absorption coefficient, and scattering coefficient are necessary. The measurement of the optical parameters has been based on the assumption that light scattered from biological media is isotropically distributed over all angles. In fact, the optical behavior of real biological materials depends on the angular scattering of light due to the anisotropic nature of the materials. The purpose of the present work is to evaluate the optical properties (including color, opacity/translucence, scattering, and fluorescence) of zirconia dental ceramics and their control through modification of the chemical composition, phase composition, and surface microstructure.
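
For readers unfamiliar with the optical parameters listed above, the following small sketch applies the simple Beer-Lambert attenuation law, which is only valid under the isotropic, single-pass assumptions the abstract questions; the coefficient values are purely illustrative and not measured zirconia data.

import numpy as np

def transmitted_fraction(mu_a, mu_s, thickness_mm):
    # Beer-Lambert attenuation with total extinction mu_t = mu_a + mu_s.
    # A crude estimate only: no anisotropy, no multiple scattering.
    return np.exp(-(mu_a + mu_s) * thickness_mm)

# Illustrative coefficients (1/mm) for a dense ceramic at one wavelength.
print(transmitted_fraction(mu_a=0.02, mu_s=2.5, thickness_mm=1.0))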

Keywords: Optical properties, opacity/translucence, scattering, fluorescence, chemical composition, phase composition, surface microstructure.

893 A Framework for Improving Trade Contractors’ Productivity Tracking Methods

Authors: Sophia Hayes, Kenny L. Liang, Sahil Sharma, Austin Shema, Mahmoud Bader, Mohamed Elbarkouky

Abstract:

Despite being one of the country's most significant economic contributors, Canada's construction industry lags behind other sectors in labor productivity improvements. The industry is highly collaborative, as a general contractor hires trade contractors to perform most of a project's work; consequently, low productivity from one contractor can have a domino effect on the shared success of a project. To address this issue and encourage trade contractors to improve their productivity tracking methods, an investigative study was conducted on the productivity views and tracking methods of various trade contractors. Additionally, an in-depth review was done of four standard tracking methods used in the construction industry: cost codes, benchmarking, the job productivity measurement (JPM) standard, and WorkFace Planning (WFP). These four methods were used as a baseline for comparing the trade contractors' responses, determining gaps within their current tracking methods, and making improvement recommendations. Fifteen interviews were conducted with different trades to analyze how contractors value productivity. The results indicate gaps within the construction industry in understanding the purpose and value of productivity tracking. The trade contractors also shared their current productivity tracking systems, which were then compared to the four standard methods. Gaps were identified in their tracking methods, and recommendations on how to improve productivity tracking were made using a framework based on the type of trade.

Keywords: Trade contractors’ productivity, productivity tracking, cost codes, benchmarking, job productivity measurement (JPM), WorkFace Planning (WFP).

892 Safe and Efficient Deep Reinforcement Learning Control Model: A Hydroponics Case Study

Authors: Almutasim Billa A. Alanazi, Hal S. Tharp

Abstract:

Safe performance and efficient energy consumption are essential factors in designing a control system. This paper presents a reinforcement learning (RL) model that can be applied to control applications to improve safety and reduce energy consumption. Because hardware constraints and environmental disturbances are imprecise and unpredictable, conventional control methods may not always be effective in optimizing control designs. RL, however, has demonstrated its value in several artificial intelligence (AI) applications, especially in the field of control systems. The proposed model intelligently monitors a system's success by observing the rewards from the environment, with positive rewards counting as a success when the controlled reference is within the desired operating zone. The model can thus determine whether the system is safe to continue operating based on the designer/user specifications, which can be adjusted as needed. Additionally, the controller keeps track of energy consumption and improves energy efficiency by enabling an idle mode when the controlled reference is within the desired operating zone, thus reducing the system's energy consumption during operation. Water temperature control for a hydroponic system is taken as a case study for the RL model, with the variance of the disturbances adjusted to show the model's robustness and efficiency. On average, the model showed safety improvements of up to 15% and energy efficiency improvements of 35%-40% compared to a traditional RL model.
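
A minimal sketch of the kind of reward shaping and idle-mode switching described above, assuming a scalar water-temperature reference, a symmetric operating band, and an invented energy-penalty weight (none of these values come from the paper).

def reward_and_mode(temp, setpoint, band, actuator_on, w_energy=0.1):
    # Positive reward while the controlled reference stays inside the desired
    # operating zone; penalty grows with the deviation outside it.
    in_zone = abs(temp - setpoint) <= band
    idle = in_zone                    # idle mode: actuator may rest inside the zone
    reward = 1.0 if in_zone else -abs(temp - setpoint) / band
    if actuator_on and not idle:
        reward -= w_energy            # charge for energy spent while actively controlling
    return reward, idle

# Example: 24 C target, +/-1 C zone, heater currently on.
print(reward_and_mode(temp=23.4, setpoint=24.0, band=1.0, actuator_on=True))
print(reward_and_mode(temp=26.2, setpoint=24.0, band=1.0, actuator_on=True))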

Keywords: Control system, hydroponics, machine learning, reinforcement learning.

891 Investigating the Demand for Short Shelf-Life Food Products for SME Wholesalers

Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Ashley Hopwell, Alistair Duffy

Abstract:

Accurate forecasting of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. This paper attempts to understand the causes of the high variability in SME wholesalers' demand, such as weather and holidays; understanding the significance of such, often unidentified, factors may improve forecasting accuracy. The paper reviews the current literature on the factors used to predict demand and on existing forecasting techniques for short shelf-life products. It then investigates a variety of possible internal and external factors, some of which have not been used by other researchers in the demand prediction process. The results presented are further analysed using a number of techniques to minimize noise in the data. The analysis uses past sales data (January 2009 to May 2014) from a UK-based SME wholesaler, and the results presented are limited to the product 'Milk' for cafés in Derby. Correlation analysis is performed to check the dependence of the actual demand on each variability factor, and Principal Component Analysis (PCA) is then applied to assess the significance of the factors identified through correlation. The PCA results suggest that cloud cover, weather summary, and temperature are the most significant factors for forecasting demand. The correlation of these three factors increases, and becomes more stable, at the monthly level compared with weekly and daily demand.
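
A small sketch of the correlation-plus-PCA workflow on synthetic stand-in data; the factor names, coefficients, and sample size are invented and do not reproduce the wholesaler's sales data.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 200
# Hypothetical weekly factors: cloud cover, temperature, holiday flag.
cloud = rng.uniform(0, 8, n)
temp = rng.normal(15, 5, n)
holiday = rng.integers(0, 2, n)
demand = 50 - 2.0 * cloud + 1.5 * temp - 5 * holiday + rng.normal(0, 3, n)

X = np.column_stack([cloud, temp, holiday])
# Correlation of each candidate factor with demand.
for name, col in zip(["cloud", "temp", "holiday"], X.T):
    print(name, np.corrcoef(col, demand)[0, 1].round(2))

# PCA on standardized factors to see which directions carry most variance.
pca = PCA(n_components=2)
pca.fit((X - X.mean(0)) / X.std(0))
print(pca.explained_variance_ratio_, pca.components_)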

Keywords: Demand Forecasting, Deteriorating Products, Food Wholesalers, Principal Component Analysis and Variability Factors.

890 Online Signature Verification Using Angular Transformation for e-Commerce Services

Authors: Peerapong Uthansakul, Monthippa Uthansakul

Abstract:

The rapid growth of e-Commerce services has been clearly observed over the past decade; however, user authentication still relies largely on numeric approaches, so the search for other verification methods suitable for online e-Commerce is an interesting issue. In this paper, a new online signature-verification method using an angular transformation is presented. Delay shifts existing in online signatures are estimated by an estimation method based on the angle representation. In the proposed signature-verification algorithm, all components of the input signature are extracted by considering the discontinuous break points in the stream of angular values. The estimated delay shift is then obtained by comparison with the selected reference signature, and the matching error is computed as the main feature used in the verification process. The threshold offsets are calculated from the two error characteristics of the signature-verification problem, the False Rejection Rate (FRR) and the False Acceptance Rate (FAR); the level of these two error rates depends on the chosen decision threshold, whose value is set so as to realize the Equal Error Rate (EER; FAR = FRR). The experimental results, obtained through a simple program deployed on the Internet to demonstrate e-Commerce services, show that the proposed method provides 95.39% correct verifications, 7% better than a DP-matching-based signature-verification method. In addition, verification based on the extracted components provides more reliable results than a decision based on the whole signature.
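
The paper's delay-shift estimator is not reproduced here; the sketch below only illustrates the angular representation, the splitting of a signature at angular discontinuities, and a crude matching error, with an invented acceptance threshold.

import numpy as np

def angular_profile(points):
    # Represent a pen trajectory by the angle of each consecutive segment.
    d = np.diff(points, axis=0)
    return np.arctan2(d[:, 1], d[:, 0])

def split_components(angles, jump=np.pi / 2):
    # Break the angle stream at large discontinuities (pen-up / sharp turns).
    cuts = np.where(np.abs(np.diff(angles)) > jump)[0] + 1
    return np.split(angles, cuts)

def matching_error(test, reference):
    # Crude comparison: trim to common length, average absolute angle difference.
    m = min(len(test), len(reference))
    return float(np.mean(np.abs(test[:m] - reference[:m])))

# Toy usage with two noisy copies of the same stroke.
t = np.linspace(0, 2 * np.pi, 100)
ref = np.column_stack([t, np.sin(t)])
test = ref + np.random.normal(0, 0.002, ref.shape)
print("components:", len(split_components(angular_profile(test))))
err = matching_error(angular_profile(test), angular_profile(ref))
print("matching error:", err, "accept" if err < 0.2 else "reject")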

Keywords: Online signature verification, e-Commerce services, Angular transformation.

889 X-Ray Intensity Measurement Using Frequency Output Sensor for Computed Tomography

Authors: R. M. Siddiqui, D. Z. Moghaddam, T. R. Turlapati, S. H. Khan, I. Ul Ahad

Abstract:

The quality of the 2D and 3D cross-sectional images produced by computed tomography depends primarily on how precisely the primary and secondary X-ray intensities are detected. The traditional method of primary intensity detection is prone to errors. Our group recently developed an X-ray intensity measurement system with smart X-ray sensors that detects the primary X-ray intensity reliably. In this study a new smart X-ray sensor is developed using the light-to-frequency converter TSL230 from Texas Instruments, which offers numerous advantages in terms of noiseless data acquisition and transmission. The TSL230 is built around a silicon photodiode that converts the incoming X-ray radiation into a proportional current signal. A current-to-frequency converter attached to this photodiode on a single monolithic CMOS integrated circuit outputs a pulse train whose frequency count is proportional to the incoming current signal. The frequency count is delivered to a PICDEM FS USB board with a PIC18F4550 microcontroller mounted on it; with its highly compact electronic hardware, this demo board efficiently reads the smart sensor output data. The frequency-output approach overcomes the nonlinear behavior of sensors with analog output, so un-attenuated X-ray intensities can be measured precisely and better normalization can be achieved in order to attain high resolution.
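
To make the frequency-output idea concrete, this small sketch turns a simulated pulse train into a relative intensity by counting pulses inside a fixed gate window; the gate length and scale factor are illustrative, not values from the paper.

import numpy as np

def intensity_from_pulses(pulse_times, gate_s=0.1, counts_per_unit=1e4):
    # The sensor emits a pulse train whose frequency is proportional to the
    # incident light; counting pulses in a fixed gate gives a digital reading.
    counts = np.sum(pulse_times < gate_s)
    frequency = counts / gate_s            # Hz
    return frequency / counts_per_unit     # arbitrary intensity units

# Simulated pulse train at roughly 2 kHz.
pulses = np.cumsum(np.random.exponential(1 / 2000.0, size=500))
print(intensity_from_pulses(pulses))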

Keywords: Computed tomography, detector technology, X-Ray intensity measurement

888 Resilient Manufacturing: Use of Augmented Reality to Advance Training and Operating Practices in Manual Assembly

Authors: L. C. Moreira, M. Kauffman

Abstract:

This paper outlines the results of experimental research on deploying an emerging augmented reality (AR) system for real-time task assistance (work instructions) in highly customised and high-risk manual operations. The focus is on human operators' training effectiveness and performance, and the aim is to test whether such technologies can help raise knowledge retention and the accuracy of task execution in order to improve health and safety (H&S). An AR-enhanced assembly method is proposed and experimentally tested using a real industrial process as the case study: electric vehicle (EV) battery module assembly. The experimental results revealed that the proposed method improved training practices and performance, increasing knowledge retention from 40% to 84% and accuracy of task execution from 20% to 71% compared with the traditional paper-based method. The results validate and demonstrate how emerging technologies are reshaping the choice between manual, hybrid, and fully automated processes by promoting XR-assisted processes and the connected worker (a vision for Industry 4.0 and 5.0), and by supporting manufacturing in becoming more resilient in times of constant market change.

Keywords: Augmented reality, extended reality, connected worker, XR-assisted operator, manual assembly 4.0, industry 5.0, smart training, battery assembly.

887 Modeling Spatial Distributions of Point and Nonpoint Source Pollution Loadings in the Great Lakes Watersheds

Authors: Chansheng He, Carlo DeMarchi

Abstract:

A physically based, spatially-distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years of 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates to the monitored total phosphorous load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as the input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces are developed to visualize the spatial and temporal distribution of the pollutant transport in support of water management programs.

Keywords: Distributed Large Basin Runoff Model, Great Lakes Watersheds, nonpoint source pollution, point sources.

886 Investigating the Usability of a University Website from the Users’ Perspective: An Empirical Study of Benue State University Website

Authors: Abraham Undu, Stephen Akuma

Abstract:

Websites are becoming a major component of an organization's success in our ever-globalizing, competitive world. A website symbolizes an organization, projecting its principles, culture, values, vision, and perspectives; it is an interface connecting an organization and its clients. The university, as an academic institution, uses its website to communicate and offer computing services to its stakeholders (students, staff, host community, university management, etc.). Unfortunately, website designers often give more consideration to the technology, organizational structure, and business objectives of the university than to the usability of the site, and so end up designing university websites that do not meet the needs of the primary users. This empirical study investigated the Benue State University website from the point of view of students. The research used a standardized website usability questionnaire based on the five usability factors defined by WAMMI (Website Analysis and Measurement Inventory): attractiveness, controllability, efficiency, learnability and helpfulness. The results showed that the university website (https://portal.bsum.edu.ng/) has a neutral usability level because of the usability issues associated with it. The research recommends feasible solutions to improve the usability of the website from the users' perspective and also provides a modified usability model for better evaluation of the Benue State University website.

Keywords: Usability, usability factors, university websites, user’s perspective, WAMMI, modified usability model, Benue State University.

885 Investigating Polynomial Interpolation Functions for Zooming Low Resolution Digital Medical Images

Authors: Maninder Pal

Abstract:

Medical digital images usually have low resolution because of the nature of their acquisition. This paper therefore focuses on zooming these images to obtain a better level of information for the purpose of medical diagnosis. A strategy for selecting pixels during the zooming operation is proposed. It is based on the principle of an analog clock and utilizes a combination of point and neighborhood image processing. In this approach, the hour hand of the clock covers the portion of the image to be processed; for alignment, the center of the clock points at the middle pixel of the selected portion. The minute hand is longer and is used to gather information about the pixels of the surrounding area, called the neighborhood pixel region. This information is used to zoom the selected portion of the image. The proposed algorithm is implemented and its performance evaluated on many medical images obtained from sources such as X-ray, Computerized Tomography (CT) scan, and Magnetic Resonance Imaging (MRI); for illustration and simplicity, the results obtained from a CT-scanned image of a head are presented. Performance is evaluated against various traditional algorithms in terms of peak signal-to-noise ratio (PSNR), maximum error, SSIM index, mutual information, and processing time. From the results, the proposed algorithm is found to give better performance than the traditional algorithms.
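
The clock-based pixel-selection scheme is specific to the paper; for context, the sketch below implements one of the traditional baselines it is compared against, a plain bilinear zoom on a single-channel NumPy image.

import numpy as np

def zoom_bilinear(img, factor):
    # Baseline zoom by bilinear interpolation (one of the traditional methods
    # the proposed clock-based scheme is compared against).
    h, w = img.shape
    H, W = int(h * factor), int(w * factor)
    ys = np.linspace(0, h - 1, H)
    xs = np.linspace(0, w - 1, W)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None]; fx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - fx) + img[np.ix_(y0, x1)] * fx
    bot = img[np.ix_(y1, x0)] * (1 - fx) + img[np.ix_(y1, x1)] * fx
    return top * (1 - fy) + bot * fy

demo = np.arange(16, dtype=float).reshape(4, 4)
print(zoom_bilinear(demo, 2).shape)   # (8, 8)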

Keywords: Zooming, interpolation, medical images, resolution.

884 Pragati Node Popularity (PNP) Approach to Identify Congestion Hot Spots in MPLS

Authors: E. Ramaraj, A. Padmapriya

Abstract:

In large Internet backbones, service providers typically have to manage traffic flows explicitly in order to optimize the use of network resources, a process often referred to as Traffic Engineering (TE). Common objectives of traffic engineering include balancing the traffic distribution across the network and avoiding congestion hot spots. Raj P. H. and S. V. K. Raja designed a Bayesian network approach to identify congestion hot spots in MPLS, in which a Conditional Probability Distribution (CPD) is specified for every node in the network; based on the CPDs, the congestion hot spots are identified and traffic can then be distributed so that no link in the network is either over-utilized or under-utilized. Although the Bayesian network approach has been implemented in operational networks, it has a number of well-known scaling issues. This paper proposes a new approach, which we call the Pragati (meaning Progress) Node Popularity (PNP) approach, to identify congestion hot spots from the network topology alone. In the PNP approach, IP routing runs natively over the physical topology rather than depending on the CPD of each node as in the Bayesian network. We first illustrate the approach on a simple network and then present a formal analysis. The PNP approach identifies exactly the same hot spots as the Bayesian approach for any given network, with minimal effort, and we further extend the result to the more general case of arbitrary, even loopy, network topologies. A theoretical insight of our result is that the optimal routing is always shortest-path routing with respect to some consideration of hot spots in the network.
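
The popularity measure itself is not defined in the abstract; one plausible topology-only stand-in is a centrality score, as in the sketch below, which flags candidate hot spots by betweenness centrality on an invented toy topology with an arbitrary threshold.

import networkx as nx

# Toy backbone topology (nodes are routers, edges are links).
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"),
                  ("E", "D"), ("A", "E"), ("C", "E")])

# Topology-only "popularity": how often a node sits on shortest paths.
popularity = nx.betweenness_centrality(G)

# Flag the most popular nodes as candidate congestion hot spots.
threshold = 0.2
hot_spots = [n for n, p in popularity.items() if p >= threshold]
print(sorted(popularity.items(), key=lambda kv: -kv[1]))
print("candidate hot spots:", hot_spots)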

Keywords: Conditional Probability Distribution, Congestion hotspots, Operational Networks, Traffic Engineering.

883 An Experimental Study on the Effect of Operating Parameters during the Micro-Electro-Discharge Machining of Ni Based Alloy

Authors: Asma Perveen, M. P. Jahan

Abstract:

Ni alloys cover a wide range of applications in the automotive, oil and gas, and aerospace industries. However, these alloys pose challenges for conventional machining technologies. Micro-electro-discharge machining (micro-EDM), on the other hand, is a non-conventional machining method that uses controlled spark energy to remove material irrespective of the material's hardness. Industry has always shown strong interest in developing optimal methodologies and parameters to enhance the productivity of micro-EDM for different alloys, in terms of reducing machining time and tool wear. Therefore, the aim of this study is to investigate the effects of the micro-EDM process parameters in order to find their optimal values. The input process parameters are voltage, capacitance, and electrode rotational speed, whereas the output parameters considered are machining time, entrance diameter of the hole, overcut, tool wear, and crater size. The surface morphology and elemental composition are also investigated using SEM and EDX analysis. The experimental results indicate that machining time decreases as the discharge energy increases. Discharge energy also contributes to the enlargement of the entrance diameter as well as the overcut. In addition, tool wear decreases with increasing discharge energy, while crater size increases with it.

Keywords: Micro EDM, Ni alloy, discharge energy, micro-holes.

882 Increasing Fishery Economic Added Value through Post Fishing Program: Cold Storage Program

Authors: Indrijuli Magsari Putri, Dicky R. Munaf

Abstract:

The purpose of this paper is to guide efforts to improve the economic added value of Indonesian fisheries products through a post-fishing program, namely a cold storage program. Indonesia's fisheries potential has been acknowledged worldwide: FAO (2009) listed Indonesia among the ten highest producers of fishery products in the world, and according to BPS (Statistics Indonesia) data, national fisheries production in 2011 reached 5.714 million tons, of which 93.55% came from marine fisheries and 6.45% from open waters. Indonesian territory consists of two-thirds water, which has brought enormous benefits to Indonesia, especially to fishermen. Raising the economic level of fishermen requires efforts to develop fishery business units, one of which is improving the quality of products marketed at the regional and international levels. This certainly needs the support of various fishery facilities (from infrastructure to superstructure), one of which is cold storage. Given the many benefits of cold storage as a means of processing fishery resources, the Indonesia Maritime Security Coordinating Board (IMSCB), as one of the maritime institutions for maritime security and safety, has a program to empower coastal communities by encouraging the development of cold storage in middle- and lower-tier fishery business units. Developing cold storage facilities that can play their role fully requires the synergistic efforts of various parties.

Keywords: Cold Storage, Fish, Regulation.

881 Using SMS Mobile Technology to Assess the Mastery of Subject Content Knowledge of Science and Mathematics Teachers of Secondary Schools in Tanzania

Authors: Joel S. Mtebe, Aron Kondoro, Mussa M. Kissaka, Elia Kibga

Abstract:

Sub-Saharan Africa is described as the second fastest growing region in mobile phone penetration in the world, growing faster than the United States or the European Union. Mobile phones have provided many opportunities to improve people's lives in the region, for example in banking, marketing, entertainment, and paying bills for water, TV, and electricity. However, the potential of mobile phones to enhance teaching and learning has not been fully explored. This study presents the experience of developing and delivering SMS-based quiz questions used to assess the mastery of subject content knowledge of science and mathematics secondary school teachers in Tanzania. The SMS quizzes were used as a follow-up support mechanism for 500 teachers who participated in a project to upgrade the subject content knowledge of science and mathematics teachers in Tanzania. Quizzes of 10-15 questions were sent to teachers each week for 8 weeks, and the results were analyzed using SPSS. The results show that teachers who participated in chemistry and biology performed better than those who participated in mathematics and physics. Teachers also reported some challenges that led to poor performance. This research has several practical implications for those who are implementing, or planning to use, mobile phones in teaching and learning, especially in rural secondary schools in sub-Saharan Africa.

Keywords: Mobile learning, e-learning, educational technologies, SMS, secondary education, assessment.

880 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then searches for the most important features among the four groups, i.e., those that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
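
A minimal stacking sketch in the spirit of the described ensemble, run on synthetic data (SEER access requires a data-use agreement); scikit-learn has no Bayesian network classifier, so a second decision tree stands in for that base learner.

from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for SEER-style tabular data.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.7, 0.3],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base learners (here on identical feature views for simplicity).
base = [("tree", DecisionTreeClassifier(max_depth=6, random_state=0)),
        ("nb", GaussianNB()),
        ("tree2", DecisionTreeClassifier(max_depth=3, random_state=1))]
# A Naive Bayes meta-classifier combines the base decisions.
ensemble = StackingClassifier(estimators=base, final_estimator=GaussianNB())
ensemble.fit(X_tr, y_tr)
print("weighted F1:", f1_score(y_te, ensemble.predict(X_te), average="weighted"))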

Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.

879 Nuclear Medical Image Treatment System Based On FPGA in Real Time

Authors: B. Mahmoud, M.H. Bedoui, R. Raychev, H. Essabbah

Abstract:

We present in this paper an acquisition and treatment system designed for a semi-analog gamma camera. It consists of a nuclear medical Image Acquisition, Treatment and Display (IATD) chain ensuring the acquisition and treatment of the signals resulting from the gamma camera detection head (DH) and the construction of the scintigraphic image in real time. The chain is composed of an analog treatment board and a digital treatment board; an earlier version of the chain was designed around a DSP [2]. We describe the designed system and the digital treatment algorithms, whose performance and flexibility we have improved. In this paper we present the architecture of the new version of the IATD chain, in which the digital treatment algorithms are implemented in a specific reprogrammable FPGA (Field Programmable Gate Array) circuit, interfaced to semi-analog cameras from Sopha Medical Vision (SMVi), taking the SOPHY DS7 as an example.

Keywords: Nuclear medical image, scintigraphic image, digital treatment, linearity, spectrometry, FPGA.

878 Least Square-SVM Detector for Wireless BPSK in Multi-Environmental Noise

Authors: J. P. Dubois, Omar M. Abdul-Latif

Abstract:

The Support Vector Machine (SVM) is a statistical learning tool built on the concept of structural risk minimization (SRM). In this paper, SVM is applied to signal detection in communication systems in the presence of channel noise in various environments: Rayleigh fading, additive white Gaussian background noise (AWGN), and interference noise generalized as additive colored Gaussian noise (ACGN). The structure and performance of the SVM in terms of the bit error rate (BER) metric are derived and simulated for these advanced stochastic noise models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to a conventional optimal model-based detector for binary signaling driven by binary phase shift keying (BPSK) modulation. We show that the SVM performance is superior to that of conventional matched filter-, innovation filter-, and Wiener filter-driven detectors, even in the presence of random Doppler carrier deviation, especially in the low SNR (signal-to-noise ratio) range. For large SNR, the performance of the SVM is similar to that of the classical detectors; however, the convergence between SVM and maximum likelihood detection occurs at a higher SNR as the noise environment becomes more hostile.
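
A toy version of the comparison, restricted to the plain AWGN case where the sign (matched-filter) decision is already optimal, so the SVM is only expected to roughly match it; the SNR, block split, and kernel settings are arbitrary illustration choices.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, snr_db = 4000, 3
bits = rng.integers(0, 2, n)
symbols = 2.0 * bits - 1.0                      # BPSK mapping: {0,1} -> {-1,+1}
sigma = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))
rx = symbols + sigma * rng.standard_normal(n)   # AWGN channel only

# Matched-filter / ML detector for BPSK in AWGN is a simple sign decision.
ber_mf = np.mean((rx > 0).astype(int) != bits)

# SVM detector trained on a labelled block of received samples.
split = n // 2
svm = SVC(kernel="rbf", C=1.0).fit(rx[:split, None], bits[:split])
ber_svm = np.mean(svm.predict(rx[split:, None]) != bits[split:])
print(f"BER matched filter: {ber_mf:.4f}, BER SVM: {ber_svm:.4f}")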

Keywords: Colour noise, Doppler shift, innovation filter, least square-support vector machine, matched filter, Rayleigh fading, Wiener filter.

877 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In a smart environment where multiple users share the living space, different service requirements from different users can lead the context-aware system into conflicting situations when deciding which services to provide. The purpose of establishing a communication and negotiation mechanism among the objects in the intelligent environment is therefore to resolve these service conflicts among users. This study proposes a decision-making methodology that uses "Event Agents" as its core. When the sensor system receives information, it evaluates a user's current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when events are not single but overlap with one another, conflicts arise. This study adopts the "Multiple Events Correlation Matrix" in order to calculate the degree values of incidents and the support values for each object. The matrix uses these values as the basis for making inferences about system service and for determining appropriate services when there is a conflict.
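
The scoring rules of the Multiple Events Correlation Matrix are not given in the abstract; the sketch below shows one plausible shape of such a computation, with an invented event/object matrix and weights, purely to illustrate how overlapping events could be arbitrated.

import numpy as np

# Rows: concurrent events; columns: candidate objects/services.
events = ["watching_tv", "cooking"]
objects = ["tv", "lights", "range_hood"]
# Hypothetical correlation matrix: how strongly each event calls for each object.
M = np.array([[0.9, 0.4, 0.0],    # watching_tv
              [0.1, 0.6, 0.8]])   # cooking

event_weights = np.array([0.7, 0.3])   # current degree/importance of each event
support = event_weights @ M            # aggregated support value per object
priority = support / support.sum()
winner = objects[int(np.argmax(priority))]
print(dict(zip(objects, priority.round(2))), "->", winner)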

Keywords: Internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity.

876 Communication Styles of Business Students: A Comparison of Four National Cultures

Authors: Tiina Brandt, Isaac Wanasika

Abstract:

Culturally diverse global companies need to understand cultural differences between leaders and employees from different backgrounds. Communication is culturally contingent and has a significant impact on the effective execution of leadership goals. Awareness of cultural variations in communication and interaction helps leaders modify their own behavior and consequently improve the execution of goals and avoid unnecessary faux pas. Our focus is on young adults who have experienced cultural integration, culturally diverse surroundings in schools and universities, and cultural travel. Our central research problem is to understand the impact of different national cultures on communication. We focus on four countries with distinct national cultures and spatial distribution: Finland, Indonesia, Russia, and the USA. Our sample is based on business students (n = 225) from various backgrounds in the four countries. Their responses on communication and leadership styles were analyzed using ANOVA and post-hoc tests. Results indicate that culture impacts communication behavior: even young, culturally exposed adults with cultural awareness and experience demonstrate cultural differences in their behavior. Apparently, culture is a deep-seated trait that cannot be completely neutralized by environmental variables. Our study offers valuable input for leadership training programs and for expatriates in recognizing specific culture-driven differences in leaders' behavior.
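
For readers unfamiliar with the analysis, a one-way ANOVA across four country groups looks like the sketch below; the scores are simulated, not the study's data, and the post-hoc step is only noted in a comment.

import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)
# Hypothetical communication-style scores (e.g. Likert-scale means) per country.
finland = rng.normal(4.2, 0.8, 60)
indonesia = rng.normal(4.8, 0.8, 55)
russia = rng.normal(4.0, 0.8, 50)
usa = rng.normal(4.6, 0.8, 60)

f_stat, p_value = f_oneway(finland, indonesia, russia, usa)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A post-hoc test (e.g. Tukey HSD via statsmodels) would then locate which
# country pairs actually differ.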

Keywords: Culture, communication, Finland, Indonesia, Russia, USA.

875 Indoor and Outdoor Concentration of Particulate Matter at Domestic Homes

Authors: B. Karakas, S. Lakestani, C. Guler, B. Guciz Dogan, S. Acar Vaizoglu, A. Taner, B. Sekerel, R. Tıpırdamaz, G. Gullu

Abstract:

Particulate matter (PM) in ambient air is responsible for adverse health effects in adults and children, yet relatively little is known about the concentrations, sources, and health effects of PM in indoor air. A monitoring study was conducted in Ankara over three campaigns in order to measure PM levels in indoor and outdoor environments and to identify and quantify associations between sources and concentrations. Approximately 82 homes (42 in the first campaign, 12 in the second, and 28 in the third), three rooms per home (living room, baby's room, and living room used as a baby's room), and the outdoor ambient air at each home were sampled with a Grimm Environmental Dust Monitoring (EDM) 107 instrument during different seasonal periods of 2011 and 2012. In this study, the relationship between indoor and outdoor PM levels was investigated for particulate matter smaller than 10 micrometers (µm) (PM10), smaller than 2.5 µm (PM2.5), and smaller than 1.0 µm (PM1.0). The mean concentrations of PM10, PM2.5, and PM1.0 in the living room used as a baby's room were higher than in the living room and the baby's room (or bedroom) for all three sampling campaigns. It is concluded that household activities and environmental conditions strongly influenced PM concentrations in the indoor environments during the sampling periods: the number of smokers, and proximity to a main street and/or construction activities, increased the PM concentrations. This study is based on the assessment of the relationship between indoor and outdoor PM levels and the household activities and environmental conditions.

Keywords: Indoor air quality, particulate matter (PM), PM10, PM2.5, PM1.0.

874 Optimization of the Dental Direct Digital Imaging by Applying the Self-Recognition Technology

Authors: Mina Dabirinezhad, Mohsen Bayat Pour, Amin Dabirinejad

Abstract:

This paper introduces a technology intended to address some deficiencies of direct digital radiology. Digital radiology is the latest advance in dental imaging and has become an essential part of dentistry. Direct digital radiology has two main parts: an intraoral X-ray machine and a sensor (digital image receptor). Dentists and dental nurses experience difficulties during image acquisition with the direct digital X-ray machine. For instance, they sometimes need to readjust the sensor in the patient's mouth and take the X-ray image again because of its low quality. Another problem is that the sensor may move in the patient's mouth, producing an unusable image for the dentist; this makes the process time-consuming for dentists and dental nurses. On the other hand, taking several X-ray images causes problems for the patient, such as harm to their health and pain in the mouth due to the pressure of the sensor against the jaw. The authors propose a technology to solve the above-mentioned issues, called Self-Recognition Direct Digital Radiology (SDDR). This technology is based on the principle that the intraoral X-ray machine is capable of detecting the location of the sensor in the patient's mouth automatically. In addition to solving the aforementioned problems, SDDR technology also has a smaller environmental impact than the previous approach.

Keywords: Dental direct digital imaging, digital image receptor, digital x-ray machine, and environmental impacts.

873 Computable Difference Matrix for Synonyms in the Holy Quran

Authors: Mohamed Ali AlShaari, Khalid M. ElFitori

Abstract:

In the field of Quran studies known as GHAREEB AL QURAN (the study of the meanings of strange words and structures in the Holy Quran), it is difficult to distinguish some pragmatic meanings from conceptual meanings. One who wants to study this subject may need to look for a common usage between any two or more words in order to understand the general meaning, and sometimes may need to look for the differences between them, even if they are synonyms (word sisters).

Some distinguished scholars of Arabic linguistics believe that there are no true synonym words; they believe in varieties of meaning and multi-context usage. Based on this viewpoint, our method was designed to look for the synonyms of a word and then for the differences that distinguish the word from its synonyms.

There are many available books that use such a method, e.g. synonym books, dictionaries, glossaries, and some books on the interpretation of strange vocabulary of the Holy Quran, but it is difficult to look up words in these written works.

For that reason, we propose a logical entity, which we call the Differences Matrix (DM).

The DM groups the synonym words in order to extract the relations between them and to identify the general meaning that defines the skeleton of all the word's synonyms; this meaning is expressed by one word of its sisters.

In the Differences Matrix, we use the sisters (words) as titles for the rows and columns, and in each obtained cell we try to define the row-title word by means of the column-title word (its sister), so that the relations between sisters appear. The expected result is a well-defined group of sisters for each word. We represent the obtained results formally and use the defined groups as a base for building an ontology of the Holy Quran synonyms.
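
A tiny illustration of the Differences Matrix as a data structure, with placeholder words and glosses; none of the entries are actual Quranic glosses, they only show how a cell relates a row word to a column word.

# Hypothetical miniature Differences Matrix for one group of "sister" words.
# Rows and columns are the sisters; cell (r, c) records how the row word r is
# defined or distinguished in terms of the column word c (None on the diagonal).
sisters = ["word_a", "word_b", "word_c"]           # placeholders, not real entries
dm = {r: {c: None for c in sisters} for r in sisters}
dm["word_b"]["word_a"] = "word_a with an added nuance X"
dm["word_c"]["word_a"] = "word_a restricted to context Y"
dm["word_a"]["word_b"] = "the general sense, without nuance X"

def skeleton_word(dm):
    # The sister that most other sisters are defined through is a natural
    # candidate for expressing the group's general (skeleton) meaning.
    counts = {c: sum(dm[r][c] is not None for r in dm) for c in dm}
    return max(counts, key=counts.get)

print("general meaning expressed by:", skeleton_word(dm))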

Keywords: Quran, synonyms, Differences Matrix, ontology

872 Cost of Governance in Nigeria: In Whose Interest?

Authors: Francis O. Iyoha, Daniel E. Gberevbie, Charles T. Iruonagbe, Matthew E. Egharevba

Abstract:

The cost of governance in Nigeria has become a challenge to development and a concern to practitioners and scholars alike in business and social science research. In the 2010 national budget of NGN4.6 trillion (USD28.75 billion), for instance, only a paltry NGN1.8 trillion (USD11.15 billion) was earmarked for capital expenditure. Similarly, in 2013, out of a total national budget of NGN4.92 trillion (USD30.75 billion), only NGN1.50 trillion (USD9.38 billion) was voted for capital expenditure. Based on data sourced from the Nigerian Office of Statistics, the Central Bank of Nigeria Statistical Bulletin, and the United Nations Development Programme, this study examined the causes of the high cost of governance in Nigeria. It found that the high cost of governance in the country is in the interest of the ruling class, arising from their unethical behaviour: corrupt practices and the poor management of public resources. As a result, the study recommends intensifying the war against corruption and the mismanagement of public resources by government officials as a possible solution to the high cost of governance in Nigeria. This could be achieved by strengthening the constitutional powers of the various anti-corruption agencies in the areas of arrest, investigation, and prosecution of offenders, without interference from the executive arm of government at the local, state, or federal level.

Keywords: Capital expenditure, Cost of governance, recurrent expenditure, unethical behaviour.

871 A Comparative Study of Fine Grained Security Techniques Based on Data Accessibility and Inference

Authors: Azhar Rauf, Sareer Badshah, Shah Khusro

Abstract:

This paper analyzes different fine-grained security techniques for relational databases with respect to two variables: data accessibility and inference. Data accessibility measures the amount of data available to users after a security technique is applied to a table. Inference is the proportion of information leaked after suppressing a cell containing secret data: a row containing a suppressed secret cell can become a security threat if an intruder generates useful information from the related visible information in the same row. This paper measures the data accessibility and inference associated with row-, cell-, and column-level security techniques. Cell-level security offers the greatest data accessibility, as it suppresses only the secret data, but it also carries a high probability of inference. Row- and column-level security techniques have the least data accessibility and inference. This paper introduces a cell-plus-innocent security technique that applies cell-level security but additionally suppresses some innocent data, so that an intruder cannot assume a suppressed cell necessarily contains secret data. Four variations of the technique, namely cell plus innocent 1/4, 2/4, 3/4, and 4/4, are introduced to suppress innocent data equal to 1/4, 2/4, 3/4, and 4/4 of the amount of true secret data inside the database. Results show that the new technique offers better control over data accessibility and inference than the state-of-the-art security techniques. The paper further discusses which combinations of techniques can be used together and shows that the cell plus innocent 1/4, 2/4, and 3/4 techniques can be used as replacements for cell-level security.
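
A small sketch of the cell-plus-innocent idea on a toy table; the table, the secret cells, and the suppression marker are invented, and the fraction is interpreted here relative to the number of secret cells.

import random

def cell_plus_innocent(rows, secret_cells, innocent_fraction, seed=0):
    # Suppress every secret cell, plus a chosen fraction of innocent cells
    # (relative to the number of secret cells) as decoys, so an intruder
    # cannot assume a suppressed cell necessarily contains secret data.
    rng = random.Random(seed)
    secret = set(secret_cells)
    all_cells = {(i, c) for i, row in enumerate(rows) for c in row}
    innocents = list(all_cells - secret)
    k = round(innocent_fraction * len(secret))
    decoys = set(rng.sample(innocents, min(k, len(innocents))))
    hidden = secret | decoys
    return [{c: ("***" if (i, c) in hidden else v) for c, v in row.items()}
            for i, row in enumerate(rows)]

table = [{"name": "alice", "salary": 90000, "dept": "hr"},
         {"name": "bob", "salary": 50000, "dept": "it"}]
# Secret cells: both salaries; "cell plus innocent 2/4" hides half as many decoys.
print(cell_plus_innocent(table, {(0, "salary"), (1, "salary")}, innocent_fraction=2/4))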

Keywords: Fine Grained Security, Data Accessibility, Inference, Row, Cell, Column Level Security.

870 Using Genetic Algorithms to Outline Crop Rotations and a Cropping-System Model

Authors: Nicolae Bold, Daniel Nijloveanu

Abstract:

Cropping systems are a method used by farmers. The method is environmentally friendly, protecting natural resources (soil, water, air, nutrients) while increasing production at the same time, taking into account the particularities of each crop. Combining this powerful method with the concepts of genetic algorithms makes it possible to generate sequences of crops that form a rotation. Genetic algorithms have proved efficient at solving optimization problems, and their polynomial complexity allows them to be applied to a variety of more difficult problems. In our case, the optimization consists in finding the most profitable rotation of crops; one of the expected results is to optimize the usage of resources, in order to minimize costs and maximize profit. To achieve these goals, a genetic algorithm was designed. The algorithm finds several optimized cropping-system possibilities that have the highest profit and thus minimize costs. It uses genetic operators (mutation, crossover) and structures (genes, chromosomes): a cropping-system possibility is represented as a chromosome, and a crop within the rotation is a gene within that chromosome. Results on the efficiency of this method are presented in a dedicated section. The implementation of this method would benefit farmers by giving them hints and helping them use resources efficiently.
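
A compact sketch of the chromosome/gene encoding described above; the crop list, profit table, penalty term, and GA settings are invented stand-ins for the paper's real agronomic data.

import random

CROPS = ["wheat", "maize", "soy", "clover"]          # illustrative crop set
PROFIT = {"wheat": 3.0, "maize": 4.0, "soy": 3.5, "clover": 1.0}

def fitness(rotation):
    # Profit minus a penalty for repeating the same crop in consecutive years
    # (a stand-in for the agronomic constraints a real model would encode).
    score = sum(PROFIT[c] for c in rotation)
    score -= 2.0 * sum(a == b for a, b in zip(rotation, rotation[1:]))
    return score

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(rotation, rate=0.1):
    return [random.choice(CROPS) if random.random() < rate else c
            for c in rotation]

def evolve(years=6, pop_size=40, generations=60, seed=3):
    random.seed(seed)
    pop = [[random.choice(CROPS) for _ in range(years)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # keep the fittest rotations
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))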

Keywords: Genetic algorithm, chromosomes, genes, cropping, agriculture.

869 Automatic Control of the Aircraft's Lateral Movement Using Dynamic Inversion

Authors: Mihai Lungu, Romulus Lungu, Lucian Grigorie

Abstract:

The paper presents a new system for the automatic control of an aircraft's flight in the lateral plane using the kinematic model and dynamic inversion. Starting from the equations of the aircraft's lateral movement, the authors use two axis systems and obtain a control law that cancels the lateral deviation of the flying object from the runway line. This system makes the aircraft's direction angle follow the direction angle of the runway line. Simulations in Matlab/Simulink have been carried out for different initial aircraft positions and direction angles. The inconvenience of this system is the long duration of the transient regime; it can therefore be used independently, but the results are not very good, so it is better suited as a subsystem of other systems. The main system that cancels the lateral deviation from the runway line is based on dynamic inversion and uses, as a subsystem, the control system for the lateral movement based on the kinematic model. Using complex Matlab/Simulink models, the authors obtained the time evolution of the direction angle and of the aircraft's lateral deviation with respect to the runway line, for different values of the initial direction angle and for different wind types. The system behaves very well for all initial direction angles and wind types.
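
The control law itself is not reproduced in the abstract; the toy simulation below only illustrates the kinematic idea of steering the direction angle toward the runway heading so the lateral deviation decays, with all gains and initial conditions invented.

import numpy as np

# Kinematic lateral model: y' = V sin(psi - psi_rw), with psi driven toward a
# commanded angle that cancels the lateral deviation y.
V, dt = 70.0, 0.02                 # airspeed [m/s], time step [s]
psi_rw = 0.0                       # runway direction [rad]
y, psi = 150.0, 0.3                # initial lateral deviation [m] and heading [rad]
k_y, k_psi = 0.004, 1.2            # guidance and heading-loop gains (illustrative)

for _ in range(3000):
    psi_cmd = psi_rw - np.clip(k_y * y, -0.4, 0.4)   # head back toward the runway line
    psi += dt * k_psi * (psi_cmd - psi)              # first-order heading response
    y += dt * V * np.sin(psi - psi_rw)
print(f"lateral deviation after 60 s: {y:.1f} m, heading {psi:.3f} rad")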

Keywords: Direction angle, dynamic inversion, lateral deviation, lateral movement.

868 Facility Location Selection using Preference Programming

Authors: C. Ardil

Abstract:

This paper presents a preference programming technique based multiple criteria decision making analysis for selecting a facility location for a new organization or for the expansion of an existing facility, a decision of vital importance for decision support systems and the strategic planning process. The implementation of decision support systems is considered crucial for sustaining competitive advantage and profitability in a turbulent environment. Because effective strategic management and decision making are necessary, multiple criteria decision making analysis supports decision makers in formulating and implementing the right strategy. The investment cost associated with acquiring the property and constructing the facility makes facility location selection a long-term strategic investment decision, which calls for identifying the best location, yielding higher economic benefits through increased productivity and an optimal distribution network. Selecting the proper facility location from a given set of alternatives is a difficult task, as many potentially conflicting qualitative and quantitative criteria must be considered. This paper solves a facility location selection problem using preference programming, an effective multiple criteria decision making analysis tool for complex decision problems in the operational research environment. The ranking results of preference programming are compared with the WSM, TOPSIS, and VIKOR methods.
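
Of the comparison methods named at the end, the weighted sum model (WSM) is the simplest to show; the sketch below scores three hypothetical sites against four illustrative criteria, with made-up values and weights.

import numpy as np

# Decision matrix: rows = candidate locations, columns = criteria.
# Criteria (illustrative): land cost (minimise), labour availability,
# proximity to market, infrastructure quality (all others maximise).
alts = ["site_A", "site_B", "site_C"]
X = np.array([[4.0, 7.0, 6.0, 8.0],
              [6.0, 6.0, 8.0, 7.0],
              [5.0, 9.0, 5.0, 6.0]])
benefit = np.array([False, True, True, True])
weights = np.array([0.3, 0.2, 0.3, 0.2])

# Normalise each criterion (inverting the cost criterion), then apply WSM.
norm = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = norm @ weights
print(sorted(zip(alts, scores.round(3)), key=lambda kv: -kv[1]))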

Keywords: Facility Location Selection, Multiple Criteria Decision Making, Multiple Criteria Decision Making Analysis, Preference Programming, Location Selection, WSM, TOPSIS, VIKOR

867 Analysis of Cascade Control Structure in Train Dynamic Braking System

Authors: B. Moaveni, S. Morovati

Abstract:

In recent years, the increasing use of railway transportation, especially in developing countries, has drawn more attention to the control systems of railway vehicles. Consequently, designing and implementing modern control systems to improve the operating performance of trains and locomotives has become one of the main concerns of researchers. The dynamic braking system is an important safety system that controls the amount of braking torque generated by the traction motors, to keep the adhesion coefficient between the wheel-sets and the rail within an optimal bound. The adhesion force plays an important role in controlling the braking distance and preventing the wheels from slipping during the braking process. Cascade control is one of the best control structures for a wide range of industrial plants in the presence of disturbances and errors. This paper presents a cascade control structure based on two simple forward controllers with two feedback loops to control the slip ratio and the braking torque. In this structure, the inner loop controls the angular velocity and the outer loop controls the longitudinal velocity of the locomotive, whose dynamics are slower than those of the angular velocity. By controlling the torque of the DC traction motors, this control structure tracks the desired velocity profile so as to achieve the predefined braking distance and control the slip ratio. Simulation results demonstrate the effectiveness of the proposed methodology for the dynamic braking system.
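
A heavily simplified sketch of the two-loop arrangement: an outer velocity loop feeds an angular-velocity set-point to an inner loop that produces braking torque. The first-order dynamics, gains, and braking profile are toy values, not the paper's locomotive model.

import numpy as np

def make_pi(kp, ki):
    # Simple PI controller with internal integrator state.
    state = {"i": 0.0}
    def ctrl(err, dt):
        state["i"] += err * dt
        return kp * err + ki * state["i"]
    return ctrl

dt, r = 0.01, 0.5                    # time step [s], wheel radius [m]
v, omega = 20.0, 20.0 / 0.5          # longitudinal [m/s] and angular [rad/s] velocity
outer = make_pi(kp=1.5, ki=0.2)      # slow loop: longitudinal velocity
inner = make_pi(kp=40.0, ki=8.0)     # fast loop: angular velocity -> braking torque

for k in range(1500):
    v_ref = max(0.0, 20.0 - 1.0 * k * dt)            # desired braking profile (1 m/s^2)
    # Outer loop: velocity error -> angular-velocity set-point (commands some slip).
    omega_ref = max(0.0, v - outer(v - v_ref, dt)) / r
    # Inner loop: angular-velocity error -> braking torque (brakes only act one way).
    torque = max(0.0, inner(omega - omega_ref, dt))
    # Toy coupled dynamics: the wheel slows under torque, the body follows the wheel.
    omega += dt * (-torque / 60.0 + 2.0 * (v / r - omega))
    v += dt * (-0.8 * (v - omega * r))

print(f"speed after 15 s: {v:.2f} m/s (reference {v_ref:.2f} m/s)")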

Keywords: Cascade control, dynamic braking system, DC traction motors, slip control.
