Search results for: distributed algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3886

1876 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained Programming

Authors: Homa Ghave, Parmis Shahmaleki

Abstract:

This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches, and for each batch all of the jobs, or some quantity of them, can be outsourced. The jobs have stochastic processing times and lead times, deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, we use chance constrained programming to model the problem. The problem is first formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives of the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem; in this paper, we utilize the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
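As a hedged illustration of the modeling step described above, the sketch below shows the textbook deterministic equivalent of a chance constraint under normally distributed processing times; the job data, service level, and fixed sequence are illustrative assumptions rather than the paper's instances.

```python
# A minimal sketch of the standard deterministic equivalent of a chance constraint
# P(C_j <= d_j) >= alpha when processing times are normally distributed.
# Job data and the service level are hypothetical, not the paper's instances.
from scipy.stats import norm

alpha = 0.95                      # required service level
z = norm.ppf(alpha)               # standard normal quantile

# hypothetical jobs: (mean processing time, std dev, due date)
jobs = [(5.0, 1.0, 12.0), (3.0, 0.5, 9.0), (7.0, 2.0, 20.0)]

start = 0.0                       # start times treated as deterministic for simplicity
for mu, sigma, due in jobs:       # fixed sequence, just to illustrate the conversion
    # chance constraint P(start + p_j <= d_j) >= alpha becomes the deterministic form
    # start + mu + z * sigma <= d_j, which a MILP solver can handle directly.
    deterministic_completion = start + mu + z * sigma
    print(f"job meets due date with prob >= {alpha}: {deterministic_completion <= due}")
    start += mu                   # expected start of the next job
```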

Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function

Procedia PDF Downloads 254
1875 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes

Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar

Abstract:

Most existing skyline query algorithms have focused on querying static points in static databases; however, with the expanding number of sensors, wireless communications and mobile applications, the demand for continuous skyline queries has increased. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. Since skyline computation is based on checking the domination of skyline points over all dimensions, both the static and the dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows: First, it excludes the points which will not be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance and efficiency of our algorithm over other existing approaches.
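The domination test that both the pruning and the update phases rely on is standard; a minimal sketch of it, assuming smaller values are preferred in every dimension (static and dynamic alike) and using made-up points, is given below.

```python
# A minimal sketch of the domination test skyline pruning is built on,
# assuming smaller is better in every dimension (static and dynamic alike).
def dominates(p, q):
    """p dominates q if p is no worse in all dimensions and better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive skyline: keep only points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# hypothetical 3-dimensional points (e.g., two static attributes plus one dynamic distance)
points = [(1, 5, 2.0), (2, 3, 1.5), (3, 4, 3.0), (2, 6, 0.5)]
print(skyline(points))
```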

Keywords: continuous query processing, dynamic database, moving object, skyline queries

Procedia PDF Downloads 201
1874 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms could support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired by a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. Estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR. All rsFMRIs were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128s), (5) spatial smoothing with a Gaussian kernel of FWHM 8mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidian distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in the network), with the R language. The dataset was randomly split into training (75%) and test sets and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM, to obtain a rank of the most predictive variables. Thus, we built two new classifiers only on the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum and working memory networks. These findings, in accordance with the early manifestation of motor/sensorial deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
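A minimal sketch of the two classification-plus-feature-selection pipelines outlined above (RF with Gini importances, SVM with recursive feature elimination) is given below using scikit-learn on random stand-in data of the same shape (37 subjects x 15 network signals); it illustrates the approach, not the study itself.

```python
# A minimal sketch of RF with Gini-based importances and SVM with recursive feature
# elimination, on random stand-in data shaped like the study (37 subjects x 15 features).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 15))                 # stand-in for the 15 mean network signals
y = np.array([0] * 19 + [1] * 18)             # 19 controls, 18 early-MS labels (features are random)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("RF accuracy:", rf.score(X_te, y_te))
print("most important feature (Gini):", int(np.argmax(rf.feature_importances_)))

# RFE needs a linear kernel to rank features by coefficients; the top-ranked feature can
# then be fed to an RBF-SVM, one common way to combine the two steps.
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
best = int(np.argmin(rfe.ranking_))
svm = SVC(kernel="rbf").fit(X_tr[:, [best]], y_tr)
print("RBF-SVM accuracy on top feature:", svm.score(X_te[:, [best]], y_te))
```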

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 228
1873 Perception and Knowledge of the Jordanian Society of Occupational Therapy

Authors: Wesam Darawsheh

Abstract:

Background: there is a scarcity of studies investigating the level of knowledge, awareness and perception of Jordanians about occupational therapy (OT). Aim: to investigate the level of awareness of lay people, clients receiving services and healthcare professionals of OT, identify the common misconceptions about OT, and explore ways whereby the knowledge and awareness about OT can be increased. Methodology: a cross-sectional design was employed in this study, where a survey was distributed in the Northern, Southern, Western, Eastern provinces and the Middle (capital city: Amman) province of Jordan. The survey consisted of eight sections and 61 questions that aim to investigate the demographics of participants, self-evaluation concerning knowledge and awareness about OT, sources of knowledge about OT, the perception of the aims, fields of practice and settings of OT, misconceptions about OT, and suggestions to improve knowledge and awareness about OT. Results: A total of 829 participants were enrolled in this study: 459 lay people, 155 clients who are currently receiving OT services, and 215 healthcare professionals. About 57% of the participants had not heard of OT, and 48% of those who reported having heard of OT did not have sufficient knowledge about it. There are several misconceptions associated with OT. The statistical analysis was executed using IBM SPSS software, Version 22.0 (SPSS, Chicago, USA). Conclusion: it is the responsibility of OTRs to increase the knowledge and awareness about OT in Jordan. This is required for the profession to proliferate and to be given its due status.

Keywords: knowledge, occupational therapy misconceptions, healthcare professionals, lay people, Jordan

Procedia PDF Downloads 355
1872 Parents' Attitude toward Compulsory Pre-School Education in Slovakia

Authors: Sona Lorencova, Beata Hornickova

Abstract:

Compulsory pre-school education in Slovakia will be established by the Education Act for all five-year-old children from September 2021. The implementation of this law will change pre-school education in our country from optional to compulsory, and children will be able to complete this education either in institutional form at school facilities or in the form of individual education at the request of the parent. The primary purpose of this change is that all children achieve pre-school education before entering primary school, thus eliminating differences between children before entering primary school. The benefits of introducing compulsory pre-school education are obvious to the professional public. However, research has paid minimal attention to how this fundamental change in children's education is perceived by parents, who hold a prime position in the upbringing and education of their children. The aim of the study is to interpret the findings of quantitatively oriented research focused on the attitudes of parents to the planned introduction of compulsory pre-school education in Slovakia. The data were obtained through questionnaires primarily intended for parents of pre-school children. In the distributed questionnaire, the degree of agreement or disagreement with individual items could be expressed on a 5-point Likert scale. The results of the research present how compulsory pre-school education is perceived by the parental public in Slovakia and what perspectives and limitations parents anticipate after its introduction.

Keywords: compulsory pre-school education, education act, children's learning and development, kindergarten, parents' perspectives

Procedia PDF Downloads 151
1871 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

This work belongs to a research line of computer science that involves the study of Human-Computer Interaction (HCI) and seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. On the other hand, affective computing research applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analogue-to-digital conversion. The interface software involves algorithms for pre-processing of the signal in time- and frequency-domain analysis and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately, using a database that is accessible to the public, along with a comparison among classifiers in order to identify the best performing one.
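A minimal, hedged sketch of this kind of pipeline (band-pass filtering of one EEG channel, a single band-power feature, and a classifier) is shown below; the sampling rate, band edges, and data are illustrative assumptions, not the authors' setup.

```python
# A minimal sketch: band-pass filter an EEG channel, compute one band-power feature,
# and train a classifier. Sampling rate, band edges, and data are assumed, not the paper's.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

fs = 256.0                                                # assumed sampling rate (Hz)
b, a = butter(4, [8.0, 30.0], btype="bandpass", fs=fs)    # alpha-beta band, a common choice

rng = np.random.default_rng(1)
trials = rng.normal(size=(40, int(2 * fs)))               # 40 fake 2-second trials
labels = np.tile([0, 1], 20)                              # fake emotion labels

filtered = filtfilt(b, a, trials, axis=1)                 # time-domain pre-processing
band_power = np.log(np.var(filtered, axis=1, keepdims=True))  # one frequency-band feature

clf = SVC().fit(band_power[:20], labels[:20])
print("held-out accuracy:", clf.score(band_power[20:], labels[20:]))
```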

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 373
1870 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms

Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li

Abstract:

High-precision measurement of the target’s position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely the Distance Area Number-New Minimize Projection Error (DAN-NMPE). Our algorithm contains two parts: DAN and NMPE; specifically, DAN is a picture sequence algorithm, NMPE is a relatively positive optimization algorithm, which greatly improves the measurement accuracy of the target’s position and size. Comprehensive experiments validate the effectiveness of our proposed method on a self-made traffic sign dataset. The results show that with the laser point cloud as the ground truth, the size and position errors of the traffic sign measured by this method are ± 5% and 0.48 ± 0.3m, respectively. In addition, we also compared it with the current mainstream method, which uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements compared to existing state-of-the-art methods, which improves the measurement accuracy of size and position by 50% and 15.8%, respectively.

Keywords: monocular camera, GPS, positioning, measurement

Procedia PDF Downloads 130
1869 Multilayer Ceramic Capacitors-Based Force Sensor Array for Occlusal Force Measurement

Authors: Sheng-Che Chen, Keng-Ren Lin, Che-Hsin Lin, Hao-Yuan Tseng, Chih-Han Chang

Abstract:

Teeth play an important role in providing the essential nutrients. The force loading of chewing on the crown is an important condition for evaluating the long-term success of many dental treatments. However, how chewing forces are distributed over the dental crown is still not well quantified. This study presents an industrial-grade piezoelectric-based multilayer ceramic capacitor (MLCC) force sensor for measuring the distribution of the force over the first molar. The developed sensor array is based on a flexible polyimide electrode and barium titanate-based MLCCs. MLCCs are commonly used in the electronics industry; a typical MLCC is composed of BaTiO₃, which is used as a capacitive material. Most importantly, it can also be used as a force-sensing component owing to its piezoelectric property. In this study, to increase the sensitivity as well as to reduce the variation between different MLCCs, a treatment process is utilized. The MLCC force sensors are able to measure large forces (above 500 N), making them suitable for measuring the bite forces on the tooth crown. Moreover, the sensors also show good force response and good repeatability.

Keywords: force sensor array, multilayer ceramic capacitors, occlusal force, piezoelectric

Procedia PDF Downloads 404
1868 Training of Future Computer Science Teachers Based on Machine Learning Methods

Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova

Abstract:

The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. The research work was reviewed by students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education," and "8D01511-Computer Science" of L.N. Gumilyov Eurasian National University. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the C++ programming language; it includes detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The considered work forms a basis for the development of machine learning methods by future computer science teachers.
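A minimal sketch of how the three detector families compared above are typically invoked is shown below; the image path and the MMOD weights file are assumptions, and the pretrained OpenCV and dlib models must be available locally.

```python
# A minimal sketch of invoking the three detector families compared above.
import cv2
import dlib

img = cv2.imread("face.jpg")                       # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)         # dlib expects RGB channel order

# 1. Haar Cascade (OpenCV): fastest, least robust to pose and lighting
haar = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
print("Haar:", haar.detectMultiScale(gray))

# 2. HoG + SVM (dlib): a good CPU speed/accuracy trade-off
hog = dlib.get_frontal_face_detector()
print("HoG+SVM:", hog(gray))

# 3. MMOD CNN (dlib): most accurate, needs the downloaded weights file
cnn = dlib.cnn_face_detection_model_v1("mmod_human_face_detector.dat")
print("MMOD CNN:", [d.rect for d in cnn(rgb)])
```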

Keywords: algorithm, artificial intelligence, education, machine learning

Procedia PDF Downloads 64
1867 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation

Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester

Abstract:

In aircraft design, the jump from the conceptual to preliminary design stage introduces a level of complexity which cannot be realistically handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well established ideas from Decision Support Systems. The proposed rule based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases and its performance is compared to a number of classical distributed MDO architectures.

Keywords: multidisciplinary design optimisation, rule based architecture, aircraft design, decision support system

Procedia PDF Downloads 346
1866 Benefits of Construction Management Implications and Processes by Project Managers on Project Completion

Authors: Mamoon Mousa Atout

Abstract:

Project managers in the construction industry usually face a difficult organizational environment, especially if the project is unique. The organization may lack the processes to practice construction management correctly, and its executive technical managers may lack the experience to carry out their roles and responsibilities correctly. Project managers need to adopt best practices that allow them to work effectively and ensure that the project can be delivered without any delay, while the executive technical managers should follow a defined process to avoid any factor that might cause a delay during the project life cycle. The purpose of the paper is to examine the awareness level of project managers about construction management processes, tools, techniques and implications needed to complete projects on time. The outcomes and results of the study are based on questionnaires and interviews conducted with many project managers. The method used in this paper is a quantitative study. A survey with a sample of 100 respondents, comprising nine questions to examine the level of their awareness, was prepared and distributed in a construction company in Dubai. This research also identifies the benefits of construction management processes that have to be adopted by project managers to mitigate the potential problems which might cause delays in the project life cycle.

Keywords: construction management, project objectives, resource planning and scheduling, project completion

Procedia PDF Downloads 390
1865 Minimum Vertices Dominating Set Algorithm for Secret Sharing Scheme

Authors: N. M. G. Al-Saidi, K. A. Kadhim, N. A. Rajab

Abstract:

Over the past decades, computer networks and data communication systems have been developing fast, so the necessity to protect transmitted data has become a challenging issue, and data security is a serious problem nowadays. A secret sharing scheme is a method which allows a master key to be distributed among a finite set of participants in such a way that only certain authorized subsets of participants can reconstruct the original master key. To create a secret sharing scheme, many mathematical structures have been used; the most widely used structure is the one that is based on graph theory (graph access structure). Subsequently, many researchers have tried to find efficient schemes based on graph access structures. In this paper, we propose a novel efficient construction of a perfect secret sharing scheme for a uniform access structure. The dominating set of vertices in a regular graph is used for this construction in the following way: each vertex represents a participant, and each minimum independent dominating subset represents a minimal qualified subset. Some relations between the dominating set, graph order and regularity are derived, and can be used to demonstrate the possibility of using a dominating set to construct a secret sharing scheme. The information rate, which is used as a measure of the efficiency of such systems, is calculated to show that the proposed method offers some improved values.
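A minimal sketch of the graph-theoretic ingredient described above (finding an independent dominating set whose vertices stand for a qualified subset of participants) is given below; the greedy routine and the choice of the 3-regular Petersen graph are illustrative, not the paper's construction.

```python
# A minimal sketch: a greedy (not necessarily minimum) independent dominating set in a
# regular graph, with vertices playing the role of participants. Petersen graph is 3-regular.
import networkx as nx

G = nx.petersen_graph()                     # 3-regular graph on 10 vertices

def greedy_independent_dominating_set(G):
    """Greedily pick pairwise non-adjacent vertices until every vertex is dominated."""
    chosen, dominated = set(), set()
    for v in sorted(G.nodes, key=G.degree, reverse=True):
        if v not in dominated and not any(u in chosen for u in G[v]):
            chosen.add(v)
            dominated.add(v)
            dominated.update(G[v])
    return chosen

D = greedy_independent_dominating_set(G)
print("qualified subset of participants:", D)
print("dominating:", all(v in D or any(u in D for u in G[v]) for v in G.nodes))
```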

Keywords: secret sharing scheme, dominating set, information rate, access structure, rank

Procedia PDF Downloads 383
1864 Performance Evaluation of Hierarchical Location-Based Services Coupled to the Greedy Perimeter Stateless Routing Protocol for Wireless Sensor Networks

Authors: Rania Khadim, Mohammed Erritali, Abdelhakim Maaden

Abstract:

Nowadays, Wireless Sensor Networks have attracted worldwide research and industrial interest because they can be applied in various areas. Geographic routing protocols are very suitable for those networks because they use location information when they need to route packets. Obviously, location information is maintained by Location-Based Services provided by network nodes in a distributed way. In this paper, we evaluate the performance of two hierarchical rendezvous location-based services, GLS (Grid Location Service) and HLS (Hierarchical Location Service), coupled to the GPSR routing protocol (Greedy Perimeter Stateless Routing) for Wireless Sensor Networks. The simulations were performed using the NS2 simulator to evaluate the performance and power of the two services in terms of location overhead, request travel time (RTT) and query success ratio (QSR). This work also presents a new scalability performance study of both GLS and HLS, specifically, what happens as the number of nodes N increases. The study focuses on three qualitative metrics: the location maintenance cost, the location query cost and the storage cost.
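For context, the sketch below shows the greedy mode of GPSR that both location services ultimately feed with positions: each node forwards a packet to the neighbor geographically closest to the destination. The topology and coordinates are made up, and perimeter mode (the fallback at local maxima) is only indicated, not implemented.

```python
# A minimal sketch of GPSR greedy forwarding on a made-up topology.
import math

positions = {                      # hypothetical node coordinates
    "A": (0, 0), "B": (2, 1), "C": (4, 0), "D": (5, 3), "DST": (7, 3),
}
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "DST"], "DST": ["D"]}

def dist(u, v):
    return math.dist(positions[u], positions[v])

def greedy_forward(src, dst):
    path, node = [src], src
    while node != dst:
        # pick the neighbor that makes the most geographic progress toward dst
        nxt = min(neighbors[node], key=lambda n: dist(n, dst))
        if dist(nxt, dst) >= dist(node, dst):
            raise RuntimeError("local maximum: GPSR would switch to perimeter mode here")
        path.append(nxt)
        node = nxt
    return path

print(greedy_forward("A", "DST"))
```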

Keywords: location based-services, routing protocols, scalability, wireless sensor networks

Procedia PDF Downloads 356
1863 A Plan of Smart Management for Groundwater Resources

Authors: Jennifer Chen, Pei Y. Hsu, Yu W. Chen

Abstract:

Groundwater resources play a vital role in regional water supply because over 1/3 of total demand is satisfied by groundwater. Because over-pumpage might cause environmental impacts such as land subsidence, sustainable management of groundwater resources is required. In this study, a blueprint of smart management for groundwater resources is proposed and planned. The framework of the smart management can be divided into two major parts, a hardware part and a software part. First, an internet of groundwater (IoG), inspired by the internet of things (IoT), is proposed to observe the migration of groundwater usage and the associated response, groundwater levels. Second, algorithms based on data mining and signal analysis are proposed to achieve the goal of providing highly efficient management of groundwater. The entire blueprint is a 4-year plan, and this year is the first year. We have finished the installation of 50 flow meters and 17 observation wells. An underground hydrological model is proposed to determine the drawdown caused by the measured pumpages. Besides, an alternative to the flow meter is also proposed to decrease the installation cost of the IoG: an accelerometer and 3G remote transmission are proposed to detect the switching on and off of groundwater pumpage.

Keywords: groundwater management, internet of groundwater, underground hydrological model, alternative of flow meter

Procedia PDF Downloads 367
1862 An Analysis of Non-Elliptic Curve Based Primality Tests

Authors: William Wong, Zakaria Alomari, Hon Ching Lai, Zhida Li

Abstract:

Modern-day information security depends on implementing Diffie-Hellman, which requires the generation of prime numbers. Because the number of primes is infinite, it is impractical to store prime numbers for use, and therefore primality tests are indispensable in modern-day information security. A primality test is a test to determine whether a number is prime or composite. There are two types of primality tests: deterministic tests and probabilistic tests. Deterministic tests adopt algorithms that provide a definite answer as to whether a given number is prime or composite. Probabilistic tests provide a probabilistic result, so there is a degree of uncertainty. In this paper, we review three probabilistic tests: the Fermat Primality Test, the Miller-Rabin Test, and the Baillie-PSW Test, as well as one deterministic test, the Agrawal-Kayal-Saxena (AKS) Test. Furthermore, we provide an analysis of these tests. None of the tests discussed are based on elliptic curves. The analysis demonstrates that, in the majority of real-world scenarios, the Baillie-PSW test's favorability stems from its typical operational complexity of O(log³ n) and its capacity to deliver accurate results for numbers below 2^64.
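As a concrete illustration of the probabilistic category, below is a minimal sketch of the Miller-Rabin test with random bases; the number of rounds and the test values are arbitrary choices for the example.

```python
# A minimal sketch of the Miller-Rabin probabilistic test: with random bases, a composite
# slips through with probability at most 4^-rounds.
import random

def miller_rabin(n, rounds=40):
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # write n - 1 as d * 2^r with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness that n is composite
    return True                   # probably prime

print(miller_rabin(2**61 - 1))    # a known Mersenne prime -> True
print(miller_rabin(3215031751))   # a strong pseudoprime to small bases, but composite -> False
```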

Keywords: primality tests, Fermat’s primality test, Miller-Rabin primality test, Baillie-PSW primality test, AKS primality test

Procedia PDF Downloads 75
1861 White Light Emission through Downconversion of Terbium and Europium Doped CeF3 Nanophosphors

Authors: Mohit Kalra, Varun S., Mayuri Gandhi

Abstract:

CeF3 nanophosphors have been extensively investigated in recent years for lighting and numerous bio-applications. Down-conversion emissions in CeF3:Eu3+/Tb3+ phosphors were studied with the aim of obtaining a white light emitting composition, by a simple co-precipitation method. The material was characterized by X-ray Diffraction (XRD), High Resolution Transmission Electron Microscopy (HR-TEM), Fourier Transform Infrared Spectroscopy (FT-IR) and Photoluminescence (PL). Uniformly distributed nanoparticles were obtained with an average particle size of 8-10 nm. Different doping concentrations were prepared, and a fluorescence study was carried out to optimize the dopant concentration for maximum luminescence intensity. The steady-state and time-resolved luminescence studies confirmed efficient energy transfer from the host to the activator ions. Different concentrations of Tb3+ and Eu3+ were doped to achieve a white light emitting phosphor for UV-based Light Emitting Diodes (LEDs). The nanoparticles showed the characteristic emission of the respective dopants (Eu3+, Tb3+) when excited at the 4f→5d transition of Ce3+. The chromaticity coordinates for these samples were calculated, and the CeF3 doped with Eu3+ and Tb3+ gave an emission very close to white light. These materials may find applications in optoelectronics and various bio-applications.

Keywords: white light down-conversion, nanophosphors, LEDs, rare earth, cerium fluoride, lanthanides

Procedia PDF Downloads 395
1860 Cellulose Acetate/Polyacrylic Acid Filled with Nano-Hydroxyapatite Composites: Spectroscopic Studies and Search for Biomedical Applications

Authors: E. M. AbdelRazek, G. S. ElBahy, M. A. Allam, A. M. Abdelghany, A. M. Hezma

Abstract:

Polymeric biocomposites of hydroxyapatite/polyacrylic acid were prepared, and their thermal and mechanical properties were improved by the addition of cellulose acetate. FTIR spectroscopy and X-ray diffraction analysis were employed to examine the physical and chemical characteristics of the biocomposites. Two organic/inorganic composite weight ratios (60/40 and 70/30), at which the material crystallinity reaches a considerable value appropriate for the needed applications, were studied; scanning electron microscopy revealed that the HAp nano-particles are uniformly distributed through the polymeric matrix. Kinetic parameters were determined from the weight loss data using non-isothermal thermogravimetric analysis (TGA). Also, the main degradation steps were described and discussed. The mechanical properties of the composites were evaluated by measuring tensile strength and elastic modulus. The data indicate that the addition of cellulose acetate can make the homogeneous composite scaffolds significantly more resistant to higher stress. The elastic modulus of the composites was also improved by the addition of cellulose acetate, making them more appropriate for bio-applications.

Keywords: biocomposite, chemical synthesis, infrared spectroscopy, mechanical properties

Procedia PDF Downloads 449
1859 Barriers Facing the Implementation of Lean Manufacturing in Libyan Manufacturing Companies

Authors: Mohamed Abduelmula, Martin Birkett, Chris Connor

Abstract:

Lean Manufacturing has developed from being a set of tools and methods to becoming a management philosophy which can be used to remove or reduce waste in manufacturing processes and so enhance the operational productivity of an enterprise. Several enterprises around the world have applied the lean manufacturing system and gained great improvements. This paper investigates the barriers and obstacles that Libyan manufacturing companies face in implementing lean manufacturing. A mixed-method approach is suggested, starting with a questionnaire to get quantitative data and then using this to develop semi-structured interviews to collect qualitative data. The findings of the questionnaire, and how these can be used to further develop the semi-structured interviews, are then discussed. The survey was distributed to 65 manufacturing companies in Libya, and a response rate of 64.6% was obtained. The results showed that there are five main barriers to implementing lean in Libya, namely organizational culture; skills, expertise and training programs; financial capability; top management; and communication. These barriers were also identified in the literature as significant obstacles to implementing Lean in other countries' industries. Understanding the difficulties that face the implementation of lean manufacturing systems, as a new and modern system, and using this to develop a suitable framework will help to improve the manufacturing sector in Libya.

Keywords: lean manufacturing, barriers, questionnaire, Libyan manufacturing companies

Procedia PDF Downloads 231
1858 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; thus, complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes in operations in Computer-Aided Process Planning (CAPP) to make CAPP manageable for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by finding objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, this kind of decision depends on the experience-based knowledge of humans (e.g. process planners) and results in subjective decisions, leading to variability in workpiece quality and potential failures in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes. Especially in applications like the aerospace industry, standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, will be used for modeling and execution of inspection workflows. Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, will be carried out by function blocks. One advantage of this approach is its flexibility to design workflows and to adapt algorithms specific to the application domain. In general, within the specified tolerance range it will be checked whether a geometrical adaption is possible. The development of particular function blocks is predicated on workpiece-specific information, e.g. design data. Furthermore, for different product lifecycle phases, appropriate logics and decision criteria have to be considered. For example, tolerances for geometric deviations differ in type and size for new-part production compared to repair processes. In addition to function blocks, appropriate referencing systems are important. They need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, containing a new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
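A minimal, hedged sketch of the workflow idea is given below: each inspection step is a small function block with a uniform interface, and the inspection workflow is an ordered chain of such blocks acting on a shared context. The block names, tolerance, and decision logic are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of chaining inspection steps as function blocks over a shared context.
def align_measurement_to_cad(ctx):
    ctx["aligned"] = True                         # stand-in for best-fit registration
    return ctx

def check_geometric_deviation(ctx):
    ctx["within_tolerance"] = ctx["max_deviation_mm"] <= ctx["tolerance_mm"]
    return ctx

def decide_adaptive_manufacturability(ctx):
    ctx["decision"] = "adaptive machining" if ctx["within_tolerance"] else "reject / repair route"
    return ctx

workflow = [align_measurement_to_cad, check_geometric_deviation, decide_adaptive_manufacturability]

context = {"max_deviation_mm": 0.35, "tolerance_mm": 0.5}   # hypothetical preform measurement
for block in workflow:
    context = block(context)
print(context["decision"])
```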

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 291
1857 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it might be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities of debugging, the online monitoring of communication among processes via a DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in a detailed way. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 309
1856 Quality Assurance in Cardiac Disorder Detection Images

Authors: Anam Naveed, Asma Andleeb, Mehreen Sirshar

Abstract:

In this article, image processing techniques applied to cardiac images for enhancing image quality are surveyed. Two types of methodologies are considered: invasive techniques and non-invasive techniques. Different image processing methods for improving cardiac image quality and reducing the amount of radiation exposure in invasive techniques are explored. Different image processing algorithms for enhancing non-invasive cardiac image quality are described. Besides these two methodologies, a third methodology, applied to live streaming of the heart rate in an ECG window for extracting necessary information, removing noise and enhancing quality, is covered. Sensitivity analyses have been carried out to investigate the impact of cardiac images on the diagnosis of cardiac artery disease and how image enhancement will help the cardiologist to diagnose disease. The paper evaluates the strengths and weaknesses of the different techniques applied to improve image quality and draws a conclusion. Some specific limitations must be considered for the whole survey; for example, the patient's heart beat must be 70-75 beats/minute while doing the angiography, and similarly the patient's weight and the radiation exposure amount have some limitations.

Keywords: cardiac images, CT angiography, critical analysis, exposure radiation, invasive techniques, non-invasive techniques

Procedia PDF Downloads 333
1855 Numerical Investigation of Static and Dynamic Responses of Fiber Reinforced Sand

Authors: Sandeep Kumar, Mahesh Kumar Jat, Rajib Sarkar

Abstract:

Soil reinforced with randomly distributed fibers is an attractive means to improve the performance of soil in a cost-effective manner. Static and dynamic characterization of fiber-reinforced soil has become important for evaluating adequate performance in all classes of geotechnical engineering problems. The present study investigates the behaviour of fiber-reinforced cohesionless soil through numerical simulation of a triaxial specimen. The numerical model has been validated against existing literature on laboratory triaxial compression testing. A parametric study has been done to find the optimum fiber content for shear resistance. Cyclic triaxial testing has been simulated, and the stress-strain response of fiber-reinforced sand has been examined considering different fiber contents. Shear modulus values and damping values of fiber-reinforced sand are evaluated. It has been observed from the results that for 1.0 percent fiber content, the shear modulus increased 2.28 times and the damping ratio decreased 4.6 times. The influence of the amplitude of cyclic strain, confining pressure and frequency of loading on the dynamic properties of fiber-reinforced sand has been investigated and presented.

Keywords: damping, fiber reinforced soil, numerical modelling, shear modulus

Procedia PDF Downloads 264
1854 A Comprehensive Safety Analysis for a Pressurized Water Reactor Fueled with Mixed-Oxide Fuel as an Accident Tolerant Fuel

Authors: Mohamed Y. M. Mohsen

Abstract:

The viability of utilising mixed-oxide fuel (MOX) ((U₀.₉, rgPu₀.₁) O₂) as an accident-tolerant fuel (ATF) has been thoroughly investigated. MOX fuel provides the best example of a nuclear waste recycling process. The MCNPX 2.7 code was used to determine the main neutronic features, especially the radial power distribution, to identify the hot channel on which the thermal-hydraulic (TH) study was performed. Based on the computational fluid dynamics technique, the simulation of the rod-centered thermal-hydraulic subchannel model was implemented using COMSOL Multiphysics. TH analysis was utilised to determine the axially and radially distributed temperatures of the fuel and cladding materials, as well as the departure from the nucleate boiling ratio (DNBR) along the coolant channel. COMSOL Multiphysics can simulate reality by coupling multiphysics, such as coupling between heat transfer and solid mechanics. The main solid structure parameters, such as the von Mises stress, volumetric strain, and displacement, were simulated using this coupling. When the neutronic, TH, and solid structure performances of UO₂ and ((U₀.₉, rgPu₀.₁) O₂) were compared, the results showed considerable improvement and an increase in safety margins with the use of ((U₀.₉, rgPu₀.₁) O₂).

Keywords: mixed-oxide, MCNPX, neutronic analysis, COMSOL-multiphysics, thermal-hydraulic, solid structure

Procedia PDF Downloads 93
1853 Load Forecasting Using Neural Network Integrated with Economic Dispatch Problem

Authors: Mariyam Arif, Ye Liu, Israr Ul Haq, Ahsan Ashfaq

Abstract:

The high cost of fossil fuels and the intensifying installation of alternative energy generation sources are the main challenges in power systems. This makes accurate load forecasting an important and challenging task for optimal energy planning and management at both the distribution and generation sides. There are many techniques to forecast load, but each technique comes with its own limitations and requires data to accurately predict the load. The Artificial Neural Network (ANN) is one such technique to efficiently forecast the load. A comparison between two different ranges of input datasets has been applied to the dynamic ANN technique using the MATLAB Neural Network Toolbox. It has been observed that the selection of input data for training a network has significant effects on the forecasted results. Day-wise input data forecasted the load accurately as compared to year-wise input data. The forecasted load is then distributed among six generators by using linear programming to get the optimal point of generation. The algorithm is then verified by comparing the results of each generator with their respective generation limits.
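A minimal sketch of the dispatch step described above, splitting a forecasted load among generators by linear programming subject to unit limits, is shown below; the costs, limits, and forecast value are hypothetical stand-ins for the ANN-produced forecast and the six generators used in the paper.

```python
# A minimal sketch of economic dispatch by linear programming: meet a forecasted load
# at minimum linear cost, respecting each generator's limits. All numbers are assumed.
from scipy.optimize import linprog

forecast_load = 900.0                          # MW, stand-in for the ANN forecast
cost = [20, 25, 30, 18, 22, 28]                # $/MWh per generator (assumed linear costs)
p_min = [50, 50, 50, 100, 100, 50]             # MW lower limits
p_max = [200, 200, 150, 300, 250, 150]         # MW upper limits

res = linprog(
    c=cost,
    A_eq=[[1] * 6], b_eq=[forecast_load],      # total generation must meet the forecast
    bounds=list(zip(p_min, p_max)),
)
print("dispatch (MW):", res.x.round(1), "total cost ($/h):", round(res.fun, 1))
```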

Keywords: artificial neural networks, demand-side management, economic dispatch, linear programming, power generation dispatch

Procedia PDF Downloads 176
1852 Polarity Classification of Social Media Comments in Turkish

Authors: Migena Ceyhan, Zeynep Orhan, Dimitrios Karras

Abstract:

People in modern societies are continuously sharing their experiences, emotions, and thoughts in different areas of life. The information reaches almost everyone in real time and can have an important impact in shaping people's way of living. This phenomenon is very well recognized and advantageously used by market representatives, trying to earn the most from this medium. Given the abundance of information, people and organizations are looking for efficient tools that filter the countless data into important information, ready to analyze. This paper is a modest contribution in this field, describing the process of automatically classifying social media comments in the Turkish language into positive or negative. Once data is gathered and preprocessed, feature sets of selected single words or groups of words are built according to the characteristics of the language used in the texts. These features are later used to train and test a system according to different machine learning algorithms (Naïve Bayes, Sequential Minimal Optimization, J48, and Bayesian Linear Regression). The resultant high accuracies can be important feedback for decision-makers to improve their business strategies accordingly.
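A minimal sketch of one such classifier (Naïve Bayes over a bag-of-words representation) is shown below with scikit-learn; the four toy Turkish comments and their labels are invented stand-ins for the real corpus.

```python
# A minimal sketch: Naive Bayes polarity classifier over word/word-pair counts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# toy Turkish comments (roughly: "great product", "shipping was late, terrible",
# "good price and quality", "not satisfied at all")
comments = ["ürün harika, çok beğendim", "kargo çok geç geldi, berbat",
            "fiyatı uygun ve kaliteli", "hiç memnun kalmadım"]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(comments, labels)
print(model.predict(["kalitesi harika ama kargo geç geldi"]))
```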

Keywords: feature selection, machine learning, natural language processing, sentiment analysis, social media reviews

Procedia PDF Downloads 140
1851 Sustainable Use of Fresh Groundwater Lens of Pleistocene Aquifer in Nam Dinh, Vietnam

Authors: Tran Thanh Le, Pham Trong Duc

Abstract:

The fresh groundwater lens of the Pleistocene aquifer in Nam Dinh was formed about 12,900 years ago. Currently, the Pleistocene aquifer is exploited at an average of 154,163 m3/day, distributed mainly in the districts of Nghia Hung and Hai Hau and parts of Truc Ninh, Y Yen, Nam Truc and Giao Thuy. The groundwater level is still on a declining trend, and saltwater intrusion into this freshwater lens can occur if the growth rate in exploitation is maintained. This study focused on sustainable groundwater use by means of 4 groups of criteria, including: groundwater quality and pollution; aquifer productivity and capacity; environmental impacts due to exploitation (groundwater level decline, land subsidence due to water exploitation); and social and economic impacts. Using a combination of methods including field surveys, geophysics, hydrogeochemistry, isotopes and numerical models, the safe groundwater exploitation threshold for the whole study area has been determined to be 544,314 m3/day, and the actual exploitation amount is currently about 30% of this safe threshold. However, it should also be noted that the groundwater exploitation threshold and the level of exploitation relative to the safe threshold are not the same in each locality. From this result, a groundwater exploitation threshold map of the study area was established to serve the management, licensing and orientation of groundwater exploitation.

Keywords: criteria, groundwater, fresh groundwater lens, pleistocene, Nam Dinh

Procedia PDF Downloads 149
1850 IEEE802.15.4e Based Scheduling Mechanisms and Systems for Industrial Internet of Things

Authors: Ho-Ting Wu, Kai-Wei Ke, Bo-Yu Huang, Liang-Lin Yan, Chun-Ting Lin

Abstract:

With the advances in technology, wireless sensor networks (WSN) have become one of the most promising candidates to implement the wireless industrial internet of things (IIOT) architecture. However, legacy IEEE 802.15.4 based WSN technology such as the Zigbee system cannot meet the stringent QoS requirements of low-power, real-time, and highly reliable transmission imposed by the IIOT environment. Recently, the IEEE developed the IEEE 802.15.4e Time Slotted Channel Hopping (TSCH) access mode to serve this purpose. Furthermore, the IETF 6TiSCH working group has proposed standards to integrate IEEE 802.15.4e with the IPv6 protocol smoothly to form a complete protocol stack for IIOT. In this work, we develop key network technologies for an IEEE 802.15.4e based wireless IIoT architecture, focusing on practical design and system implementation. We realize an OpenWSN-based wireless IIOT system. The system architecture is divided into three main parts: web server, network manager, and sensor nodes. The web server provides the user interface, allowing the user to view the status of sensor nodes and instruct sensor nodes to follow commands via a user-friendly browser. The network manager is responsible for the establishment, maintenance, and management of scheduling and topology information. It executes the centralized scheduling algorithm, sends the scheduling table to each node, and manages the sensing tasks of each device. Sensor nodes complete the assigned tasks and send the sensed data. Furthermore, to prevent scheduling errors due to packet loss, a schedule inspection mechanism is implemented to verify the correctness of the schedule table. In addition, when the network topology changes, the system will act to generate a new schedule table based on the changed topology to ensure the proper operation of the system. To enhance the performance of such a system, we further propose dynamic bandwidth allocation and distributed scheduling mechanisms. The developed distributed scheduling mechanism enables each individual sensor node to build, maintain and manage the dedicated link bandwidth with its parent and children nodes based on locally observed information by exchanging Add/Delete commands via two processes. The first process, termed the schedule initialization process, allows each sensor node pair to identify the available idle slots to allocate the basic dedicated transmission bandwidth. The second process, termed the schedule adjustment process, enables each sensor node pair to adjust their allocated bandwidth dynamically according to the measured traffic loading. Such technology can sufficiently satisfy the dynamic bandwidth requirements in frequently changing environments. Last but not least, we propose a packet retransmission scheme to enhance the performance of the centralized scheduling algorithm when the packet delivery rate (PDR) is low. We propose a multi-frame retransmission mechanism to allow every single network node to resend each packet at least a predefined number of times. The multi-frame architecture is built according to the number of layers of the network topology. Performance results via simulation reveal that such a retransmission scheme is able to provide sufficiently high transmission reliability while maintaining low packet transmission latency. Therefore, the QoS requirements of IIoT can be achieved.
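A minimal, hedged sketch of the schedule adjustment idea (a node comparing its traffic against its currently allocated cells and issuing Add/Delete requests toward its parent) is given below; the slotframe length, margin, and cell layout are illustrative assumptions, not the paper's parameters.

```python
# A minimal sketch of the distributed schedule-adjustment step: compare measured traffic
# with the cells currently allocated on the link to the parent and decide Add/Delete/keep.
SLOTFRAME_LEN = 101                    # slots per slotframe (a typical 6TiSCH value, assumed)

def adjust_cells(allocated_cells, queued_packets, margin=1):
    """Return ('add', n), ('delete', n) or ('keep', 0) for the dedicated link to the parent."""
    required = min(queued_packets + margin, SLOTFRAME_LEN)   # keep a small safety margin
    if required > len(allocated_cells):
        return ("add", required - len(allocated_cells))
    if required < len(allocated_cells):
        return ("delete", len(allocated_cells) - required)
    return ("keep", 0)

# hypothetical state: 3 dedicated (timeslot, channel) cells allocated, 6 packets queued
cells = [(5, 2), (37, 4), (80, 1)]
print(adjust_cells(cells, queued_packets=6))       # -> ('add', 4)
```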

Keywords: IEEE 802.15.4e, industrial internet of things (IIOT), scheduling mechanisms, wireless sensor networks (WSN)

Procedia PDF Downloads 150
1849 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year in 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example storing the data, searching for information, and finding hidden information. It is necessary to develop an analysis platform for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and forms the core of Big Data as a Service (BDaaS). Although many services, e.g. Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal with genomics big data, e.g. sequencing data, more efficiently. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g. de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
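A minimal, single-machine sketch of the hybrid idea is given below: a map step emits (key, fuzzy membership) pairs and a reduce step aggregates them. The triangular membership function, the GC-content feature, and the toy reads are illustrative assumptions, not the proposed algorithm itself.

```python
# A minimal sketch of combining a MapReduce-style map/reduce with fuzzy membership values.
from collections import defaultdict

def triangular(x, a, b, c):
    """Simple triangular fuzzy membership on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mapper(read):
    gc = (read.count("G") + read.count("C")) / len(read)
    yield ("high_gc", triangular(gc, 0.4, 0.6, 0.8))   # fuzzy degree instead of a hard label

def reducer(key, values):
    return key, sum(values) / len(values)              # average membership per key

reads = ["ACGTGGCC", "ATATATAT", "GGGCCCGA", "ACGTACGT"]    # toy sequencing reads
groups = defaultdict(list)
for read in reads:
    for key, value in mapper(read):
        groups[key].append(value)
print([reducer(k, v) for k, v in groups.items()])
```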

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 286
1848 Urban Transformation as a Process for Inner-City Slums in Turkey: The Experience of Gaziantep City, Turkey

Authors: Samer Katerji, Mustafa Ozakça, Esra Demircioğlu

Abstract:

Inner-city slums have become a global problem. They are widely distributed in separate zones throughout the urban texture and threaten cities in physical, economic and social respects. They often consist of illegal settlements with unsafe and unhealthy conditions, and over time they have grown rapidly, along with their problems. According to the United Nations, in some cities up to 80 percent of the population lives in slums, and fifty-five million new slum dwellers have been added to the global population since 2000. Both developed and developing countries have started to work out mechanisms to find solutions suitable for solving inner-city slum problems. In turn, the planning agenda of Turkey has focused on urban transformation as a solution for inner-city slum problems since the 2000s. The laws introduced after 2004 changed all of the provisions on urban transformation in the country. This paper explains the urban transformation approach as a qualified process for dealing with the inner-city slum problems of Turkey. After that, it highlights one of the earliest ongoing transformation projects in Gaziantep city, which has been adopted by the local municipalities. The study includes an assessment of the pros and cons of pursuing the project and identifies the potential consequences. This is likely to support the efforts of Gaziantep Municipality in developing and transforming slum areas.

Keywords: transformation, urban, slums, Gaziantep

Procedia PDF Downloads 492
1847 Digital Control Algorithm Based on Delta-Operator for High-Frequency DC-DC Switching Converters

Authors: Renkai Wang, Tingcun Wei

Abstract:

In this paper, a digital control algorithm based on the delta operator is presented for high-frequency digitally-controlled DC-DC switching converters. The stability and the control accuracy of the DC-DC switching converters are improved by using the delta-operator-based digital control algorithm without increasing the hardware circuit scale. The design method of the voltage compensator in the delta domain using PID (Proportion-Integration-Differentiation) control is given in this paper, and simulation results based on the Simulink platform are provided, which verify the theoretical analysis results very well. It can be concluded that the presented delta-operator-based control algorithm has better stability and control accuracy, and easier hardware implementation, than the existing control algorithms based on the z-operator; therefore, it can be used for voltage compensator design in high-frequency digitally-controlled DC-DC switching converters.
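A minimal sketch of a PID compensator written directly in delta-operator form (δ = (q − 1)/Ts, with q the forward shift and Ts the sample period) is given below; the gains, sample period, and the crude plant stand-in are assumptions for illustration only.

```python
# A minimal sketch of a delta-domain PID update: the integrator and differentiator are
# scaled by the sample period Ts, as the delta operator delta = (q - 1) / Ts suggests.
class DeltaPID:
    def __init__(self, kp, ki, kd, ts):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.ts                    # delta-domain integrator
        derivative = (error - self.prev_error) / self.ts    # delta-domain differentiator
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = DeltaPID(kp=0.5, ki=2000.0, kd=1e-7, ts=1e-6)   # hypothetical gains, 1 MHz sampling
setpoint, output = 1.0, 0.0
for _ in range(5):
    duty = pid.update(setpoint - output)
    output += 0.2 * duty          # crude stand-in for the converter's response
    print(round(output, 4))
```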

Keywords: digitally-controlled DC-DC switching converter, digital voltage compensator, delta-operator, finite word length, stability

Procedia PDF Downloads 400