Search results for: process component
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6195

4335 A New Extended Group Mutual Exclusion Algorithm with Low Message Complexity in Distributed Systems

Authors: S. Dehghan, A.M. Rahmani

Abstract:

The group mutual exclusion (GME) problem is an interesting generalization of the mutual exclusion problem. In group mutual exclusion, multiple processes can enter a critical section simultaneously if they belong to the same group. In the extended group mutual exclusion, each process is a member of multiple groups at the same time. As a result, once a process has selected a group and entered the critical section, other processes that also belong to that group can select it and enter the critical section at the same time, which avoids their unnecessary blocking. This paper presents a quorum-based distributed algorithm for the extended group mutual exclusion problem. The message complexity of our algorithm is O(4Q) in the best case and O(5Q) in the worst case, where Q is the quorum size.
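
The abstract only states the message-complexity bounds, so the following minimal sketch does no more than turn those bounds into a helper; it is not the authors' algorithm, and the function name and quorum size are illustrative.

```python
# Minimal sketch of the stated bounds: each critical-section entry costs 4 messages per
# quorum member in the best case and 5 in the worst case, i.e. O(4Q) and O(5Q).
def gme_message_bounds(quorum_size: int) -> tuple[int, int]:
    best_case = 4 * quorum_size
    worst_case = 5 * quorum_size
    return best_case, worst_case

print(gme_message_bounds(quorum_size=7))  # a quorum of 7 members -> (28, 35)
```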

Keywords: Group Mutual Exclusion (GME), Extended GME, Distributed systems.

4334 Evaluation of Thrombolytic Activity of Zingiber cassumunar Roxb. and Thai Herbal Prasaplai Formula

Authors: Warachate Khobjai, Suriyan Sukati, Khemjira Jarmkom, Pattaranut Eakwaropas, Surachai Techaoei

Abstract:

The purpose of this study was to investigate the in vitro thrombolytic activity of Zingiber cassumunar Roxb. and Prasaplai, a Thai herbal formulation containing Z. cassumunar Roxb. Herbs were extracted with boiling water and concentrated by lyophilization. To observe their thrombolytic potential, an in vitro clot lysis method was applied in which streptokinase and sterile distilled water were used as positive and negative controls, respectively. Crude aqueous extracts of Z. cassumunar Roxb. and the Prasaplai formula showed significant thrombolytic activity, with clot lysis of 17.90% and 25.21%, respectively, compared to the negative control water (5.16%), while the standard streptokinase gave 64.78% clot lysis. These findings suggest that Z. cassumunar Roxb. exhibits moderate thrombolytic activity and could play an important role in the thrombolytic properties of the Prasaplai formula. However, further study should be done to observe in vivo clot-dissolving potential and to isolate the active component(s) of these extracts.
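
The abstract reports clot lysis percentages but not the weighing protocol; the sketch below shows the arithmetic commonly used in in vitro clot lysis assays (released clot weight over initial clot weight), with hypothetical weights chosen only to illustrate the calculation.

```python
# Hedged sketch of the percent-clot-lysis arithmetic used in typical in vitro assays;
# the exact protocol of this study is not given in the abstract.
def percent_clot_lysis(clot_weight_before_g: float, clot_weight_after_g: float) -> float:
    released = clot_weight_before_g - clot_weight_after_g
    return 100.0 * released / clot_weight_before_g

# Hypothetical weights picked to land near the reported ~25.21% for the Prasaplai extract.
print(round(percent_clot_lysis(0.500, 0.374), 2))  # -> 25.2
```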

Keywords: Aqueous extract, prasaplai formula, thrombolytic activity, Zingiber cassumunar Roxb.

4333 Image Ranking to Assist Object Labeling for Training Detection Models

Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman

Abstract:

Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data, curated by this algorithm, resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Also, similar performance is achieved compared to a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.
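
The iterative select-label-retrain loop described above can be summarized in a short sketch; the novelty-scoring function, the detector trainer and the data handles are placeholders, not the paper's U-shaped ranking network.

```python
# Schematic of the active labeling loop from the abstract. `score_novelty`, `label_fn`
# and `train_detector` are placeholders supplied by the user; batch size and number of
# rounds are illustrative.
def active_labeling_loop(unlabeled_images, label_fn, score_novelty, train_detector,
                         batch_size=50, rounds=5):
    labeled, detector = [], None
    for _ in range(rounds):
        if not unlabeled_images:
            break
        # Rank remaining images by how much unseen (novel) content they contain.
        ranked = sorted(unlabeled_images, key=score_novelty, reverse=True)
        batch, unlabeled_images = ranked[:batch_size], ranked[batch_size:]
        labeled.extend((img, label_fn(img)) for img in batch)  # manual annotation step
        detector = train_detector(labeled)                     # retrain on the curated subset
    return detector, labeled
```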

Keywords: Computer vision, deep learning, object detection, semiconductor.

4332 Wavelet Enhanced CCA for Minimization of Ocular and Muscle Artifacts in EEG

Authors: B. S. Raghavendra, D. Narayana Dutt

Abstract:

Electroencephalogram (EEG) recordings are often contaminated with ocular and muscle artifacts. In this paper, canonical correlation analysis (CCA) is used as a blind source separation (BSS) technique (BSS-CCA) to decompose the artifact-contaminated EEG into component signals. We combine the BSS-CCA technique with a wavelet filtering approach to minimize both ocular and muscle artifacts simultaneously, and refer to the proposed method as wavelet-enhanced BSS-CCA. In this approach, after careful visual inspection, the muscle artifact components are discarded and the ocular artifact components are subjected to wavelet filtering to retain high-frequency cerebral information, and then the clean EEG is reconstructed. The performance of the proposed wavelet-enhanced BSS-CCA method is tested on real EEG recordings contaminated with ocular and muscle artifacts, for which power spectral density is used as a quantitative measure. Our results suggest that the proposed hybrid approach minimizes ocular and muscle artifacts effectively, while minimally affecting the underlying cerebral activity in EEG recordings.
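
As a structural illustration of the pipeline (not the authors' implementation or thresholds), the sketch below runs CCA between the EEG and a one-sample-delayed copy of itself, zeroes the visually identified muscle components, wavelet-filters the ocular components, and reconstructs the channels; the wavelet, decomposition level and threshold rule are illustrative choices.

```python
import numpy as np
import pywt
from sklearn.cross_decomposition import CCA

def wavelet_enhanced_bss_cca(eeg, ocular_idx, muscle_idx, wavelet="sym4", level=5, k=3.0):
    """eeg: array (n_samples, n_channels). Artifact-component indices are assumed to come
    from visual inspection, as in the paper; all other choices here are illustrative."""
    n_samples, n_channels = eeg.shape
    x, y = eeg[1:], eeg[:-1]                               # signal and one-sample-delayed copy
    cca = CCA(n_components=n_channels, scale=False).fit(x, y)
    sources = (x - x.mean(axis=0)) @ cca.x_rotations_      # CCA component time courses

    for i in range(n_channels):
        if i in muscle_idx:
            sources[:, i] = 0.0                            # discard muscle components entirely
        elif i in ocular_idx:
            coeffs = pywt.wavedec(sources[:, i], wavelet, level=level)
            sigma = np.std(sources[:, i])
            # Zero high-amplitude coefficients (artifact); keep low-amplitude cerebral residue.
            coeffs = [np.where(np.abs(c) > k * sigma, 0.0, c) for c in coeffs]
            sources[:, i] = pywt.waverec(coeffs, wavelet)[: n_samples - 1]

    return sources @ np.linalg.pinv(cca.x_rotations_) + x.mean(axis=0)   # cleaned EEG
```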

Keywords: Blind source separation, Canonical correlation analysis, Electroencephalogram, Muscle artifact, Ocular artifact, Power spectrum, Wavelet threshold.

4331 Hexavalent Chromium Pollution Abatement by use of Scrap Iron

Authors: Marius Gheju, Laura Cocheci

Abstract:

In this study, the reduction of Cr(VI) by scrap iron, a cheap and locally available industrial waste, was investigated in a continuous system. The greater scrap iron efficiency observed for the first two sections of the column filling indicates that most of the reduction process was carried out in the bottom half of the column filling. This was ascribed to a constant decrease of Cr(VI) concentration inside the filling as the water front passes from the bottom to the top end of the column. While the bottom section of the column filling was heavily passivated with secondary mineral phases, the top section was less affected by the passivation process; therefore, the column filling would likely ensure the reduction of Cr(VI) for time periods longer than 216 hours. The experimental results indicate that fixed-bed columns packed with scrap iron could be successfully used for the first step of Cr(VI)-polluted wastewater treatment. However, the mass of the scrap iron filling should be carefully estimated, since it significantly affects the Cr(VI) reduction efficiency.

Keywords: hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment

4330 Zero Carbon & Low Energy Housing; Comparative Analysis of Two Persian Vernacular Architectural Solutions to Increase Energy Efficiency

Authors: N. Poorang

Abstract:

In order to respond to human needs, all regional, social, and economic factors have to be brought together to provide residents' comfort and an ideal architecture. There is no doubt that thermal comfort has to satisfy people not only in their daily and physical activities but also by creating a pleasant environment for mental activities and relaxation. Achieving this comfort costs energy and increases greenhouse gas emissions.

Reducing energy use in buildings is a critical component of meeting carbon reduction commitments. Hence housing design represents a major opportunity to cut energy use and CO2 emissions.

In terms of energy efficiency, it is vital to propose and research modern design methods for buildings; however, vernacular architecture techniques are proven, empirically established practices that also have to be considered. This research compares two architectural solutions proposed by Persian vernacular architecture to achieve energy efficiency in hot regions.

The aim of this research is to analyze two forms of traditional Persian architecture in different locations in order to develop a systematic research and sustainable technologies on adaptation to contemporary living standards.

Keywords: Comparative Analysis, Persian Vernacular Architecture, Sustainable architecture.

4329 A Wavelet-Based Watermarking Method Exploiting the Contrast Sensitivity Function

Authors: John N. Ellinas, Panagiotis Kenterlis

Abstract:

The efficiency of an image watermarking technique depends on the preservation of visually significant information. This is attained by embedding the watermark transparently with the maximum possible strength. The current paper presents an approach for still image digital watermarking in which the watermark embedding process employs the wavelet transform and incorporates Human Visual System (HVS) characteristics. The sensitivity of a human observer to contrast with respect to spatial frequency is described by the Contrast Sensitivity Function (CSF). The strength of the watermark within the decomposition subbands, which occupy an interval on the spatial frequencies, is adjusted according to this sensitivity. Moreover, the watermark embedding process is carried over the subband coefficients that lie on edges where distortions are less noticeable. The experimental evaluation of the proposed method shows very good results in terms of robustness and transparency.
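
For orientation only, the sketch below shows an additive embedding into wavelet detail subbands with a per-level strength that stands in for the CSF-derived weighting; the weights are illustrative placeholders, and the paper's edge-coefficient selection step is omitted.

```python
import numpy as np
import pywt

def embed_watermark(image, watermark_bits, wavelet="db2", level=3, seed=0, subband_weight=None):
    """Additive spread-spectrum embedding into detail subbands. The per-level weights stand in
    for CSF-derived strengths and are illustrative, not the values used in the paper; the
    edge-based coefficient selection described in the abstract is not reproduced here."""
    subband_weight = subband_weight or {1: 2.0, 2: 4.0, 3: 8.0}   # coarser levels embed stronger
    rng = np.random.default_rng(seed)
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
    # coeffs layout: [cA_L, (cH_L, cV_L, cD_L), ..., (cH_1, cV_1, cD_1)]
    payload = np.where(np.asarray(watermark_bits) > 0, 1.0, -1.0)
    for lvl in range(1, level + 1):
        idx = level - lvl + 1                                     # detail tuple for this level
        details = []
        for band in coeffs[idx]:
            chips = rng.choice([-1.0, 1.0], size=band.shape)      # pseudo-random carrier
            details.append(band + subband_weight[lvl] * chips * np.resize(payload, band.shape))
        coeffs[idx] = tuple(details)
    return pywt.waverec2(coeffs, wavelet)
```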

Keywords: Image watermarking, wavelet transform, human visual system, contrast sensitivity function.

4328 Informative, Inclusive and Transparent Planning Methods for Sustainable Heritage Management

Authors: Mathilde Kirkegaard

Abstract:

The paper will focus on management of heritage that integrates the local community, and will argue for an obligation to integrate this social aspect in heritage management. By broadening the understanding of heritage, a sustainable heritage management takes its departure in more than the continual conservation of the physicality of heritage. The social aspect, or the local community, is overlooked in many governed heritage management situations, and it is not managed through community-based urban planning methods, e.g. citizen inclusion, a transparent process, informative and inviting initiatives, etc. Historical sites are often described with embracing terms such as “ours” and “us”: “our history” and “a history that is part of us”. Heritage is not something static; it is a link between the life that has been lived in the historical frames and the life that is defining it today. This view on heritage is rooted in the effort to ensure that heritage sites, besides securing the national historical interest, have a value for the people who are affected by them: living in them or visiting them. Antigua Guatemala is a UNESCO-defined heritage site, and this site is being ‘threatened’ by tourism, habitation and recreation. In other words, ‘the use’ of the site is considered a threat to the preservation of the heritage. Contradictorily, the same types of use (tourism and habitation) can also be considered a development opportunity, and perhaps even a sustainable management solution. ‘The use’ of heritage is interlinked with the perspective that heritage sites ought to have a value for people today. In other words, heritage sites should be comprised of a contemporary substance. Heritage is entwined in its context of physical structures and the social layer. A synergy between the use of heritage and the knowledge about the heritage can generate a sustainable preservation solution. The paper will exemplify this symbiosis with different examples of heritage management that is centred around local community inclusion. The inclusive method is not new in architectural planning, and it refers to a balance between top-down and bottom-up decision making. It can be pursued through designs of an inclusive nature. Catalyst architecture is a planning method that strives to move the process of design solutions into the public space. Through process-oriented designs, or catalyst designs, the community can gain an insight into the process or be invited to participate in it. A balance between bottom-up and top-down in the development process of a heritage site can, in relation to management measures, be understood to generate a socially sustainable solution. The ownership and engagement that can be created among the local community, along with the use that can ultimately yield an economic benefit, can support the delegation of maintenance and preservation. Informative, inclusive and transparent planning methods can generate a heritage management that is long-term due to the collective understanding and effort. This method handles sustainable management on two levels: the current preservation necessities and the long-term management, while ensuring a value for people today.

Keywords: Community, intangible, inclusion, planning, heritage.

4327 Software Reliability Prediction Model Analysis

Authors: L. Mirtskhulava, M. Khunjgurua, N. Lomineishvili, K. Bakuria

Abstract:

Software reliability prediction gives a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
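
The exponential failure-time assumption stated at the end of the abstract corresponds to the standard relations below (a restatement, not the paper's derivation of the transmission-time DF):

```latex
% Exponentially distributed time between adjacent failures, rate \lambda:
F(t) = 1 - e^{-\lambda t}, \qquad
R(t) = 1 - F(t) = e^{-\lambda t}, \qquad
\mathrm{MTTF} = \int_0^{\infty} R(t)\, dt = \frac{1}{\lambda}.
```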

Keywords: Exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability.

4326 Visual Construction of Youth in Czechoslovak Press Photographs: 1959-1989

Authors: Jana Teplá

Abstract:

This text focuses on the visual construction of youth in press photographs in socialist Czechoslovakia. It deals with photographs in a magazine for young readers, Mladý svět, published by the Socialist Union of Youth of Czechoslovakia. The aim of this study was to develop a methodological tool for uncovering the values and the ideological messages in the strategies used in the visual construction of reality in the socialist press. Two methods of visual analysis were applied to the photographs, a quantitative content analysis and a social semiotic analysis. The social semiotic analysis focused on images representing youth in their free time. The study shows that the meaning of a socialist press photograph is a result of a struggle for ideological power between formal and informal ideologies. This struggle takes place within the process of production of the photograph and also within the process of interpretation of the photograph.

Keywords: Ideology, press photography, socialist regime, social semiotics, youth.

4325 A Comparison of Single Decision Tree, Decision Tree Forest and Group Method of Data Handling to Evaluate the Surface Roughness in Machining Process

Authors: S. Ghorbani, N. I. Polushin

Abstract:

The machinability of workpieces (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron) in the turning operation was investigated using different types of cutting tools (conventional, a cutting tool with holes in the toolholder, and a cutting tool filled with composite material) under dry conditions on a turning machine at different levels of spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev), depth of cut (0.05-0.15 mm) and tool overhang (41-65 mm). Experimentation was performed as per Taguchi's orthogonal array. To evaluate the relative importance of the factors affecting surface roughness, the single decision tree (SDT), decision tree forest (DTF) and group method of data handling (GMDH) were applied.
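
A minimal sketch of how the relative importance of the cutting parameters can be ranked with a single decision tree and a tree ensemble in scikit-learn; the ensemble stands in for the DTF, GMDH is not included, and the file and column names are assumptions rather than the paper's data layout.

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

# Assumed CSV export of the Taguchi-designed experiments; column names are hypothetical.
df = pd.read_csv("turning_experiments.csv")
X = df[["spindle_speed_rpm", "feed_rate_mm_rev", "depth_of_cut_mm", "tool_overhang_mm"]]
y = df["surface_roughness_Ra"]

sdt = DecisionTreeRegressor(random_state=0).fit(X, y)                    # single decision tree
dtf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)  # stands in for the DTF

for name, model in [("SDT", sdt), ("DTF", dtf)]:
    ranking = sorted(zip(X.columns, model.feature_importances_), key=lambda t: -t[1])
    print(name, ranking)    # relative importance of the factors affecting surface roughness
```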

Keywords: Decision tree forest, GMDH, surface roughness, Taguchi method, turning process.

4324 Biological Soil Conservation Planning by Spatial Multi-Criteria Evaluation Techniques (Case Study: Bonkuh Watershed in Iran)

Authors: Ali Akbar Jamali

Abstract:

This paper discusses the site selection process for biological soil conservation planning. It was supported by a value-focused approach and spatial multi-criteria evaluation techniques. A first set of spatial criteria was used to design a number of potential sites. Next, a new set of spatial and non-spatial criteria was employed, including natural factors and financial costs, together with the degree of suitability of the Bonkuh watershed for biological soil conservation planning, in order to recommend the most acceptable program. The whole process was facilitated by a new software tool that supports spatial multi-criteria evaluation (SMCE) in GIS software (ILWIS). The application of this tool, combined with continual public feedback, has provided an effective methodology for solving complex decision problems in biological soil conservation planning.
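
At its core, an SMCE suitability map is a weighted combination of standardized criterion rasters, optionally masked by constraints; the small NumPy sketch below illustrates that step with hypothetical criteria and weights (in the study these come from the ILWIS SMCE criteria tree).

```python
import numpy as np

def smce_suitability(criteria: dict, weights: dict, constraints=None):
    """Weighted linear combination of standardized criterion rasters (values in [0, 1]).
    Criteria names and weights here are hypothetical, not those of the Bonkuh study."""
    names = list(criteria)
    w = np.array([weights[n] for n in names], dtype=float)
    w = w / w.sum()                                    # normalize weights to sum to 1
    stack = np.stack([criteria[n] for n in names])     # shape: (n_criteria, rows, cols)
    suitability = np.tensordot(w, stack, axes=1)       # weighted sum per cell
    if constraints is not None:                        # boolean mask of excluded areas
        suitability = np.where(constraints, suitability, 0.0)
    return suitability

# Tiny example on a 2x2 grid with two standardized criteria.
slope_score = np.array([[1.0, 0.6], [0.2, 0.8]])
cover_score = np.array([[0.4, 0.9], [0.7, 0.5]])
print(smce_suitability({"slope": slope_score, "cover": cover_score},
                       {"slope": 0.6, "cover": 0.4}))
```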

Keywords: GIS, Biological soil conservation planning, Spatial multi-criteria evaluation, Iran

4323 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art data acquisition systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the average maximum rate of 1.5 GB/s of the experiment. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes of the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
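
The DAQ Debugger itself is part of the C++/Qt-based iFDAQ; the Python sketch below only illustrates the general idea of intercepting system signals and writing a report with the current stack when a problem occurs (the report path and signal set are assumptions).

```python
import datetime
import faulthandler
import signal
import sys
import traceback

REPORT_PATH = "daq_debug_report.txt"            # hypothetical report location

# Dump low-level tracebacks on hard crashes (SIGSEGV, SIGFPE, SIGABRT, ...).
_report_file = open(REPORT_PATH, "a")
faulthandler.enable(file=_report_file)

def report_and_exit(signum, frame):
    """Write a report with the received signal and the current stack, then terminate.
    This mimics only the report-on-signal idea; the real DAQ Debugger is a C++/Qt component."""
    _report_file.write(f"--- {datetime.datetime.now().isoformat()} signal {signum} ---\n")
    traceback.print_stack(frame, file=_report_file)
    _report_file.flush()
    sys.exit(1)

for sig in (signal.SIGTERM, signal.SIGINT):     # shutdown-type signals handled in Python
    signal.signal(sig, report_and_exit)
```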

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

4322 On the Development of a Homogenized Earthquake Catalogue for Northern Algeria

Authors: I. Grigoratos, R. Monteiro

Abstract:

Regions with a significant percentage of non-seismically designed buildings and limited urban planning are particularly vulnerable to natural hazards. In this context, the project ‘Improved Tools for Disaster Risk Mitigation in Algeria’ (ITERATE) aims at seismic risk mitigation in Algeria. Past earthquakes in northern Algeria caused extensive damage, e.g. the El Asnam 1980 moment magnitude (Mw) 7.1 and Boumerdes 2003 Mw 6.8 earthquakes. This paper addresses a number of proposed developments and considerations made towards a further improvement of the seismic hazard component. Specifically, an updated earthquake catalogue (up to 2018) is compiled, and new conversion equations to moment magnitude are introduced. Furthermore, a network-based method for the estimation of the spatial and temporal distribution of the minimum magnitude of completeness is applied. We found relatively large values for Mc, due to the sparse network, and a nonlinear trend between Mw and body wave (mb) or local magnitude (ML), which are the most common scales reported in the region. Lastly, the resulting b-value of the Gutenberg-Richter distribution is sensitive to the declustering method.
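
Once Mc is known, a common way to estimate the Gutenberg-Richter b-value is the Aki maximum-likelihood estimator; the sketch below uses that standard formula with a half-bin correction and a hypothetical magnitude sample, and is not necessarily the estimator used in the ITERATE project.

```python
import numpy as np

def aki_b_value(magnitudes, mc, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value for events with M >= Mc,
    with the usual half-bin correction for binned magnitudes."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - bin_width / 2.0))
    b_err = b / np.sqrt(len(m))          # rough standard error (Aki's approximation)
    return b, b_err

# Hypothetical catalogue excerpt (Mw) with completeness magnitude Mc = 4.5.
mags = [4.5, 4.6, 4.8, 5.1, 4.7, 5.4, 4.9, 6.2, 4.5, 5.0]
print(aki_b_value(mags, mc=4.5))
```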

Keywords: Conversion equation, magnitude of completeness, seismic events, seismic hazard.

4321 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and may not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts to discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, Geospatial database design, Geospatial data misuse.

4320 Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors

Authors: Mohammad Amin Safi, Mahmud Ashrafizaadeh, Amir Ali Ashrafizaadeh

Abstract:

A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques such as memory alignment and increasing the multiprocessor occupancy are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. Speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
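
For orientation, a single-component D2Q9 BGK collision-and-streaming step in NumPy is sketched below; the paper's multi-component mixture model (satisfying the indifferentiability principle and the H-theorem) and its CUDA memory-layout optimizations are not reproduced here.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights.
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1], [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)

def bgk_step(f, tau):
    """One collision + streaming step of a single-component D2Q9 BGK model, periodic domain.
    f has shape (9, nx, ny). Structural illustration only, not the paper's mixture model."""
    rho = f.sum(axis=0)
    u = np.tensordot(E.T, f, axes=1) / rho             # macroscopic velocity, shape (2, nx, ny)
    eu = np.tensordot(E, u, axes=1)                    # e_i . u for each direction
    usq = (u ** 2).sum(axis=0)
    feq = W[:, None, None] * rho * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * usq)
    f = f - (f - feq) / tau                            # BGK collision
    for i, (ex, ey) in enumerate(E):                   # streaming with periodic wrap-around
        f[i] = np.roll(np.roll(f[i], ex, axis=0), ey, axis=1)
    return f
```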

Keywords: Lattice Boltzmann model, Graphical processing unit, Binary mixture diffusion, 2D flow simulations, Optimized algorithm.

4319 Mathematical Models for Overall Gas Transfer Coefficient Using Different Theories and Evaluating Their Measurement Accuracy

Authors: Shashank B. Thakre, Lalit B. Bhuyar, Samir J. Deshmukh

Abstract:

Oxygen transfer, the process by which oxygen is transferred from the gaseous to the liquid phase, is a vital part of the wastewater treatment process. Because of the low solubility of oxygen and the consequent low rate of oxygen transfer, sufficient oxygen to meet the requirement of aerobic waste does not enter through the normal air-water surface interface. Many theories have been put forward to explain the mechanism of gas transfer and absorption of non-reacting gases in a liquid, of which the two-film theory is important. An existing mathematical model determines an approximate value of the overall gas transfer coefficient. The overall gas transfer coefficient in the case of penetration theory is 1.13 times that obtained in the case of two-film theory. The difference is due to the different assumptions of the two theories. The paper aims at the development of a mathematical model that determines the value of the overall gas transfer coefficient with greater accuracy than the existing model.
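
The 1.13 factor quoted above is commonly rationalized by comparing the liquid-film mass transfer coefficients of the two theories; the standard forms are restated below (a textbook relation, not the paper's new model):

```latex
% Two-film theory vs. penetration theory, liquid-side coefficient:
k_{L,\mathrm{film}} = \frac{D_L}{\delta}, \qquad
k_{L,\mathrm{pen}}  = 2\sqrt{\frac{D_L}{\pi\, t_c}}, \qquad
\frac{2}{\sqrt{\pi}} \approx 1.13 .
```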

Keywords: Theories, Dissolved oxygen, Mathematical model, Gas Transfer coefficient, Accuracy.

4318 The Effects of Shot and Grit Blasting Process Parameters on Steel Pipes Coating Adhesion

Authors: Saeed Khorasanizadeh

Abstract:

The adhesion strength of the exterior or interior coating of steel pipes is very important. Increasing coating adhesion can increase the coating lifetime and the safety factor of the transmission line pipe, and decrease the corrosion rate and costs. Steel pipe surfaces are prepared before the coating process by shot and grit blasting, which is a mechanical method. Parameters affecting this process include the abrasive particle size, distance to the surface, abrasive flow rate, abrasive physical properties and shape, the selection of abrasive, the type of machine and its power, the standard of surface cleanliness, roughness, blasting time and ambient humidity. This study intended to find conditions that improve surface preparation, coating adhesion strength and corrosion resistance. Accordingly, this paper studies the effect of varying the abrasive flow rate, abrasive particle size and blasting time on steel surface roughness, and the effect of over-blasting, using a centrifugal blasting machine. A number of steel samples (according to API 5L X52) were prepared and coated with epoxy powder, and the coating adhesion strengths were compared using the pull-off test. The results show that increasing the abrasive particle size and flow rate increases the steel surface roughness and coating adhesion strength, but increasing the blasting time leads to over-blasting, which raises the surface temperature and hardness and thereby decreases the steel surface roughness and coating adhesion strength.

Keywords: surface preparation, abrasive particles, adhesion strength

4317 Preparation of Computer Model of the Aircraft for Numerical Aeroelasticity Tests – Flutter

Authors: M. Rychlik, R. Roszak, M. Morzynski, M. Nowak, H. Hausa, K. Kotecki

Abstract:

The article presents the geometry and structure reconstruction procedure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). For the reconstruction, reverse engineering techniques and advanced surface modeling CAD tools are used. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. For acquisition, a three-dimensional structured light scanner was used. In the further sections, details of the reconstruction process are presented. The geometry reconstruction procedure transforms the measured input data (point cloud) into a three-dimensional parametric computer model (NURBS solid model) that is compatible with CAD systems. In parallel with the geometry of the aircraft, the internal structure (structural model) is extracted and modeled. In the last chapter, the evaluation of the obtained models is discussed.

Keywords: computer modeling, numerical simulation, Reverse Engineering, structural model

4316 A Fuzzy Dynamic Load Balancing Algorithm for Homogenous Distributed Systems

Authors: Ali M. Alakeel

Abstract:

Load balancing in distributed computer systems is the process of redistributing the workload among the processors in the system to improve system performance. Most previous research on using fuzzy logic for load balancing has concentrated only on utilizing fuzzy logic concepts to describe processor load and task execution length. The responsibility for the fuzzy-based load balancing process itself, however, has not been discussed, and in most reported work it is assumed to be performed in a distributed fashion by all nodes in the network. This paper proposes a new fuzzy dynamic load balancing algorithm for homogenous distributed systems. The proposed algorithm utilizes fuzzy logic in dealing with inaccurate load information, making load distribution decisions, and maintaining overall system stability. In terms of control, we propose a new approach that specifies how, when, and by which node the load balancing is implemented. Our approach is called Centralized-But-Distributed (CBD).
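
A small sketch of the fuzzy part only: triangular membership functions classify a node's normalized load as light, moderate or heavy, and a central node (the CBD idea) routes new work to the node that is "most light". The breakpoints and the single rule used here are illustrative, not the paper's rule base.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def load_memberships(load):
    """Fuzzify a normalized load in [0, 1]; breakpoints are illustrative."""
    return {
        "light":    triangular(load, -0.01, 0.0, 0.5),
        "moderate": triangular(load,  0.20, 0.5, 0.8),
        "heavy":    triangular(load,  0.50, 1.0, 1.01),
    }

def pick_receiver(node_loads):
    """Centralized decision: send new work to the node with the highest 'light' membership."""
    return max(node_loads, key=lambda n: load_memberships(node_loads[n])["light"])

print(pick_receiver({"n1": 0.82, "n2": 0.35, "n3": 0.10}))   # -> n3
```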

Keywords: Dynamic load balancing, fuzzy logic, distributed systems, algorithm.

4315 Evaluation of Internet Anxiety in SRBIAU Higher Education Students in Research Process

Authors: Nima Babazadeh Gashti, Nazanin Pilevari

Abstract:

The increased use of the internet creates some problems, one of which is "internet anxiety". Internet anxiety is a type of anxiety that people may feel while surfing the internet or using the internet for educational purposes, blogging, or accessing digital libraries. The goal of this study is to evaluate internet anxiety among management students. In this research, Ealy's internet anxiety questionnaire, consisting of positive and negative items, was completed by 310 participants. According to the findings, about 64.7% of them scored at or below the mean anxiety score (50). The distribution of internet anxiety scores was normal and there was no meaningful difference between men's and women's anxiety levels in this sample. Results also showed that there is no meaningful difference in internet anxiety level between different fields of study in management. This evaluation will help managers to perform a gap analysis between the existing level and the desired one. Future work would be to provide techniques for reducing anxiety while using the internet via human-computer interaction techniques.

Keywords: Internet, anxiety, research process, internet identification, human computer interaction.

4314 Migration from Commercial to in-House Developed Learning Management Systems

Authors: Lejla A. Bexheti, Visar S. Shehu, Adrian A. Besimi

Abstract:

Learning Management Systems (LMS) provide a learning environment that offers a collection of e-learning tools in a package with a common interface and information sharing among the tools. South East European University's initial LMS experience was with the commercial LMS ANGEL. After three years of using ANGEL, it was decided, because the expenses were very high, to develop our own software. As part of the research project team for the in-house design and development of the new LMS, we primarily had to select the features that would cover our needs and also comply with current trends in software development, and then design and develop the system. In this paper we present the process of in-house LMS development for South East European University, its architecture, conception and strengths, with a special accent on the process of migration and integration with other enterprise applications.

Keywords: e-learning tools, LMS, migration, user feedback.

4313 Survey to Assess the Feasibility of Executing the Web-Based Collaboration Process Using WBCS

Authors: Mohamed A. Sullabi

Abstract:

The importance of formal specification in the software life cycle is hardly concealed from anyone. Formal specifications use mathematical notation to describe the properties of an information system precisely, without unduly constraining the way in which these properties are achieved. Producing a correct, high-quality software specification is not an easy task. This study is concerned with how a group of rectifiers can communicate with each other and work together to prepare and produce a correct formal software specification. WBCS has been implemented based mainly on the proposed supported cooperative work model and on a survey of existing web-based collaborative writing tools. This paper aims to assess the feasibility of executing the web-based collaboration process using WBCS. The purpose of conducting this test is to test the system as a whole for functionality and fitness for use, based on the evaluation test plan.

Keywords: Formal methods, Formal specifications, collaborative writing, Usability testing.

4312 Off-State Leakage Power Reduction by Automatic Monitoring and Control System

Authors: S. Abdollahi Pour, M. Saneei

Abstract:

This paper proposes a new circuit design that monitors the total leakage current during standby mode and generates the optimal reverse body bias voltage, using the adaptive body bias (ABB) technique to compensate for die-to-die parameter variations. Design details of the power monitor are examined using a simulation framework in 65 nm and 32 nm BTPM-model CMOS processes. Experimental results show that the overhead of the proposed circuit, in terms of power consumption, is about 10 μW for the 32 nm technology and about 12 μW for the 65 nm technology at the same power supply voltage as the core supply. Moreover, the results show that the proposed circuit design is not very sensitive to temperature variations or to process variations. In addition, it uses simple blocks that offer good sensitivity, high speed, and a continuous feedback loop.

Keywords: leakage current, leakage power monitor, body biasing, low power

4311 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupant in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.
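
As one concrete step of such a KDD pipeline, the sketch below handles missing values in a sled-test parameter table with pandas; the column names, the target variable and the median-imputation rule are assumptions for illustration, not the validated procedure described in the paper.

```python
import pandas as pd

def prepare_sled_test_table(df: pd.DataFrame, target: str = "HIC15") -> pd.DataFrame:
    """Illustrative missing-data step of a KDD pipeline on sled-test parameters.
    The target name and the imputation rule are assumptions, not the paper's procedure."""
    df = df.dropna(subset=[target])                          # keep tests with a measured target
    numeric = df.select_dtypes(include="number").columns
    df[numeric] = df[numeric].fillna(df[numeric].median())   # impute remaining numeric gaps
    return df

# Example usage on an assumed CSV export of the crash/dummy-protocol data:
# cleaned = prepare_sled_test_table(pd.read_csv("sled_tests.csv"))
```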

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

4310 Study of Solid Waste Landfill Suitability using Regional Screening Method and AHP in Rasht City

Authors: S. M. Monavari, P. Hoasami, S. Tajziehchi, N. Khorramichokami.

Abstract:

Burying solid waste under the ground is one of the waste disposal methods, and landfilling is used as an ultimate method in fast-growing cities like Rasht in Iran. Some municipalities select solid waste landfill sites without feasibility studies, programming, design and management plans. Therefore, several social and environmental impacts are created by these sites. In this study, the suitability of the solid waste landfill in Rasht, capital of Gilan Province, is reviewed using the Regional Screening Method (RSM), a Geographic Information System (GIS) and the Analytical Hierarchy Process (AHP). The results indicated that, according to the suitability maps, the study site is mid-suitable to suitable based on RSM and mid-suitable based on AHP.
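
The AHP weighting step used in such suitability studies can be sketched as follows: a priority vector is derived from a pairwise comparison matrix (geometric-mean approximation) and checked with Saaty's consistency ratio. The example matrix and criteria are hypothetical, not the ones used for Rasht.

```python
import numpy as np

# Saaty's Random Index values for the consistency ratio, indexed by matrix size n.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise: np.ndarray):
    """Priority vector via the geometric-mean (row) approximation plus the consistency ratio."""
    n = pairwise.shape[0]
    gm = pairwise.prod(axis=1) ** (1.0 / n)
    w = gm / gm.sum()
    lam_max = (pairwise @ w / w).mean()      # approximation of the principal eigenvalue
    ci = (lam_max - n) / (n - 1)
    return w, ci / RI[n]

# Hypothetical 3-criterion comparison (e.g. slope vs. land cover vs. distance to rivers).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))          # a CR below ~0.1 is conventionally acceptable
```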

Keywords: Analytical Hierarchy Process (AHP), Geographic Information System (GIS), Rasht City, Regional Screening Method (RSM), Solid Waste Landfill

4309 An Improvement of Flow Forming Process for Pressure Vessels by Four Rollers Machine

Authors: P. Sawitri, S. Cdr. Sittha, T. Kritsana

Abstract:

Flow forming is widely used in many industries, especially in defence technology industries. Pressure vessel requirements are high precision, light weight, seamless construction and optimum strength. For large pressure vessels, flow forming with a three-roller machine has been used. In the case of long-range rocket motor cases, flow forming and welding of pressure vessels have been used for manufacturing. Due to the complexity of the welding process, the researchers developed 4-meter-long pressure vessels without weldment using a four-roller flow forming machine. The design and preparation of preform workpieces are performed. The optimization of flow forming parameters such as feed rate, spindle speed and depth of cut is discussed. The experimental results show the relation of the flow forming parameters to the quality of the flow-formed tube, and prototype pressure vessels have been made.

Keywords: Flow forming, Pressure vessel, four rollers, feed rate, spindle speed, cold work.

4308 WPRiMA Tool: Managing Risks in Web Projects

Authors: Thamer Al-Rousan, Shahida Sulaiman, Rosalina Abdul Salam

Abstract:

Risk management is an essential part of project management, and it plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as a guideline for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment) as the tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations in order to facilitate risk avoidance in a project, and to improve the prospects of early risk management.

Keywords: Architecture pattern model, risk factors, risk identification, web project, web project risk management assessment.

4307 Classification of Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach

Authors: Henry J. Wattimanela, Udjianna S. Pasaribu, Nanang T. Puspito, Sapto W. Indratno

Abstract:

The Banda Sea Collision Zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates. The zone is located in eastern Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the mean square error (MSE). Based on these results, we classify the earthquake distribution in the BSCZ with a point process approach. A chi-square test is used to determine the type of earthquake distribution in the sub-regions of the BSCZ. The data used in this research are earthquakes with a magnitude ≥ 6 SR for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to contribute to the Moluccas Province and surrounding local governments in preparing spatial planning documents related to disaster management.
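
A sketch of the rate and goodness-of-fit part: the annual rate λ of M ≥ 6 events is estimated from yearly counts, the MSE about that rate is computed, and a chi-square test checks the Poisson assumption with scipy. The yearly counts below are hypothetical, not the BMKG catalogue.

```python
import numpy as np
from scipy import stats

# Hypothetical yearly counts of M >= 6 events in one sub-region over a 50-year window.
counts = np.array([0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 2, 1, 0, 0, 1, 2, 0, 1, 0, 1,
                   0, 0, 1, 2, 1, 0, 1, 0, 0, 1, 3, 0, 1, 0, 2, 1, 0, 0, 1, 0,
                   1, 0, 2, 0, 1, 1, 0, 0, 1, 0])

lam = counts.mean()                                   # rate estimate (events per year)
mse = np.mean((counts - lam) ** 2)                    # mean square error about the rate

# Chi-square goodness of fit against Poisson(lam), pooling counts >= 3 into one bin.
observed = np.array([(counts == 0).sum(), (counts == 1).sum(),
                     (counts == 2).sum(), (counts >= 3).sum()])
probs = [stats.poisson.pmf(k, lam) for k in range(3)]
probs.append(1.0 - sum(probs))
expected = np.array(probs) * counts.size
chi2, p = stats.chisquare(observed, expected, ddof=1)  # ddof=1 because lambda was estimated
print(lam, mse, chi2, p)
```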

Keywords: Banda sea collision zone, earthquakes, mean square error, Poisson distribution, chi-square test.

4306 Does Bio-Demographic Diversity Influence Team Innovation through Participation Safety Climate and Team Reflexivity?

Authors: Maznah Abdullah, Mohammed Quaddus

Abstract:

Bio-demographic diversity, which refers to the age and gender of members in a team, has frequently been identified as influencing team innovation directly. As the theories expanded, bio-demographic diversity was suggested to influence team innovation via a psychosocial trait and an interaction process. This study examines those suggestions, in which the psychosocial trait and the interaction process were operationalized as 'participation safety climate' and 'team reflexivity', respectively. The role of team reflexivity as a mediator between participation safety climate and team innovation was also assessed. Due to the small number of teams involved in the study, data were analyzed using PLS-Graph. While the results show that only gender is significantly related to the participation safety climate, which in turn influences team reflexivity and team innovation, there is no statistical evidence that team reflexivity mediates the impact of participation safety climate on team innovation.

Keywords: Bio-demographic diversity, participation safety climate, team innovation, team reflexivity
