Search results for: Software Architectures and Design
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6370

6100 The Design Optimization for Sound Absorption Material of Multi-Layer Structure

Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Kyu Park

Abstract:

Sound-absorbing material is widely used in automotive interiors, and its sound absorption coefficient must be predicted at the design stage. Prediction is difficult, however, because the material is composed of several layers, so absorption targets are usually reached through repeated experimental tuning, which costs considerable time and money. In this paper, we propose a process for estimating the sound absorption coefficient of a multi-layer structure from the physical properties of each constituent material. Because many physical properties are involved and the equipment needed to measure them directly is expensive, the properties are instead predicted with the Foam-X software from absorption coefficient data measured in an impedance tube; that is, the properties of each material are obtained inversely from its measured absorption coefficient. These properties are then used to calculate the absorption coefficient of the multi-layer material, and since this coefficient can be computed, the layer configuration can be optimized by simulation. Finally, the calculated absorption coefficient is compared with data measured for a prototype in a scaled reverberation chamber and in impedance tubes. Applying this method to the development of multi-layer automotive interior materials reduces development effort, because the design can be optimized by simulation, saving both cost and time.
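
A minimal sketch of the kind of calculation described above (not code from the paper): the normal-incidence absorption coefficient of a rigidly backed layered absorber computed with the classical transfer-matrix method, assuming each layer is already characterized by an equivalent complex characteristic impedance and wavenumber (for example, obtained by Foam-X-style inverse characterization). The layer data in the usage comment are placeholders.

```python
import numpy as np

RHO0, C0 = 1.21, 343.0      # air density [kg/m^3] and speed of sound [m/s]
Z0 = RHO0 * C0              # characteristic impedance of air

def absorption_coefficient(layers):
    """Normal-incidence absorption of a rigidly backed multi-layer absorber.

    layers: list of (Zc, k, d) tuples, outermost layer first, where Zc is the
    complex characteristic impedance, k the complex wavenumber and d the
    thickness of the layer (e.g. identified inversely from impedance-tube data).
    """
    T = np.eye(2, dtype=complex)
    for Zc, k, d in layers:
        kd = k * d
        layer_matrix = np.array([[np.cos(kd), 1j * Zc * np.sin(kd)],
                                 [1j * np.sin(kd) / Zc, np.cos(kd)]])
        T = T @ layer_matrix
    Zs = T[0, 0] / T[1, 0]          # surface impedance for a rigid backing
    R = (Zs - Z0) / (Zs + Z0)       # pressure reflection coefficient
    return 1.0 - abs(R) ** 2        # absorption coefficient alpha

# Usage with hypothetical layer data (outer felt over foam):
#   alpha = absorption_coefficient([(Zc_felt, k_felt, 0.010),
#                                   (Zc_foam, k_foam, 0.020)])
```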

Keywords: Optimization design, multi-layer nonwoven, sound absorption coefficient, scaled reverberation chamber, impedance tubes.

6099 Extensions to Some AOSE Methodologies

Authors: Louay M. Jeroudaih, Mohamed S. Hajji

Abstract:

This paper looks into areas not covered by prominent Agent-Oriented Software Engineering (AOSE) methodologies. An extensive literature review led to the identification of two issues: first, most of these methodologies largely neglect the semantic web and ontologies; second, as expected, each has its own strengths and weaknesses and may focus on some phases of the development lifecycle but not all of them. The work presented here builds extensions to a highly regarded AOSE methodology (MaSE) in order to cover the areas on which this methodology does not concentrate. The extensions include introducing an ontology stage for semantic representation and integrating early requirement specification from a methodology that focuses mainly on that activity. The integration involved developing transformation rules (with the necessary handling of non-matching notions) between the two sets of representations and building software that automates the transformation. The application of this integration to a case study is also presented in the paper. The main flow of the MaSE stages was changed to smoothly accommodate the new additions.

Keywords: Agents, Intelligent Agents, Software Engineering (SE), UML, AUML, Design Patterns.

6098 Forecast Study of Indoor Acoustics. A Case Study: The Auditorium of the Theatre-Hotel "Casa Tra Noi"

Authors: D. Germanò, D. Plutino, G. Cannistraro

Abstract:

The theatre-auditorium under investigation appears to be acoustically inadequate, owing to the highly reflective characteristics of the materials used in it (marble, painted wood, smooth plaster, etc.), its architectural and structural features, and its highly multifunctional intended use (auditorium, theatre, cinema, musicals, conference room). This emerges from the analysis of the existing state carried out with the acoustic simulation software Ramsete, supported by data obtained through a campaign of acoustic measurements performed on site with a Svantek SVAN 957 sound level meter. After completing the 3D model according to the specifications required by the forecasting software, three simulations were run: one of the existing state and one for each of two design solutions. The improvement found in the first design solution, compared with the existing state, consists mainly in lowering the reverberation time towards the most desirable value, while the clarity indices, the barycentric time, the lateral efficiency, the ratio of low- to mid-frequency reverberation times (bass ratio, BR) and the speech intelligibility improve significantly. The improvement found in the second design solution, compared with the first, lies mostly in a more uniform distribution of Leq and in a further lowering of the reverberation time towards the optimum values; the clarity indices and the lateral efficiency improve further, but at the expense of a slightly worse BR, while the remaining indices vary only slightly.
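
For orientation only (the paper itself uses the Ramsete ray-tracing software), the first-order quantity being optimized above is the reverberation time, which the Sabine formula relates to room volume and total surface absorption; a minimal sketch with purely illustrative surface data:

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine reverberation time: RT60 = 0.161 * V / A,
    where A is the total equivalent absorption area (sum of S_i * alpha_i)."""
    absorption_area = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption_area

# Illustrative values only: a reflective hall before and after adding absorbing panels.
hall = [(400.0, 0.02), (250.0, 0.05), (180.0, 0.10)]   # (area m^2, alpha) per finish
treated = hall + [(120.0, 0.80)]                        # hypothetical absorbing panels
print(sabine_rt60(2500.0, hall), sabine_rt60(2500.0, treated))
```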

Keywords: Indoor acoustics, acoustic simulation.

6097 JConqurr - A Multi-Core Programming Toolkit for Java

Authors: G.A.C.P. Ganegoda, D.M.A. Samaranayake, L.S. Bandara, K.A.D.N.K. Wimalawarne

Abstract:

With the popularity of multi-core and many-core architectures, there is a great need for software frameworks that can support parallel programming methodologies. In this paper we introduce JConqurr, an Eclipse toolkit that is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java capable of supporting common parallel programming patterns, including task, data, divide-and-conquer and pipeline parallelism. The toolkit uses an annotation and directive mechanism to convert sequential code into parallel code. In addition, we propose a novel mechanism to achieve parallelism using graphics processing units (GPUs). Experiments with common parallelizable algorithms have shown that our toolkit can be easily and efficiently used to convert sequential code to parallel code, and that significant performance gains can be achieved.
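
As a rough illustration of the pattern-based conversion idea (JConqurr's actual Java annotations and directives are not reproduced here), the sketch below contrasts a sequential loop with a data-parallel equivalent built on a standard thread pool; the function names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def process(item):
    # Placeholder for an independent, side-effect-free unit of work.
    return item * item

def run_sequential(items):
    return [process(x) for x in items]

def run_data_parallel(items, workers=4):
    # Data parallelism: the same operation applied to disjoint parts of the
    # input, analogous to what a data-parallel directive would generate.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process, items))

assert run_sequential(range(10)) == run_data_parallel(range(10))
```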

Keywords: Multi-core, parallel programming patterns, GPU, Java, Eclipse plugin, toolkit.

6096 Design Optimization of the Primary Containment Building of a Pressurized Water Reactor

Authors: M. Hossain, A. H. Khan, M. A. R. Sarkar

Abstract:

The primary containment structure is one of the five safety layers of a nuclear facility and must be designed so that it can withstand the pressure and excessive radioactivity arising in accident situations. It is also necessary to minimize cost while providing the maximum possible safety, so that the design is economically feasible and attractive. This paper attempts to identify the optimum design conditions for the primary containment structure, considering both mechanical and radiation safety while keeping the economic aspects in mind. The work takes advantage of commercial simulation software to identify suitable conditions without the need for costly experiments. The generated data may be helpful for further studies.
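
The keywords point to two quick screening checks behind such an optimization, an equivalent-stress limit and exponential neutron attenuation; a minimal sketch with assumed (illustrative) material values, not data from the paper:

```python
import math

def von_mises(s1, s2, s3):
    """Equivalent (von Mises) stress from the three principal stresses [MPa]."""
    return math.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))

def neutron_transmission(thickness_m, sigma_per_m):
    """Fraction of uncollided neutrons transmitted: exp(-Sigma * x)."""
    return math.exp(-sigma_per_m * thickness_m)

# Hypothetical screening of a 1.2 m concrete wall (all limits assumed).
stress_ok = von_mises(12.0, 8.0, -2.0) < 25.0          # allowable stress in MPa
shielding_ok = neutron_transmission(1.2, 8.0) < 1e-4    # attenuation target
print(stress_ok, shielding_ok)
```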

Keywords: PWR, concrete containment, finite element approach, neutron attenuation, Von Mises Stress.

6095 Design Considerations of Scheduling Systems Suitable for PCB Manufacturing

Authors: Oscar Fernandez-Flores, Tony Speer, Rodney Day

Abstract:

This paper identifies five key design characteristics of production scheduling software systems in printed circuit board (PCB) manufacturing. The authors consider that, in addition to an effective scheduling engine, a scheduling system should be able to process a preventative maintenance calendar, to give the user the flexibility to handle data using a variety of electronic sources, to run simulations to support decision-making, and to have simple and customisable graphical user interfaces. These design considerations were the result of a review of academic literature, the evaluation of commercial applications and a compilation of requirements of a PCB manufacturer. It was found that, from those systems that were evaluated, those that effectively addressed all five characteristics outlined in this paper were the most robust of all and could be used in PCB manufacturing.

Keywords: Decision-making, ERP, PCB, scheduling.

6094 Requirement Engineering and Software Product Line Scoping Paradigm

Authors: Ahmed Mateen, Zhu Qingsheng, Faisal Shahzad

Abstract:

Requirement Engineering (RE) is the part of the software development lifecycle concerned with establishing what a software system must do. Software product line development is a comparatively new topic area within the domain of software engineering. RE also plays an important role in decision making and is ultimately helpful in creating a productive business environment for software development. Decisions are central to engineering processes and hold them together; it is argued that better decisions lead to better engineering, and achieving better decisions requires that they be understood in detail. In order to address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps in providing large varieties of products with minimum development effort and cost. This paper proposes a new framework for software product line scoping and compares it with other models. The results can help in understanding the needs in SPL testing by identifying points that still require additional investigation. In future work, we will apply this model in a controlled environment with industrial SPL projects, which will open a new horizon for SPL process management and testing strategies.

Keywords: Requirements engineering, software product lines, scoping, process structure, domain specific language.

6093 Design and Analysis of a New Dual-Band Microstrip Fractal Antenna

Authors: I. Zahraoui, J. Terhzaz, A. Errkik, El. H. Abdelmounim, A. Tajmouati, L. Abdellaoui, N. Ababssi, M. Latrach

Abstract:

This paper presents a novel design of a microstrip fractal antenna based on the Sierpinski triangle shape. The antenna is designed and simulated on an FR4 substrate for operation in the GPS and WiMAX frequency bands, and it features a modified ground structure. The proposed antenna is simulated and validated using CST Microwave Studio software; the simulated results show good performance in terms of radiation pattern and input impedance matching.
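
The multiband behaviour exploited here comes from the self-similar Sierpinski geometry; a purely geometric sketch (unrelated to the paper's CST model) of generating the metal sub-triangles for a given iteration depth:

```python
def sierpinski(vertices, depth):
    """Return the filled triangles (each a tuple of three (x, y) points)
    of a Sierpinski gasket after `depth` subdivision steps."""
    if depth == 0:
        return [vertices]
    a, b, c = vertices
    mid = lambda p, q: ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    # Keep the three corner sub-triangles; the central one is removed.
    return (sierpinski((a, ab, ca), depth - 1)
            + sierpinski((ab, b, bc), depth - 1)
            + sierpinski((ca, bc, c), depth - 1))

# Illustrative 30 mm patch outline, two iterations.
patches = sierpinski(((0.0, 0.0), (30.0, 0.0), (15.0, 26.0)), 2)
print(len(patches))  # 9 metal sub-triangles after two iterations
```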

Keywords: Dual-band antenna, Fractal antenna, GPS band, Modified ground structure, Sierpinski triangle, WiMAX band.

6092 Simulation Design of Separator for the Treatment of Emulsions

Authors: Irena Markovska, Dimitar Rusev, Nikolai Zaicev, Bogdan Bogdanov, Dimitar Georgiev, Yancho Hristov

Abstract:

A prototype model of an emulsion separator was designed and manufactured. In general terms, it is a cylinder filled with different fractal modules. The emulsion is fed into the reactor by a peristaltic pump through an inlet placed at the boundary between the two phases. For the hydrodynamic design and sizing of the reactor, the assumptions of filtration theory were used and methods to describe the separation process were developed. Based on this methodology, and using numerical methods and Autodesk software, the process was simulated in different operating modes. The basic hydrodynamic characteristics, speed and performance, were determined for different types of fractal systems, and decisions to optimize the design of the reactor were also defined.

Keywords: Fractal systems, reactor, separation, emulsions.

6091 A Cognitive Measurement of Complexity and Comprehension for Object-Oriented Code

Authors: Amit Kumar Jakhar, Kumar Rajnish

Abstract:

Inherent complexity is one of the difficult issues in the software engineering field. Further, it is often said that there are no physical laws or standard guidelines suited to designing different types of software. Hence, to turn software engineering into a mature engineering discipline like others, it needs its own theoretical frameworks and laws. Software design and development is a human effort that takes a lot of time and considers various parameters for the successful completion of the software. Cognitive informatics plays an important role in understanding the essential characteristics of software. The aim of this work is to consider two fundamental characteristics of the source code of object-oriented software: complexity and understandability. The complexity of programs is analyzed with the help of important attributes extracted from the source code, which are then used to evaluate the understandability factor. These characteristics are analyzed on the basis of 16 C++ programs distributed to forty MCA students, all of whom tried to understand the source code of the given programs; the mean time is taken as the actual time needed to understand a program. For validation of this work, Briand's framework is used, and the presented metric is also evaluated against an existing metric, which demonstrates its robustness.
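
A minimal sketch of the cognitive-weight idea underlying such measurements (generic weights taken from the cognitive informatics literature; the paper's exact metric and aggregation rule may differ): each basic control structure contributes a fixed weight, and the weights are accumulated over the code unit under analysis.

```python
# Typical cognitive weights assigned to basic control structures (BCSs);
# the exact values and aggregation rule used in the paper may differ.
COGNITIVE_WEIGHTS = {
    "sequence": 1,
    "branch": 2,      # if / else
    "case": 3,        # switch
    "iteration": 3,   # for / while
    "call": 2,        # method or function call
    "recursion": 3,
}

def cognitive_weight(bcs_counts):
    """Accumulate the cognitive weight of a code unit from its BCS counts."""
    return sum(COGNITIVE_WEIGHTS[kind] * n for kind, n in bcs_counts.items())

# Hypothetical counts extracted from one member function.
print(cognitive_weight({"sequence": 5, "branch": 2, "iteration": 1, "call": 3}))  # 18
```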

Keywords: Software metrics, object-oriented, complexity, cognitive weight, understandability, basic control structures.

6090 Towards the Use of Software Product Metrics as an Indicator for Measuring Mobile Applications Power Consumption

Authors: Ching Kin Keong, Koh Tieng Wei, Abdul Azim Abd. Ghani, Khaironi Yatim Sharif

Abstract:

Maintaining the factory-default battery endurance rate over time, while supporting a huge number of running applications on energy-restricted mobile devices, has created a new challenge for mobile application developers. While delivering on customers' unlimited expectations, developers are barely aware of the efficient use of energy by the application itself. Thus, developers need a set of valid energy consumption indicators to assist them in developing energy-saving applications. In this paper, we present several software product metrics that can be used as indicators for measuring the energy consumption of Android-based mobile applications early in the design stage. In particular, Trepn Profiler (a power profiling tool for Qualcomm processors) was used to collect mobile application power consumption data, which was then analyzed against 23 software metrics in this preliminary study. The results show that McCabe cyclomatic complexity, number of parameters, nested block depth, number of methods, weighted methods per class, number of classes, total lines of code and method lines have a direct relationship with the power consumption of a mobile application.
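
The analysis described, relating per-application metric values to measured power, reduces to a correlation computation; a minimal sketch with clearly hypothetical placeholder values (the study's measured data are not reproduced):

```python
from scipy.stats import pearsonr

def metric_power_correlation(metric_values, power_mw):
    """Pearson correlation between one software metric and measured power."""
    r, p_value = pearsonr(metric_values, power_mw)
    return r, p_value

# Illustrative placeholder values for five applications (not measured data):
cyclomatic = [12, 25, 8, 40, 18]
power_mw = [310, 420, 250, 600, 380]
print(metric_power_correlation(cyclomatic, power_mw))
```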

Keywords: Battery endurance, software metrics, mobile application, power consumption.

6089 Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics

Authors: K. K. Aggarwal, Yogesh Singh, Arvinder Kaur, Ruchika Malhotra

Abstract:

The importance of software quality is increasing, leading to the development of new, sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examines the application of ANNs for software quality prediction using object-oriented (OO) metrics. Quality estimation here means estimating the maintainability of software. The dependent variable in our study was maintenance effort, and the independent variables were the principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We therefore conclude that the ANN method is useful for constructing software quality models.
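
A minimal sketch of the modelling pipeline described, using standard scikit-learn components rather than the authors' own network configuration: the OO metrics are scaled, projected onto principal components and fed to a small neural network, and prediction quality is summarized by the mean absolute relative error.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def mare(actual, predicted):
    """Mean Absolute Relative Error used to judge the maintainability model."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs(actual - predicted) / actual))

def build_model(n_components=4):
    # Scale the eight OO metrics, project onto principal components,
    # then regress maintenance effort with a small multi-layer perceptron.
    return make_pipeline(
        StandardScaler(),
        PCA(n_components=n_components),
        MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
    )

# Usage (X: metrics matrix of shape [n_classes, 8], y: maintenance effort):
#   model = build_model(); model.fit(X_train, y_train)
#   print(mare(y_test, model.predict(X_test)))
```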

Keywords: Software quality, Measurement, Metrics, Artificial neural network, Coupling, Cohesion, Inheritance, Principal component analysis.

6088 CAD Tools for Broadband Amplifier Design

Authors: Salwa M. Salah Eldeen, Fathi A. Farag, Abd Allah M. Moselhy

Abstract:

This paper proposes a new CAD tool for microwave amplifier design. The proposed tool is based on a survey of broadband amplifier design methods, such as feedback amplifiers, balanced amplifiers and compensated matching networks. The tool is developed for broadband amplifier design using a compensated matching network, that is, an unconditionally stable amplifier. The developed program is based on analytical procedures with the ability to display Smith chart explanations, and it is implemented in C#. As a test example, the program is applied to a broadband amplifier covering the range 300-700 MHz. The results agree closely with the expected results. Finally, these methods can be extended to wideband amplifier design.
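
One building block such a tool needs is an unconditional-stability test on the transistor's S-parameters; a minimal sketch (the paper's C# implementation is not reproduced) of the classical Rollett criterion K > 1 together with |delta| < 1:

```python
def is_unconditionally_stable(s11, s12, s21, s22):
    """Rollett stability test on complex S-parameters at one frequency:
    unconditionally stable if K > 1 and |delta| < 1."""
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11) ** 2 - abs(s22) ** 2 + abs(delta) ** 2) / (2 * abs(s12 * s21))
    return k > 1 and abs(delta) < 1

# Illustrative S-parameters (not taken from a real device datasheet):
print(is_unconditionally_stable(0.5 - 0.2j, 0.05 + 0.02j, 3.0 + 1.0j, 0.4 - 0.1j))
```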

Keywords: Broadband amplifier (BBA), Compensated Matching Network, Microwave Amplifier.

6087 Estimation of Component Reusability through Reusability Metrics

Authors: Aditya Pratap Singh, Pradeep Tomar

Abstract:

Software reusability is an essential characteristic of Component-Based Software (CBS). Component reusability is an important measure for the effective reuse of components in CBS. The attributes of reusability proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for estimating the reusability of a black-box software component, along with metrics for interface complexity, understandability, customizability and reliability. An experiment for estimating reusability is performed through a case study on a sample web application using a real-world component.
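
As a heavily simplified illustration of how four factor scores might be combined into a single reusability estimate (the paper's actual formulas are not reproduced; the plain weighted sum and equal weights below are assumptions):

```python
def reusability_estimate(interface_complexity, understandability,
                         customizability, reliability,
                         weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine four factor scores, each normalized to [0, 1], into one
    reusability score. Interface complexity hurts reuse, so it is inverted.
    The equal weights are an assumption, not the paper's calibration."""
    factors = (1.0 - interface_complexity, understandability,
               customizability, reliability)
    return sum(w * f for w, f in zip(weights, factors))

# Illustrative (hypothetical) component scores:
print(round(reusability_estimate(0.3, 0.8, 0.6, 0.9), 2))  # 0.75
```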

Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.

6086 Parametric Optimization of Hospital Design

Authors: M. K. Holst, P. H. Kirkegaard, L. D. Christoffersen

Abstract:

This paper presents a parametric, performance-based design model for optimizing hospital design. The design model operates with geometric input parameters defining the functional requirements of the hospital, and with input parameters in terms of performance objectives defining the design requirements and preferences of the hospital with respect to performance. The design model takes its point of departure in the hospital functionalities, expressed as a set of defined parameters and rules describing the design requirements and preferences.

Keywords: Architectural Layout Design, Hospital Design, Parametric design, Performance-based models.

6085 A Model for Test Case Selection in the Software-Development Life Cycle

Authors: Adtha Lawanna

Abstract:

Software maintenance is one of the essential processes of the software development life cycle. The main concerns of maintaining software are correcting errors, revising code, preventing future errors, and improving performance and capacity. While such adjustments are being made, the software has to be retested to increase the level of assurance that it still satisfies its requirements. Consequently, test cases must be selected for exercising the revised modules and the whole software. This problem is addressed by regression test selection techniques such as retest-all selection, random/ad-hoc selection and safe regression test selection. In particular, the traditional techniques rely on a mapping between the test cases in a test suite and the lines of code they execute. However, lines of code are not the only requirement that can affect the size of the test suite; the number of functions and the number of faulty versions matter as well. Therefore, a model for test case selection is developed to cover these three requirements through an integrated technique that produces a smaller set of test cases than the traditional regression selection techniques.
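
A minimal sketch of a selection step built on the kind of coverage mapping described (lines, functions and faulty versions per test case); the greedy set-cover strategy and the data layout are illustrative assumptions, not the paper's integral technique:

```python
def select_test_cases(coverage):
    """Greedy selection of a small test subset.

    coverage: dict mapping test-case name -> set of covered items, where an
    item may be a line, a function, or a faulty version (treated uniformly).
    Returns a list of test cases that together cover every item.
    """
    uncovered = set().union(*coverage.values())
    selected = []
    while uncovered:
        # Pick the test that covers the most still-uncovered items.
        best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
        if not coverage[best] & uncovered:
            break
        selected.append(best)
        uncovered -= coverage[best]
    return selected

# Hypothetical coverage data mixing lines (L), functions (F) and faulty versions (V):
suite = {"t1": {"L1", "L2", "F1"}, "t2": {"L2", "L3", "V1"}, "t3": {"L4", "F2", "V1"}}
print(select_test_cases(suite))  # a small subset such as ['t1', 't3', 't2']
```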

Keywords: Software maintenance, regression test selection, test case.


6083 Fast Adjustable Threshold for Uniform Neural Network Quantization

Authors: Alexander Goncharenko, Andrey Denisov, Sergey Alyamkin, Evgeny Terentev

Abstract:

Neural network quantization is a highly desirable procedure to perform before running neural networks on mobile devices. Quantization without fine-tuning leads to an accuracy drop of the model, whereas the commonly used training with quantization is done on the full set of labeled data and is therefore both time- and resource-consuming. Real-life applications require a simplified and accelerated quantization procedure that maintains the accuracy of the full-precision neural network, especially for modern mobile neural network architectures like MobileNet-v1, MobileNet-v2 and MNAS. Here we present a method to significantly optimize the training-with-quantization procedure by introducing trained scale factors for the discretization thresholds that are separate for each filter. Using the proposed technique, we quantize the modern mobile architectures of neural networks with a training set of only ∼10% of the total ImageNet 2012 sample. Such a reduction of the training dataset size and the small number of trainable parameters allow the network to be fine-tuned within several hours while maintaining the high accuracy of the quantized model (the accuracy drop was less than 0.5%). Ready-for-use models and code are available in the GitHub repository.
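
A minimal sketch of the core operation being tuned, per-filter symmetric uniform quantization with a per-channel threshold (the quantity the paper learns during fine-tuning); this NumPy forward pass is illustrative and omits the training of the scale factors:

```python
import numpy as np

def fake_quantize_per_channel(weights, thresholds, bits=8):
    """Simulate uniform symmetric quantization of a weight tensor.

    weights:    array of shape (out_channels, ...) in float32.
    thresholds: per-output-channel clipping thresholds, shape (out_channels,).
    Returns the de-quantized ("fake-quantized") weights for the forward pass.
    """
    qmax = 2 ** (bits - 1) - 1                       # e.g. 127 for 8 bits
    t = thresholds.reshape(-1, *([1] * (weights.ndim - 1)))
    scale = t / qmax                                  # per-channel step size
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q * scale

# Illustrative 2-filter weight tensor and per-filter thresholds:
w = np.array([[0.30, -0.11, 0.02], [1.20, -0.70, 0.05]], dtype=np.float32)
print(fake_quantize_per_channel(w, thresholds=np.array([0.32, 1.25])))
```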

Keywords: Distillation, machine learning, neural networks, quantization.

6082 Toward an Architecture of a Component-Based System Supporting Separation of Non-Functional Concerns

Authors: Jerzy Nogiec, Kelley Trombly-Freytag, Shangping Ren

Abstract:

The promises of component-based technology can only be fully realized when the system contains in its design a necessary level of separation of concerns. The authors propose to focus on the concerns that emerge throughout the life cycle of the system and use them as an architectural foundation for the design of a component-based framework. The proposed model comprises a set of superimposed views of the system describing its functional and non-functional concerns. This approach is illustrated by the design of a specific framework for data analysis and data acquisition and supplemented with experiences from using the systems developed with this framework at the Fermi National Accelerator Laboratory.

Keywords: Distributed system, component-based technology, separation of concerns, software development, supervisory and control, QoS.

6081 A Metric-Set and Model Suggestion for Better Software Project Cost Estimation

Authors: Murat Ayyıldız, Oya Kalıpsız, Sırma Yavuz

Abstract:

Software project effort estimation is frequently seen as complex and expensive by individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, but its importance has been ignored, especially in neural-network-based studies. In this study, we explore the reasons for those disappointing results and implement different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that used traditional metrics. To enable comparisons, two types of data have been used: the first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is the fact that data collection requires time and care. To make more thorough use of the samples collected, the k-fold cross-validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied to software cost estimation studies with success.
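
A minimal sketch of the evaluation loop described, an MLP effort model scored with a mean relative error under k-fold cross-validation, written with standard scikit-learn components rather than the study's own implementation:

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def kfold_mmre(X, y, k=5):
    """Mean Magnitude of Relative Error of an MLP cost model under k-fold CV."""
    errors = []
    for train_idx, test_idx in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0),
        )
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        errors.extend(np.abs(y[test_idx] - pred) / y[test_idx])
    return float(np.mean(errors))

# Usage (X: project metric matrix, y: actual effort, e.g. from COCOMO'81 records):
#   print(kfold_mmre(X, y))
```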

Keywords: Software Metrics, Software Cost Estimation, Neural Network.

6080 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Authors: S. Huber, B. Schnalzer, B. Alcalde, S. Hanke, L. Mpaltadoros, T. G. Stavropoulos, S. Nikolopoulos, I. Kompatsiaris, L. Pérez-Breva, V. Rodrigo-Casares, J. Fons-Martínez, J. de Bruin

Abstract:

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate the relevant safety and efficacy data. For their success, recruitment and retention of participants are among the most challenging aspects of protocol adherence. The main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects could be addressed by a new paradigm, namely Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, such as allowing participants to join trials from home without depending on site visits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials, which involve several processes. For each part of the trial, the Building Blocks, existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated, but it is still segmented and developed in silos, as different software solutions address different parts of the trial at various levels. The current paper analyzes the availability of technology to perform RDCTs, identifying gaps and providing an overview of the Basic Building Blocks and functionalities that need to be covered to support the described processes.

Keywords: Architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability.

6079 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance behave differently from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
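
For context (not code from the paper), the Graph500 kernel being measured is essentially a breadth-first search over a large sparse graph; a minimal level-synchronous sketch whose frontier-expansion loop is the part that the OpenMP and MPI versions parallelize:

```python
def bfs_levels(adjacency, root):
    """Level-synchronous BFS returning the level (hop count) of every reached vertex.

    adjacency: dict mapping vertex -> list of neighbours.
    The loop over the current frontier is the work that shared-memory (OpenMP)
    and distributed (MPI) Graph500 implementations split across workers.
    """
    level = {root: 0}
    frontier = [root]
    depth = 0
    while frontier:
        depth += 1
        next_frontier = []
        for u in frontier:             # parallelized in real implementations
            for v in adjacency[u]:
                if v not in level:     # visited check causes irregular access
                    level[v] = depth
                    next_frontier.append(v)
        frontier = next_frontier
    return level

g = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs_levels(g, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```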

Keywords: Graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization.

6078 Categorical Data Modeling: Logistic Regression Software

Authors: Abdellatif Tchantchane

Abstract:

A MATLAB-based software package for logistic regression is developed to enhance the process of teaching quantitative topics and to assist researchers in analyzing a wide range of applications involving categorical data. The software offers the option of performing stepwise logistic regression to select the most significant predictors. It includes a feature to detect influential observations in the data and investigates the effect of dropping or misclassifying an observation on a predictor variable. The input data may consist either of a set of individual responses (yes/no) with the predictor variables, or of grouped records summarizing the various categories for each unique set of predictor variable values. Graphical displays are used to present various statistical results and to assess the goodness of fit of the logistic regression model. The software recognizes possible convergence constraints when they are present in the data, and the user is notified accordingly.
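
A minimal sketch of the stepwise option mentioned above, written with Python's statsmodels rather than the MATLAB implementation described: forward selection keeps adding the predictor whose addition is most significant until none passes the entry threshold (the threshold and the data layout are assumptions):

```python
import statsmodels.api as sm

def forward_stepwise_logit(X, y, names, p_enter=0.05):
    """Forward stepwise logistic regression: greedily add the predictor with the
    smallest p-value until no remaining predictor is significant at p_enter."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        pvals = {}
        for j in remaining:
            design = sm.add_constant(X[:, selected + [j]])
            fit = sm.Logit(y, design).fit(disp=0)
            pvals[j] = fit.pvalues[-1]      # p-value of the newly added term
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return [names[j] for j in selected]

# Usage (X: n-by-p predictor matrix, y: 0/1 responses, names: predictor labels):
#   print(forward_stepwise_logit(X, y, ["age", "dose", "group"]))
```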

Keywords: Logistic regression, Matlab, Categorical data, Influential observation.

6077 The Management of Large Emergency Situations – A Best Practice Case Study Based on GIS for the Management of Evacuation

Authors: Ion Baş, Claudiu Zoicaş, Angela Ioniţâ

Abstract:

In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was the testing of four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans Border Evacuation) was developed in-house by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, the TEVAC software has been used by all Emergency Situations Inspectorates across Romania since April 2009.

Keywords: Emergency evacuation, searching features, TEVAC (Trans Border Evacuation) software system, user interface design.

6076 Speedup of Data Vortex Network Architecture

Authors: Qimin Yang

Abstract:

In this paper, 3x3 routing nodes are proposed to provide speedup and parallel processing capability in Data Vortex network architectures. The new design not only significantly improves network throughput and latency, but also eliminates the need for the distributive traffic control mechanism originally embedded among nodes and the need for nodal buffering. The cost effectiveness is studied through a comparison with the previously proposed 2-input buffered networks, and considerable performance enhancement can be achieved at similar or lower hardware cost. Unlike the previous implementation, the network leaves a small probability of contention; therefore, the packet drop rate must be kept low for such an implementation to be feasible and attractive, and this can be achieved with a proper choice of operating conditions.

Keywords: Data Vortex, packet switch, interconnection network, deflection, network-on-chip.

6075 Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

Authors: Loke Mun Sei

Abstract:

Software testing has become a mandatory process for assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution and test reporting, are managed and automated using several test automation tools, i.e. Jira, Robot Framework, and Jenkins.

Keywords: Test automation tools, test case, test execution, test reporting.

6074 Multilevel Activation Functions For True Color Image Segmentation Using a Self Supervised Parallel Self Organizing Neural Network (PSONN) Architecture: A Comparative Study

Authors: Siddhartha Bhattacharyya, Paramartha Dutta, Ujjwal Maulik, Prashanta Kumar Nandi

Abstract:

The paper describes a self-supervised parallel self-organizing neural network (PSONN) architecture for true color image segmentation. The proposed architecture is a parallel extension of the standard single self-organizing neural network architecture (SONN) and comprises an input (source) layer of image information, three single self-organizing neural network architectures for segmentation of the different primary color components in a color image scene and one final output (sink) layer for fusion of the segmented color component images. Responses to the different shades of color components are induced in each of the three single network architectures (meant for component-level processing) by applying a multilevel version of the characteristic activation function, which maps the input color information into different shades of color components, thereby yielding a processed component color image segmented on the basis of the different shades of component colors. The number of target classes in the segmented image corresponds to the number of levels in the multilevel activation function. Since the multilevel version of the activation function exhibits several subnormal responses to the input color image scene information, the system errors of the three component network architectures are computed from some subnormal linear index of fuzziness of the component color image scenes at the individual level. Several multilevel activation functions are employed for segmentation of the input color image scene using the proposed network architecture. Results of the application of the multilevel activation functions to the PSONN architecture are reported on three real-life true color images. The results are substantiated empirically with the correlation coefficients between the segmented images and the original images.
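
A minimal sketch of the general idea of a multilevel activation, a staircase built from shifted sigmoids that maps a neuron's input onto K target class levels; this generic form is only an illustration, not the specific multilevel functions evaluated in the paper:

```python
import numpy as np

def multilevel_activation(x, thresholds, steepness=10.0):
    """Map inputs onto len(thresholds)+1 equally spaced output levels in [0, 1]
    by summing shifted sigmoids, one per threshold."""
    x = np.asarray(x, dtype=float)
    levels = len(thresholds) + 1
    steps = sum(1.0 / (1.0 + np.exp(-steepness * (x - t))) for t in thresholds)
    return steps / (levels - 1)

# Example: 4 output levels (3 thresholds) for one normalized color component.
x = np.linspace(0.0, 1.0, 5)
print(np.round(multilevel_activation(x, thresholds=[0.25, 0.5, 0.75]), 2))
```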

Keywords: Colour image segmentation, fuzzy set theory, multi-level activation functions, parallel self-organizing neural network.

6073 Simulation of the Airflow Characteristic inside a Hard Disk Drive by Applying a Computational Fluid Dynamics Software

Authors: Chanchal Saha, Huynh Trung Luong, M. H. Aziz, Tharinan Rattanalert

Abstract:

Nowadays, many simulation software packages are used all over the world to solve Computational Fluid Dynamics (CFD) problems. In the present study, a commercial CFD simulation package, STAR-CCM+, is applied to analyze the airflow characteristics inside a 2.5" hard disk drive. Each step of the software workflow is described in enough detail to obtain the output, and the data are verified against theory to justify the robustness of the simulation outcome. This study gives an insight into the accuracy level of CFD simulation software for such problems, although the outcome largely depends upon computer speed. The study also opens avenues for further research.

Keywords: Computational fluid dynamics, hard disk drive, meshing, recirculation filter, filter physics parameter.

6072 Remote Control Software for Rohde and Schwarz Instruments

Authors: Tomas Shejbal, Matej Petkov, Tomas Zalabsky, Jan Pidanic, Zdenek Nemec

Abstract:

The paper describes software for remote control and measurement, with a new graphical user interface, for Rohde & Schwarz instruments. The software allows remote control over Ethernet and supports basic and advanced functions for controlling various types of instruments, such as network and spectrum analyzers, power meters, signal generators and oscilloscopes. Standard Commands for Programmable Instruments (SCPI) and the Virtual Instrument Software Architecture (VISA) are used for remote control and setup of the instruments. The developed software is modular, with a user-friendly graphical user interface for each instrument and automatic identification of instruments.
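
A minimal sketch of the SCPI-over-VISA pattern this software builds on, shown here in Python with the pyvisa package rather than the authors' MATLAB implementation; the VISA resource address and the frequency command are placeholders:

```python
import pyvisa

def identify_and_configure(resource="TCPIP0::192.168.0.10::INSTR"):
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource(resource)
    # Automatic identification: SCPI instruments answer *IDN? with
    # manufacturer, model, serial number and firmware version.
    idn = inst.query("*IDN?")
    # Instrument-specific setup command (center frequency of a spectrum
    # analyzer is only an example; the exact SCPI tree depends on the model).
    inst.write("FREQ:CENT 1GHz")
    inst.close()
    return idn

# print(identify_and_configure())
```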

Keywords: Remote control, Rohde&Schwarz, SCPI, VISA, MATLAB, spectrum analyzer, network analyzer, oscilloscope, signal generator.

6071 The Use of Computer-Aided Design in Small Contractors in a Local Area of Korea

Authors: Myunghoun Jang

Abstract:

A survey of small-size contractors in Jeju was conducted to investigate college graduates' computer-aided design (CAD) competence. Most small-size contractors use CAD software to review and update drawings submitted by an architect. This research also analyzed the architectural engineering curricula of several national universities: the CAD classes have 4 or 6 hours per week and primarily use AutoCAD. This paper proposes that a CAD class should have 6 hours per week, that 2D drawing should be the main theme of the curriculum, and that exercises for building 3D models should also be included in the CAD class. An improved method of evaluating reports and exercise results, for example an Internet cafe format and real-time feedback using smartphones, is also necessary.

Keywords: Computer-aided design, CAD education, education improvement, small-size contractor.
