Search results for: universal software radio peripheral
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6162

5922 The 6Rs of Radiobiology in Photodynamic Therapy: Review

Authors: Kave Moloudi, Heidi Abrahamse, Blassan P. George

Abstract:

Radiotherapy (RT) and photodynamic therapy (PDT) are both forms of cancer treatment that aim to kill cancer cells while minimizing damage to healthy tissue. The similarity between RT and PDT lies in their mechanism of action. Both treatments use energy to damage cancer cells. RT uses high-energy radiation to damage the DNA of cancer cells, while PDT uses light energy to activate a photosensitizing agent, which produces reactive oxygen species (ROS) that damage the cancer cells. Both treatments require careful planning and monitoring to ensure the correct dose is delivered to the tumor while minimizing damage to surrounding healthy tissue. They are also often used in combination with other treatments, such as surgery or chemotherapy, to improve overall outcomes. However, there are also significant differences between RT and PDT. For example, RT is a non-invasive treatment that can be delivered externally or internally, while PDT requires the injection of a photosensitizing agent and the use of a specialized light source to activate it. Additionally, the side effects and risks associated with each treatment can vary. In this review, we focus on generalizing the 6Rs of radiobiology in PDT, which may open a window for the clinical application of radio-photodynamic therapy with minimal side effects. Furthermore, this review offers new insight into the design of new radio-photosensitizer agents for radio-photodynamic therapy.

Keywords: radiobiology, photodynamic therapy, radiotherapy, 6Rs in radiobiology, ROS, DNA damage, cellular and molecular mechanism, clinical application

Procedia PDF Downloads 56
5921 Influence and Dissemination of Solecism among Moroccan High School and University Students

Authors: Rachid Ed-Dali, Khalid Elasri

Abstract:

Mass media seem to provide rich content for language acquisition. Exposure to television, the Internet, mobile phones and other technological gadgets and devices enriches the student’s lexicon both positively and negatively. The difficulties students encounter while learning and acquiring second languages, in addition to their eagerness to comprehend the content of a particular program, prompt them to diversify their methods so as to achieve their targets. The present study highlights the significance of certain media channels and their involvement in language acquisition, employing the Natural Approach to further grasp whether students, especially secondary and high school students, learn and acquire errors through watching subtitled television programs. The chief objective is to investigate the deductive and inductive relevance of certain programs, alongside the involvement of peripheral learning, in the acquisition of mistakes.

Keywords: errors, mistakes, Natural Approach, peripheral learning, solecism

Procedia PDF Downloads 94
5920 Code Refactoring Using Slice-Based Cohesion Metrics and AOP

Authors: Jagannath Singh, Durga Prasad Mohapatra

Abstract:

Software refactoring is essential for maintaining software quality. It is common practice to first design the software and then code it. But after coding is completed, if the requirements change slightly or the expected output is not achieved, the code is changed. The design cannot be revised for each small code change, and over time these small changes cause the software design to decay. Software refactoring restructures the code in order to improve the design and quality of the software. In this paper, we propose an approach for performing code refactoring. We use slice-based cohesion metrics to identify the target methods that require refactoring. After identifying the target methods, we use program slicing to divide each target method into two parts. Finally, we use the concepts of aspects to adjust the code structure so that the external behaviour of the original module does not change.
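The slicing step can be illustrated with a minimal sketch. The statement names, the dependence map, and the use of a plain backward reachability walk are illustrative assumptions for the demo, not the authors' actual tooling:

```python
# Hypothetical sketch: a backward program slice over a toy
# statement-level data-dependence graph.

def backward_slice(deps, criterion):
    """Return all statements the slicing criterion depends on,
    found by walking the dependence edges transitively."""
    sliced, stack = set(), [criterion]
    while stack:
        stmt = stack.pop()
        if stmt in sliced:
            continue
        sliced.add(stmt)
        stack.extend(deps.get(stmt, []))
    return sliced

# s4 (the output) depends on s2 and s3; s3 depends on s1.
deps = {"s4": ["s2", "s3"], "s3": ["s1"]}
print(sorted(backward_slice(deps, "s4")))  # ['s1', 's2', 's3', 's4']
```

Statements inside the slice would stay in the original method, while the rest could be moved into a separate advice, which is the role aspects play in the proposed approach.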

Keywords: software refactoring, program slicing, AOP, cohesion metrics, code restructure, AspectJ

Procedia PDF Downloads 476
5919 Analysis of Advanced Modulation Format Using Gain and Loss Spectrum for Long Range Radio over Fiber System

Authors: Shaina Nagpal, Amit Gupta

Abstract:

In this work, all-optical Stimulated Brillouin Scattering (SBS) generation of a single sideband with suppressed carrier is presented to provide better efficiency. The generation of a single-sideband, enhanced-carrier-power signal using the SBS technique is further used to strengthen the down-shifted sideband and to suppress the up-shifted sideband. The generated single-sideband signals are able to work at high frequency ranges. The generated single sideband is also validated over 90 km of transmission using single-mode fiber with an acceptable bit error rate. The results for the equivalent techniques are then compared so that the most suitable technique can be chosen, and the quality factor required for optimum performance of the system is reported.

Keywords: stimulated Brillouin scattering, radio over fiber, upper side band, quality factor

Procedia PDF Downloads 203
5918 User-Perceived Quality Factors for Certification Model of Web-Based System

Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh

Abstract:

One of the most essential issues in software products is maintaining their relevancy to the dynamics of users’ requirements and expectations. Many studies have been carried out on the quality aspect of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experiences in certification exercises and case studies, conducted in collaboration with several agencies in Malaysia, the requirement for a user-based software certification approach was identified. The emergence of social network applications, new development approaches such as the agile method, and other varieties of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through Internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems with different interests and perspectives. The classifications and metrics are identified through a brainstorming approach that includes researchers, users and experts in this area. The new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints and improving the application of software certification models in the future.

Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system

Procedia PDF Downloads 375
5917 Radiation Effects in the PVDF/Graphene Oxide Nanocomposites

Authors: Juliana V. Pereira, Adriana S. M. Batista, Jefferson P. Nascimento, Clascídia A. Furtado, Luiz O. Faria

Abstract:

Exposure to ionizing radiation has been found to induce changes in poly(vinylidene fluoride) (PVDF) homopolymers. The high dose gamma irradiation process induces the formation of C=C and C=O bonds in its [CH2-CF2]n main chain. The irradiation also provokes crosslinking and chain scission. All these radio-induced defects lead to changes in the PVDF crystalline structure. As a consequence, it is common to observe a decrease in the melting temperature (TM) and melting latent heat (LM) and some changes in its ferroelectric features. We have investigated the possibility of preparing nanocomposites of PVDF with graphene oxide (GO) through the radio-induction of molecular bonds. In this work, we discuss how the gamma radiation interacts with the nanocomposite crystalline structure.

Keywords: gamma irradiation, graphene oxide, nanocomposites, PVDF

Procedia PDF Downloads 252
5916 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural phenomena. In this paper, we study the positional alteration of phrases by which the impact of a natural language proposition is affected and/or modified. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations. We derive the classification and modification of the meaning of words due to positional alteration. We present the results with reference to set-theoretic interpretations.

Keywords: appositive, computing with words, possibilistic relational universal fuzzy (PRUF), semantic sentiment analysis, set-theoretic interpretations

Procedia PDF Downloads 127
5915 Influence of Thickness on Optical Properties of ZnO Thin Films Prepared by Radio Frequency (RF) Sputtering Technique

Authors: S. Abdullahi, M. Momoh, K. U. Isah

Abstract:

Zinc oxide (ZnO) thin films of 75.5 nm and 130.5 nm were deposited at room temperature onto chemically and ultrasonically cleaned Corning glass substrates by the radio frequency sputtering technique and annealed at 150°C under a nitrogen atmosphere for 60 minutes. The optical properties of the films were ascertained by UV-VIS-NIR spectrophotometry. The influence of film thickness on the optical properties was studied, keeping other deposition parameters constant. The optical transmittance spectra reveal maximum transmittances of 81.49% and 84.26% for the two thicknesses, respectively. The band gap of the films is found to correspond to a direct allowed transition and decreases with increasing film thickness. The band gap energy (Eg) is in the range of 3.28 eV to 3.31 eV. These thin films are suitable for solar cell applications.
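A direct allowed transition band gap of this kind is commonly estimated from a Tauc plot, where (αhν)² is plotted against photon energy and extrapolated linearly to zero. The sketch below uses idealized synthetic data assuming Eg = 3.30 eV; it is not the measured spectra from this study:

```python
# Illustrative Tauc-plot band gap estimate for a direct allowed
# transition: (alpha*h*nu)^2 = A*(h*nu - Eg), so a straight-line fit
# of (alpha*h*nu)^2 against photon energy crosses zero at Eg.

def band_gap_from_tauc(energies, tauc_values):
    """Least-squares line through the linear Tauc region;
    Eg is the photon energy where the fit extrapolates to zero."""
    n = len(energies)
    mean_x = sum(energies) / n
    mean_y = sum(tauc_values) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(energies, tauc_values)) \
        / sum((x - mean_x) ** 2 for x in energies)
    intercept = mean_y - slope * mean_x
    return -intercept / slope  # x-intercept = Eg

hv = [3.35, 3.40, 3.45, 3.50]             # photon energies (eV)
tauc = [10.0 * (e - 3.30) for e in hv]    # perfectly linear toy data
print(round(band_gap_from_tauc(hv, tauc), 2))  # 3.3
```

In practice only the linear high-energy region of the measured spectrum would be fed to the fit.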

Keywords: optical constants, RF sputtering, Urbach energy, zinc oxide thin film

Procedia PDF Downloads 426
5914 New Standardized Framework for Developing Mobile Applications (Based On Real Case Studies and CMMI)

Authors: Ammar Khader Almasri

Abstract:

Software processes play a vital role in delivering a high-quality software system that meets the users’ needs. There are many software development models used by system developers, which can be categorized into two groups: traditional and new methodologies. Mobile applications, like desktop applications, need an appropriate, well-working software development process. Nevertheless, mobile applications have distinctive characteristics that limit their performance and efficiency, such as application size and mobile hardware features. This research aims to help developers use a standardized model for developing mobile applications.

Keywords: software development process, agile methods, mobile application development, traditional methods

Procedia PDF Downloads 357
5913 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been adopted widely for business purposes. For any software industry, developing reliable software is becoming a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence there is a need to develop techniques for early prediction of software defects. Due to the complexities of manual prediction, automated software defect prediction techniques have been introduced. These techniques are based on learning patterns from previous software versions and finding the defects in the current version. They have attracted researchers due to their significant impact on industrial growth by identifying bugs in software. On this basis, several studies have been carried out, but achieving desirable defect prediction performance is still a challenging task. To address this issue, here we present a machine-learning-based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented where an improved fitness function is used for better optimization of features in the data sets. Later, these features are processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented where results from the proposed GA-DT hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
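The GA feature selection stage can be sketched roughly as follows. The fitness function here is a stand-in that rewards a known-good subset; in the proposed approach it would be the accuracy of a decision tree trained on the selected features, and all constants (population size, mutation rate, the "informative" set) are illustrative assumptions:

```python
import random

# Minimal sketch of GA-based feature selection, the first stage of a
# GA-DT defect prediction pipeline. A chromosome is a bit mask over
# the feature set; one-point crossover and bit-flip mutation evolve it.

random.seed(42)
N_FEATURES = 8
GOOD = {0, 3, 5}  # toy "informative" features, assumed for the demo

def fitness(mask):
    chosen = {i for i, bit in enumerate(mask) if bit}
    # reward informative features, penalise subset size slightly
    return len(chosen & GOOD) - 0.1 * len(chosen)

def evolve(pop_size=20, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:               # bit-flip mutation
                child[random.randrange(N_FEATURES)] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print({i for i, bit in enumerate(best) if bit})  # selected feature indices
```

The selected subset would then be passed to the decision tree classifier for the final defect prediction step.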

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 303
5912 Assessing the Current State of Software Engineering and Information Technology in Ghana

Authors: David Yartel

Abstract:

Drawing on the current state of software engineering and information technology in Ghana, the study documents its significant contribution to the development of Ghanaian industries. The study focuses on the application of modern trends in technology and the barriers faced in the area of software engineering and information technology. A thorough analysis of a dozen interviews with stakeholders in software engineering and information technology reveals how modern trends in software engineering pose challenges to the industry in Ghana. Results show that to meet the expectations of modern software engineering and information technology trends, stakeholders must have skilled professionals, adequate infrastructure, and enhanced support for technology startups. Again, individuals should be encouraged to pursue a career in software engineering and information technology, as it has the propensity to increase the efficiency and effectiveness of work-related activities. This study recommends that stakeholders in the software engineering and technology industries invest enough in training more professionals by collaborating with international institutions well-versed in the area and by organizing frequent training sessions and seminars. The government should also provide funding opportunities for small businesses in the technology sector to drive creativity and development in order to bring about growth and development.

Keywords: software engineering, information technology, Ghana, development

Procedia PDF Downloads 59
5911 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict

Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez

Abstract:

This paper applies cognitive radio techniques with a pure proactive handoff model to decrease interference between the primary user (PU) and secondary user (SU), and compares it with a reactive handoff model. The study analyses the multi-criteria decision models SAW and TOPSIS, combined with three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are used to evaluate the best model: number of failed handoffs, number of handoffs, number of predictions, and number of interferences. The results present the advantages of using this type of pure proactive model to predict changes in the PU according to the selected channel and to reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in the interference reduction.
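The TOPSIS step used for channel selection can be sketched as below. The candidate-channel matrix, the criteria, and the weights are invented for illustration; each row is a channel and each column a decision criterion:

```python
import math

# Sketch of TOPSIS ranking for spectrum-handoff channel selection:
# normalise the decision matrix, weight it, and score each channel by
# its closeness to the ideal solution relative to the anti-ideal one.

def topsis(matrix, weights, benefit):
    """Return a closeness score in [0, 1] per alternative.
    benefit[j] is True when a larger value of criterion j is better
    (e.g. bandwidth), False when smaller is better (e.g. interference)."""
    ncols = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(ncols)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    cols = list(zip(*v))
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(cols)]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(cols)]
    scores = []
    for row in v:
        d_best = math.dist(row, ideal)
        d_worst = math.dist(row, anti)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# columns: available bandwidth (benefit), interference level (cost)
channels = [[6.0, 0.2], [4.0, 0.1], [5.0, 0.5]]
scores = topsis(channels, weights=[0.6, 0.4], benefit=[True, False])
best = scores.index(max(scores))
print(best)  # -> 0
```

In the proposed proactive model, a time-series predictor (AR, MA or ARMA) would forecast the PU activity feeding this decision matrix before each handoff.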

Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks

Procedia PDF Downloads 458
5910 The Evaluation Model for the Quality of Software Based on Open Source Code

Authors: Li Donghong, Peng Fuyang, Yang Guanghua, Su Xiaoyan

Abstract:

Using open source code is a popular method of software development, so how to evaluate the quality of such software becomes more important. This paper introduces an evaluation model. The model evaluates quality from four dimensions: technology, production, management, and development. Each dimension includes many indicators. The weight of each indicator can be modified according to the purpose of the evaluation. The paper also introduces a method for using the model. The evaluation result can provide good advice for evaluating or purchasing the software.
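A weighted four-dimension evaluation of this kind can be sketched as follows. All indicator names, scores and weights below are illustrative placeholders, not the paper's calibrated values:

```python
# Toy sketch of a four-dimension evaluation: each dimension holds
# weighted indicator scores, and the dimension weights can be
# modified to suit the purpose of the evaluation.

def evaluate(dimensions, dim_weights):
    """dimensions: {name: {indicator: (score, weight)}};
    returns the overall weighted quality score in [0, 1]."""
    total = 0.0
    for name, indicators in dimensions.items():
        wsum = sum(w for _, w in indicators.values())
        dim_score = sum(s * w for s, w in indicators.values()) / wsum
        total += dim_weights[name] * dim_score
    return total / sum(dim_weights.values())

dims = {
    "technology": {"code quality": (0.8, 2), "test coverage": (0.6, 1)},
    "production": {"release cadence": (0.7, 1)},
    "management": {"issue response": (0.9, 1)},
    "development": {"contributor activity": (0.5, 1)},
}
weights = {"technology": 0.4, "production": 0.2,
           "management": 0.2, "development": 0.2}
print(round(evaluate(dims, weights), 3))  # 0.713
```

Shifting weight toward, say, the management dimension changes the result, which is how the model adapts to a different evaluation purpose.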

Keywords: evaluation model, software quality, open source code, evaluation indicator

Procedia PDF Downloads 355
5909 Network Conditioning and Transfer Learning for Peripheral Nerve Segmentation in Ultrasound Images

Authors: Harold Mauricio Díaz-Vargas, Cristian Alfonso Jimenez-Castaño, David Augusto Cárdenas-Peña, Guillermo Alberto Ortiz-Gómez, Alvaro Angel Orozco-Gutierrez

Abstract:

Precise identification of the nerves is a crucial task performed by anesthesiologists for an effective Peripheral Nerve Blocking (PNB). Anesthesiologists use ultrasound imaging equipment to guide the PNB and detect nervous structures. However, visual identification of the nerves from ultrasound images is difficult, even for trained specialists, due to artifacts and low contrast. Recent advances in deep learning make neural networks a potential tool for accurate nerve segmentation systems, thus addressing the above issues from raw data. The widely used U-Net network yields pixel-by-pixel segmentation by encoding the input image and decoding the attained feature vector into a semantic image. This work proposes a conditioning approach and encoder pre-training to enhance the nerve segmentation of traditional U-Nets. Conditioning is achieved by one-hot encoding the kind of target nerve at the network input, while the pre-training considers five well-known deep networks for image classification. The proposed approach is tested on a collection of 619 US images, where the best C-UNet architecture yields an 81% Dice coefficient, outperforming the 74% of the best traditional U-Net. Results prove that pre-trained models with the conditional approach outperform their equivalent baselines by supporting the learning of new features and enriching the discriminant capability of the tested networks.
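One common way to realize this kind of input conditioning, sketched below, is to broadcast the one-hot nerve class into constant feature maps concatenated with the image channels before the encoder. The image size, channel layout and number of nerve classes are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

# Minimal sketch of one-hot conditioning at the network input: the
# target-nerve class becomes extra constant planes stacked onto the
# ultrasound image channels fed to the U-Net encoder.

def condition_input(image, nerve_class, n_classes):
    """image: (H, W, C) array; returns (H, W, C + n_classes)."""
    h, w, _ = image.shape
    onehot = np.zeros((h, w, n_classes), dtype=image.dtype)
    onehot[:, :, nerve_class] = 1.0
    return np.concatenate([image, onehot], axis=-1)

img = np.random.rand(64, 64, 1).astype(np.float32)  # toy US image
x = condition_input(img, nerve_class=2, n_classes=4)
print(x.shape)  # (64, 64, 5)
```

The encoder then sees the same image with an extra constant plane switched on per class, so a single network can specialise its features per target nerve.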

Keywords: nerve segmentation, U-Net, deep learning, ultrasound imaging, peripheral nerve blocking

Procedia PDF Downloads 76
5908 Financial Information and Collective Bargaining: Conflicting or Complementing

Authors: Humayun Murshed, Shibly Abdullah

Abstract:

The research conducted in the early seventies apparently assumed the existence of a universal decision model for union negotiators and, furthermore, tended to regard financial information as a ‘neutral’ input into a rational decision-making process. However, research in the eighties began to question the neutrality of financial information as an input in collective bargaining, viewing it rather as a potentially effective means for controlling the labour force. This later research also started challenging the simplistic assumptions, relating particularly to union objectives, which had underpinned the earlier search for universal union decision models. Despite these developments, there seems to be a dearth of studies in developing countries concerning the use of financial information in collective bargaining. This paper seeks to begin to remedy this deficiency. Utilising a case study approach based on two enterprises, one in the public sector and the other a multinational, the universal decision model is rejected, and it is argued that the decision whether or not to use financial information is a contingent one, such a contingency being largely defined by the context and environment in which both union and management negotiators work. An attempt is also made to identify the factors constraining as well as promoting the use of financial information in collective bargaining, these being regarded as unique to the organizations within which the case studies were conducted.

Keywords: collective bargaining, developing countries, disclosures, financial information

Procedia PDF Downloads 443
5907 Extending the AOP Joinpoint Model for Memory and Type Safety

Authors: Amjad Nusayr

Abstract:

Software security is a general term applied to any type of software architecture or model in which security aspects are incorporated into the architecture. These aspects are not part of the main logic of the underlying program. Software security can be achieved using a combination of approaches, including but not limited to secure software design, third-party component validation, and secure coding practices. Memory safety is one feature of software security whereby we ensure that any object in memory has a valid pointer or a reference with a valid type. Aspect-Oriented Programming (AOP) is a paradigm concerned with capturing cross-cutting concerns in code development. AOP is generally used for common cross-cutting concerns like logging and DB transaction management. In this paper, we introduce the concepts that enable AOP to be used for the purpose of memory and type safety. We also present ideas for extending AOP in software security practices.
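The flavour of a "type-safety aspect" can be approximated in Python with a decorator acting as advice woven around call joinpoints. Real AOP weavers such as AspectJ intercept joinpoints at the bytecode level; this sketch only checks keyword arguments and is purely illustrative:

```python
import functools

# Illustrative type-safety "aspect": the decorator plays the role of
# before-advice at a call joinpoint, rejecting arguments whose runtime
# type does not match the declared one.

def type_safe(**expected):
    def aspect(func):
        @functools.wraps(func)
        def advice(*args, **kwargs):
            # "before" advice: validate named keyword arguments
            for name, typ in expected.items():
                if name in kwargs and not isinstance(kwargs[name], typ):
                    raise TypeError(f"{name} must be {typ.__name__}")
            return func(*args, **kwargs)
        return advice
    return aspect

@type_safe(size=int)
def allocate(size=0):
    return bytearray(size)

buf = allocate(size=16)
print(len(buf))  # 16
try:
    allocate(size="16")
except TypeError as exc:
    print(exc)  # size must be int
```

The safety check lives entirely outside `allocate`, which is the point of treating memory and type safety as a cross-cutting concern.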

Keywords: aspect oriented programming, programming languages, software security, memory and type safety

Procedia PDF Downloads 99
5906 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques hold prime importance in order to minimize future design errors and expensive maintenance. There are many techniques proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors that, as a result, have an impact on the quality of the end product. These factors include properties of the software development process and the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources on the future elimination of design errors and costly maintenance. This technique can be brought into practical use through successful training.
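A perceptron-based predictor of this kind can be sketched on toy data. The feature names, the data, and the binary quality labels below are invented for illustration, not the paper's dataset:

```python
# Toy sketch of a perceptron quality predictor: each software module
# is a feature vector (e.g. normalised complexity, churn) and the
# label marks acceptable (1) vs poor (0) quality.

def train_perceptron(samples, labels, lr=0.1, epochs=50):
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred            # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# linearly separable toy set: low complexity and churn -> good quality
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7]]
y = [1, 1, 0, 0]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # [1, 1, 0, 0]
```

On linearly separable data the perceptron rule converges in a few epochs; real quality data would typically need a multilayer network, as the title suggests.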

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 283
5905 Development of AUTOSAR Software Components of MDPS System

Authors: Jae-Woo Kim, Kyung-Joong Lee, Hyun-Sik Ahn

Abstract:

This paper describes the development of a Motor-Driven Power Steering (MDPS) system using the Automotive Open System Architecture (AUTOSAR) methodology. The MDPS system is a new power steering technology for vehicles that can enhance driver convenience and fuel efficiency. AUTOSAR defines common standards for the implementation of embedded automotive software. Some aspects of the safety and timing requirements are analyzed. Through the AUTOSAR methodology, embedded software becomes more flexible, reusable and maintainable than ever. Hence, we first design software components (SW-Cs) for MDPS control based on AUTOSAR and then implement the SW-Cs using an authoring tool following AUTOSAR standards.

Keywords: AUTOSAR, MDPS, Simulink, software component

Procedia PDF Downloads 328
5904 Archetypes in the Rorschach Inkblots: Imparting Universal Meaning in the Face of Ambiguity

Authors: Donna L. Roberts

Abstract:

The theory of archetypes contends that themes based on universal foundational images reside in and are transmitted generationally through the collective unconscious, which is referenced throughout an individual’s experience in order to make sense of that experience. There is, then, a profound, visceral and instinctive agreement on the gestalt of these universal themes and how they apply to the human condition throughout space and time. The inherent nature of projective tests, such as the Rorschach Inkblot, necessitates that the stimulus be ambiguous and thus elicit responses that reflect the unconscious inner psyche of the respondent. As the development of the Rorschach inkblots was relatively random and serendipitous - i.e., the inkblots were not engineered to elicit a specifically defined response - it would stand to reason that without a collective unconscious, every individual would interpret the inkblots in an individualized and unique way. Yet this is not the case. Instead, common themes appear in the images of the inkblots and their interpretation that reflect this deeper iconic understanding. This study analyzed the ten Rorschach inkblots in terms of Jungian archetypes, both with respect to the form of images on each plate and the commonly observed themes in responses. Examples of the archetypes were compared to each of the inkblots, with subsequent descriptions matched to the standard responses. The findings yielded clear and distinct instances of the universal symbolism intrinsic in the inkblot images as well as ubiquitous throughout the responses. This project illustrates the influence of the theories of psychologist Carl Gustav Jung on the interpretation of ambiguous stimuli. It further serves to demonstrate the merit of Jungian psychology as a valuable tool with which to understand the nature of projective tests in general, Rorschach’s work specifically, and ultimately the broader implications for our collective unconscious and common humanity.

Keywords: archetypes, inkblots, projective tests, Rorschach

Procedia PDF Downloads 82
5903 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric

Authors: T. Francis Thamburaj, A. Aloysius

Abstract:

Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependency which may impact software quality, resulting in a higher cost factor for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called the Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the software’s structural complexity, it includes cognitive complexity on the basis of polymorphism type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and a correlation test has proved that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu’s Polymorphism Factor (PF) complexity metric.
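The relationship between PF and a cognitively weighted variant can be illustrated with toy class data. Abreu's PF divides the number of overriding methods by the maximum possible overrides (new methods times descendant count); the per-kind cognitive weights and the hierarchy below are invented for the demo, not the paper's calibrated values:

```python
# Illustrative computation of Abreu's Polymorphism Factor (PF) and a
# cognitively weighted variant in the spirit of CWPF.

def polymorphism_factor(classes):
    """classes: list of dicts with 'overridden', 'new_methods',
    'descendants' counts per class."""
    overrides = sum(c["overridden"] for c in classes)
    possible = sum(c["new_methods"] * c["descendants"] for c in classes)
    return overrides / possible if possible else 0.0

def cognitive_weighted_pf(classes, weights):
    """Weight each override by the cognitive load of its kind of
    polymorphism (hypothetical weights for the demo)."""
    weighted = sum(weights[c["kind"]] * c["overridden"] for c in classes)
    possible = sum(c["new_methods"] * c["descendants"] for c in classes)
    return weighted / possible if possible else 0.0

hierarchy = [
    {"overridden": 2, "new_methods": 4, "descendants": 2, "kind": "override"},
    {"overridden": 1, "new_methods": 3, "descendants": 1, "kind": "overload"},
]
weights = {"override": 1.2, "overload": 1.0}  # assumed calibration
print(round(polymorphism_factor(hierarchy), 3))           # 0.273
print(round(cognitive_weighted_pf(hierarchy, weights), 3))  # 0.309
```

The weighted value exceeds plain PF whenever the dominant kind of polymorphism carries a cognitive weight above one, which is the effect the metric is designed to capture.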

Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics

Procedia PDF Downloads 411
5902 Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet

Authors: Michael K. Adu, Boniface K. Alese, Olumide S. Ogunnusi

Abstract:

This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content over the Internet. It proposes a system whereby the products come on a compact disk with a mobile agent as a deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent, which is part of the software product on the compact disk. The validity of the activation code is checked on connection at the developer’s end to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk with the mobile agent application. Downloading software content from the developer’s database, as in the traditional method, requires a continuously open connection between the client and the developer’s end, which is not always economically or technically feasible over a fixed network. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously. It can reconnect later after completing its task and return to deliver its result. Response time and network load are minimal with the mobile agent application.
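One plausible shape for the activation-code check, sketched below, is a keyed hash: the code shipped with the product is an HMAC of the product ID under a key only the developer holds, so the server can verify authenticity on connection. The scheme, the key, and the product ID are illustrative assumptions, not the authors' actual protocol:

```python
import hashlib
import hmac

# Hypothetical sketch of the server-side activation-code check the
# mobile agent triggers before installation is allowed.

SERVER_KEY = b"developer-secret"  # held only at the developer's end

def issue_code(product_id: str) -> str:
    """Code printed on/shipped with the compact disk."""
    return hmac.new(SERVER_KEY, product_id.encode(),
                    hashlib.sha256).hexdigest()

def verify_code(product_id: str, code: str) -> bool:
    """Constant-time check performed when the agent connects."""
    return hmac.compare_digest(issue_code(product_id), code)

code = issue_code("PROD-001")
print(verify_code("PROD-001", code))   # True
print(verify_code("PROD-001", "bad"))  # False
```

Because verification needs the server key, a copied disk without a valid code cannot self-activate, which is the anti-piracy property the system relies on.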

Keywords: software products, software developer, internet, activation code, mobile agent

Procedia PDF Downloads 267
5901 Effects of Handgrip Isometric Training in Blood Pressure of Patients with Peripheral Artery Disease

Authors: Raphael M. Ritti-Dias, Marilia A. Correia, Wagner J. R. Domingues, Aline C. Palmeira, Paulo Longano, Nelson Wolosker, Lauro C. Vianna, Gabriel G. Cucato

Abstract:

Patients with peripheral arterial disease (PAD) have a high prevalence of hypertension, which contributes to a high risk of acute cardiovascular events and cardiovascular mortality. Strategies to reduce the cardiovascular risk of these patients are needed. Meta-analyses have shown that isometric handgrip training promotes reductions in clinical blood pressure in normotensive, pre-hypertensive and hypertensive individuals. However, the effect of this exercise training on other cardiovascular function indicators in PAD patients remains unknown. Thus, the aim of this study was to analyze the effects of isometric handgrip training on blood pressure in patients with PAD. In this clinical trial, 28 patients were randomly allocated into two groups: isometric handgrip training (HG) and control (CG). The HG conducted unilateral handgrip training three days per week (four sets of two minutes, at 30% of maximum voluntary contraction, with an interval of four minutes between sets). The CG was encouraged to increase their physical activity levels. At baseline and after eight weeks, blood pressure and heart rate were obtained. A two-way ANOVA for repeated measures with group (HG and CG) and time (pre- and post-intervention) as factors was performed. After eight weeks of training there were no significant changes in systolic blood pressure (HG pre 141 ± 24.0 mmHg vs. HG post 142 ± 22.0 mmHg; CG pre 140 ± 22.1 mmHg vs. CG post 146 ± 16.2 mmHg; P=0.18), diastolic blood pressure (HG pre 74 ± 10.4 mmHg vs. HG post 74 ± 11.9 mmHg; CG pre 72 ± 6.9 mmHg vs. CG post 74 ± 8.0 mmHg; P=0.22) or heart rate (HG pre 61 ± 10.5 bpm vs. HG post 62 ± 8.0 bpm; CG pre 64 ± 11.8 bpm vs. CG post 65 ± 13.6 bpm; P=0.81). In conclusion, our preliminary data indicate that isometric handgrip training did not modify blood pressure or heart rate in patients with PAD.

Keywords: blood pressure, exercise, isometric, peripheral artery disease

Procedia PDF Downloads 308
5900 Development on the Modeling Driven Architecture

Authors: Sahar Shahsavaripour Ghazanfarpour

Abstract:

As our daily lives depend on the quality of services provided by software systems and the devices in our environment, education about and modeling of software quality are increasingly important. With the daily growth of software systems and their heavy use, evaluating the development process and requirements at an early stage of progress, especially at the architecture level, becomes more important. Model-driven architecture transforms a platform-independent model into a set of platform-specific models, with the aim of reducing the number of software changes needed to reach an executable model. The software engineering design process is thus semi-automated. The quality attributes needed when designing the architecture, and the quality attributes of its representation, reside in the architecture models. The main problem is the relationship between the needs and the elements, which in some respects involves implicit models and input sources in the process, because there is no detection ability. The MARTE profile is used to describe real-time properties and perform platform modeling.

Keywords: MDA, DW, OMG, UML, AKB, software architecture, ontology, evaluation

Procedia PDF Downloads 461
5899 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements, and it helps reduce software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to get an overview of the existing system along with expectations from the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students at a tertiary-education institution in Nigeria.

Keywords: client/customer, problem statement, requirements engineering, software developers

Procedia PDF Downloads 374
5898 Requirement Analysis for Emergency Management Software

Authors: Tomáš Ludík, Jiří Barta, Sabina Chytilová, Josef Navrátil

Abstract:

Emergency management is the discipline of dealing with and avoiding risks. Appropriate emergency management software allows better management of these risks and has a direct influence on reducing potential negative impacts. Although there are several emergency management software products in the Czech Republic, they cover user requirements from the emergency management field only partially. Therefore, the paper focuses on requirement analysis within the development of emergency management software. An analysis of the current state describes the basic features and properties of user requirements for software development, as well as basic methods and approaches for gathering these requirements. The paper then presents more specific mechanisms for requirement analysis based on the chosen software development approach: structured, object-oriented or agile. Based on this experience, a new methodology for requirement analysis is designed. The methodology describes how to map user requirements comprehensively in the field of emergency management and thus reduce misunderstanding between the software analyst and the emergency manager. The proposed methodology was consulted with a fire brigade department and has also been applied in the requirement analysis for its current emergency management software. The methodology is general in character and can also be used in other specific areas during requirement analysis.

Keywords: emergency software, methodology, requirement analysis, stakeholders, use case diagram, user stories

Procedia PDF Downloads 510
5897 Hybrid Approach for the Min-Interference Frequency Assignment

Authors: F. Debbat, F. T. Bendimerad

Abstract:

Efficient frequency assignment for radio communications becomes more and more crucial as new information technologies and their applications develop. It consists in defining an assignment of frequencies to radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference; however, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. This problem is NP-hard and cannot be solved by conventional optimization algorithms, so it is necessary to use metaheuristic methods. This paper proposes a hybrid approach based on simulated annealing (SA) and tabu search (TS) to solve the problem. Computational results, obtained on a number of standard problem instances, testify to the effectiveness of the proposed approach.
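As a rough illustration of the hybrid idea (the toy instance, cost function and parameters below are our own assumptions, not the authors' formulation), a simulated-annealing search can be combined with a tabu list that forbids undoing recent moves:

```python
# Illustrative sketch only: a minimal SA + tabu hybrid for a toy
# min-interference frequency assignment instance.
import math
import random
from collections import deque

random.seed(1)

N_LINKS, N_FREQS, MIN_SEP = 12, 6, 2
# Interference graph: linked pairs must keep their frequencies
# at least MIN_SEP channels apart.
edges = [(i, j) for i in range(N_LINKS) for j in range(i + 1, N_LINKS)
         if random.random() < 0.3]

def cost(assign):
    """Number of violated separation constraints."""
    return sum(1 for i, j in edges if abs(assign[i] - assign[j]) < MIN_SEP)

assign = [random.randrange(N_FREQS) for _ in range(N_LINKS)]
best, best_cost = assign[:], cost(assign)
temperature = 5.0
tabu = deque(maxlen=20)              # short-term memory of recent moves

for step in range(2000):
    link, freq = random.randrange(N_LINKS), random.randrange(N_FREQS)
    if (link, freq) in tabu:
        continue                     # tabu: do not revisit a recent state
    candidate = assign[:]
    candidate[link] = freq
    delta = cost(candidate) - cost(assign)
    # SA acceptance: always take improvements, sometimes accept worsenings.
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        tabu.append((link, assign[link]))   # forbid undoing this move
        assign = candidate
        if cost(assign) < best_cost:
            best, best_cost = assign[:], cost(assign)
    temperature *= 0.995             # geometric cooling schedule

print("remaining violations:", best_cost)
```

The tabu list here plays the diversification role: it prevents the annealing walk from cycling back to recently visited assignments while the temperature is still high.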

Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing

Procedia PDF Downloads 355
5896 A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation

Authors: Adriano Bessa Albuquerque, Francisco Jose Barreto Nunes

Abstract:

Software vulnerabilities are increasing and not only impact the availability of services and processes, as well as information confidentiality, integrity and privacy, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, together with the lack of real case studies of applying tests throughout the software development life cycle, compromises their effective use. This paper offers an overview of how a systematic mapping study (MS) about security verification, validation and test (VVT) was performed, and presents general results of this study.

Keywords: software test, software security verification validation and test, security test institutionalization, systematic mapping study

Procedia PDF Downloads 355
5895 TRAC: A Software Based New Track Circuit for Traffic Regulation

Authors: Jérôme de Reffye, Marc Antoni

Abstract:

Following the development of the ERTMS system, we think it is interesting to develop another software-based track circuit system which would fit secondary railway lines with an easy implementation and a low sensitivity to rail-wheel impedance variations. We called this track circuit 'Track Railway by Automatic Circuits.' To be internationally implemented, this system must not have any mechanical component and must be compatible with existing track circuit systems. For example, the system is independent from the French 'Joints Isolants Collés' that isolate track sections from one another, and it is equally independent from the component used in Germany called 'Counting Axles' (in French, 'compteur d'essieux'). This track circuit is fully interoperable. Such universality is obtained by replacing the mechanical train detection system with a space-time filtering of the train position. The various track sections are defined by the frequency of a continuous signal. The set of frequencies related to the track sections is a set of orthogonal functions in a Hilbert space. Thus the failure probability of track-section separation is precisely calculated on the basis of the signal-to-noise ratio (SNR). The SNR is a function of the level of traction current conducted by the rails. This is why we developed a very powerful algorithm to reject noise and jamming, in order to obtain an SNR compatible with the precision required for the track circuit and with SIL 4. The SIL 4 level is thus reachable by an adjustment of the set of orthogonal functions. Our major contributions to railway signalling engineering are: i) train space localization precisely defined by a calibration system; the operation bypasses the GSM-R radio system of ERTMS, the track circuit is naturally protected against radio-type jammers, and after the calibration operation the track circuit is autonomous; ii) a mathematical topology adapted to train space localization, following the train through a linear time filtering of the received signal; track sections are numerically defined and can be modified with a software update. The system was numerically simulated, and the results were beyond our expectations: we achieved a precision of one meter, and the rail-ground and rail-wheel impedance sensitivity analyses gave excellent results. The results are now complete and ready to be published. This work was initiated as a research project of the French Railways, developed by the Pi-Ramses Company under SNCF contract, and required five years to obtain the results. This track circuit already reaches Level 3 of the ERTMS system and will be much cheaper to implement and operate. Traffic regulation is based on variable-length track sections: as traffic grows, the maximum speed is reduced and the track-section lengths decrease. This is possible if the elementary track section is correctly defined for the minimum speed and if every track section is able to emit at variable frequencies.
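The orthogonal-frequency idea behind the section separation can be illustrated with a small numerical sketch (hypothetical frequencies and a simple additive-noise model, not the TRAC implementation): sinusoids at distinct integer frequencies over one observation window form an orthogonal set, so a noisy received signal can be matched to its track section by correlation.

```python
# Hedged sketch: identify a track section by correlating the received
# signal against an orthogonal set of reference sinusoids.
import numpy as np

rng = np.random.default_rng(7)
FS, T = 8000, 1.0                      # sample rate (Hz), window (s)
t = np.arange(int(FS * T)) / FS
section_freqs = [100, 200, 300, 400]   # one frequency per track section
basis = np.array([np.sin(2 * np.pi * f * t) for f in section_freqs])

true_section = 2
# Received signal: the section's sinusoid buried in strong noise.
received = basis[true_section] + rng.normal(0, 2.0, t.size)

# Orthogonality means only the matching reference yields a large
# inner product; integration over the window averages the noise out.
scores = np.abs(basis @ received)
detected = int(np.argmax(scores))
print("detected section:", detected)
```

Because each reference completes an integer number of cycles in the window, the inner products between distinct references vanish, which is the property that lets the failure probability of section separation be expressed directly in terms of the SNR.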

Keywords: track section, track circuits, space-time crossing, adaptive track section, automatic railway signalling

Procedia PDF Downloads 307
5894 A Study on the Implementation of Differentiating Instruction Based on Universal Design for Learning

Authors: Yong Wook Kim

Abstract:

The diversity of students in regular classrooms is increasing due to the expansion of inclusive education and the growing number of multicultural students in South Korea. In this diverse classroom environment, universal design for learning (UDL) has been proposed as a way to meet both the educational need for and the social expectation of student achievement. UDL offers a variety of practical teaching methods, one of which is differentiating instruction. Differentiating instruction has been criticized for resource limitations and organizational resistance, and it lacks an easy-to-implement framework. However, through the framework provided by UDL, differentiating instruction can be implemented flexibly. In practice, UDL and differentiating instruction are complementary, but there is still a lack of research suggesting concrete implementation methods that apply both concepts at the same time. This study was conducted to investigate differentiating instruction strategies according to learner characteristics (readiness, interest, learning profile), the components of differentiating instruction (content, process, performance, learning environment), the UDL principles (representation, action and expression, engagement) present in differentiating instruction, and the implementation of UDL-based differentiating instruction through Planning for All Learners (PAL) and the UDL Lesson Plan Cycle. It is meaningful that such a series of studies can enhance the possibility of more concrete and realistic UDL-based teaching and learning strategies in the classroom, especially in inclusive settings.

Keywords: universal design for learning, differentiating instruction, UDL lesson plan, PAL

Procedia PDF Downloads 165
5893 Gender and Science: Is the Association Universal?

Authors: Neelam Kumar

Abstract:

Science is stratified, with an unequal distribution of research facilities and rewards among scientists. Gender stratification is one of the most prevalent phenomena in the world of science. In most countries, gender segregation, horizontal as well as vertical, stands out in the field of science and engineering, and India is no exception. This paper aims to examine: (1) gender and science associations, historical as well as contemporary; (2) women's enrolment and gender differences in the selection of academic fields; (3) women as professional researchers; and (4) career paths and recognition trajectories. The paper reveals that in recent years the gender-science relationship has changed, but it is not totally free from biases. Women's enrolment in various science disciplines has shown a remarkable and steady increase in most parts of the world, including India, yet women remain underrepresented in the S&T workforce, although to a lesser degree than in the past.

Keywords: gender, science, universal, women

Procedia PDF Downloads 276