Search results for: universal software radio peripheral
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6403

6163 The 6Rs of Radiobiology in Photodynamic Therapy: Review

Authors: Kave Moloudi, Heidi Abrahamse, Blassan P. George

Abstract:

Radiotherapy (RT) and photodynamic therapy (PDT) are both forms of cancer treatment that aim to kill cancer cells while minimizing damage to healthy tissue. The similarity between RT and PDT lies in their mechanism of action: both treatments use energy to damage cancer cells. RT uses high-energy radiation to damage the DNA of cancer cells, while PDT uses light energy to activate a photosensitizing agent, which produces reactive oxygen species (ROS) that damage the cancer cells. Both treatments require careful planning and monitoring to ensure the correct dose is delivered to the tumor while minimizing damage to surrounding healthy tissue. They are also often used in combination with other treatments, such as surgery or chemotherapy, to improve overall outcomes. However, there are also significant differences between RT and PDT. For example, RT is a non-invasive treatment that can be delivered externally or internally, while PDT requires the injection of a photosensitizing agent and the use of a specialized light source to activate it. Additionally, the side effects and risks associated with each treatment can vary. In this review, we focus on generalizing the 6Rs of radiobiology in PDT, which can open a window for the clinical application of radio-photodynamic therapy with minimal side effects. Furthermore, this review can offer new insights for the design of new radio-photosensitizer agents in radio-photodynamic therapy.

Keywords: radiobiology, photodynamic therapy, radiotherapy, 6Rs in radiobiology, ROS, DNA damage, cellular and molecular mechanisms, clinical application

Procedia PDF Downloads 109
6162 Software Evolution Based Activity Diagrams

Authors: Zine-Eddine Bouras, Abdelouaheb Talai

Abstract:

During the last two decades, the software evolution community has intensively tackled the software merging issue, whose main objective is to merge different versions of software in a consistent way in order to obtain a new version. Well-established approaches, mainly based on dependence analysis techniques, have been used to provide suitable solutions. These approaches operate on the source code or on software architectures; however, at those levels the solutions are expensive due to complexity and size. In this paper, we overcome this problem by operating at a higher level of abstraction. The objective of this paper is to investigate software merging at the level of UML activity diagrams, which is a new and interesting issue. Its purpose is to merge activity diagrams instead of source code. The proposed approach, based on dependence analysis techniques, is illustrated through an appropriate case study.
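
The dependence-based merging described above could be sketched roughly as follows. This is a minimal illustration, not the paper's algorithm: an activity diagram version is modeled as a plain dependence map, slicing is backward reachability, and merging is an edge union; all activity names are hypothetical.

```python
# Hypothetical sketch of dependence-based merging of activity diagrams.
# A diagram version is modeled as a dict mapping each activity to the set
# of activities it depends on (control/data dependence edges).

def slice_diagram(diagram, criterion):
    """Backward slice: the criterion plus everything it transitively depends on."""
    visited, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        stack.extend(diagram.get(node, set()))
    return visited

def merge_diagrams(base, v1, v2):
    """Merge two versions by taking the union of their dependence edges."""
    merged = {}
    for node in set(base) | set(v1) | set(v2):
        merged[node] = v1.get(node, set()) | v2.get(node, set())
    return merged

base = {"pay": {"validate"}, "validate": {"input"}, "input": set()}
v1 = dict(base, log={"pay"})      # version 1 adds a logging activity
v2 = dict(base, notify={"pay"})   # version 2 adds a notification activity
merged = merge_diagrams(base, v1, v2)
print(sorted(merged))
print(sorted(slice_diagram(merged, "pay")))
```

A real merger would also detect interference between the two versions (e.g., slices that overlap in conflicting ways) rather than blindly taking the union.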

Keywords: activity diagram, activity diagram slicing, dependency analysis, software merging

Procedia PDF Downloads 333
6161 Influence and Dissemination of Solecism among Moroccan High School and University Students

Authors: Rachid Ed-Dali, Khalid Elasri

Abstract:

Mass media seem to provide rich content for language acquisition. Exposure to television, the Internet, the mobile phone, and other technological gadgets and devices helps enrich the student’s lexicon both positively and negatively. The difficulties students encounter while learning and acquiring second languages, in addition to their eagerness to comprehend the content of a particular program, prompt them to diversify their methods so as to achieve their targets. The present study highlights the significance of certain media channels and their involvement in language acquisition, employing the Natural Approach to grasp whether students, especially secondary and high school students, learn and acquire errors through watching subtitled television programs. The chief objective is to investigate the deductive and inductive relevance of certain programs, besides the involvement of peripheral learning in the acquisition of mistakes.

Keywords: errors, mistakes, Natural Approach, peripheral learning, solecism

Procedia PDF Downloads 124
6160 Quality and Coverage Assessment in Software Integration Based On Mutation Testing

Authors: Iyad Alazzam, Kenneth Magel, Izzat Alsmadi

Abstract:

The different activities and approaches in software testing aim to find the largest possible number of errors or failures with the least possible effort. Mutation is a testing approach used to discover possible errors in the applications under test. This is accomplished by changing one aspect of the software from its original form and writing test cases that detect the change, or mutation. In this paper, we present a mutation approach for testing the integration aspects of software components. Several mutation operations related to component integration are described and evaluated. A case study of several open source projects is collected, and the proposed mutation operators are applied and evaluated. The results show insights and information that can help testing activities detect errors and improve coverage.

Keywords: software testing, integration testing, mutation, coverage, software design

Procedia PDF Downloads 435
6159 Calculating All Dark Energy and Dark Matter Effects through Dynamic Gravity Theory

Authors: Sean Michael Kinney

Abstract:

In 1666, Newton conceived the law of universal gravitation, and in 1915, Einstein improved on it to incorporate effects such as time dilation and gravitational lensing. But currently, there is a problem with this “universal” law: the math does not work outside the confines of our solar system, and something is missing, namely any evidence of what gravity actually is and how it manifests. This paper explores the notion that gravity must obey the law of conservation of energy, as all other forces in this universe have been shown to do. It explains what gravity is and how it manifests itself, examines the many implications this would create, and finally uses the math of Dynamic Gravity to calculate Dark Energy and Dark Matter effects, explaining all observations without the need for exotic measures.

Keywords: dynamic gravity, gravity, dark matter, dark energy

Procedia PDF Downloads 83
6158 Universal Design for Learning: Its Impact for Enhanced Performance in General Psychology

Authors: Jose Gay D. Gallego

Abstract:

This study examined the learning performance in General Psychology of 297 freshmen of the CPSU-Main through pre- and post-tests. The instructional intervention via Universal Design for Learning (UDL) was applied to 33% (97 out of 297) of these freshmen as the treatment group, while the remaining 67% (200) belonged to the control group receiving traditional instruction. Statistical inference utilized one-way analysis of variance for mean differences, Pearson correlations for bivariate relationships, and factor analysis for the significant components that contributed most to the UDL instruction. Findings showed very high levels of students’ acquired UDL skills. Pre-test results in General Psychology were low and average, respectively, when students were grouped into low and high achievers. There was no significant mean difference in the nine acquired UDL components when categorized into seven colleges, leading to the generalization that the colleges were at the same very high levels. Significant differences were found in three test areas of General Psychology across eight colleges, with students in the College of Teacher Education taking the lead in learning performance. Significant differences were also traced in the post-test in favor of the students in the treatment group, showing that UDL truly impacted the learning performance of the low-achieving students. Significant correlations were revealed between the components of UDL and General Psychology. Twenty-four significant itemized components contributed most to the UDL instructional intervention. Implications were emphasized for maximizing the principles of UDL through thoughtful planning related to the four curricular pillars of UDL: (a) instructional goals, (b) instructional delivery methods, (c) instructional materials, and (d) student assessments.

Keywords: universal design for learning, enhanced performance, teaching innovation, technology in education, social science area

Procedia PDF Downloads 279
6157 Improving Learning and Teaching of Software Packages among Engineering Students

Authors: Sara Moridpour

Abstract:

To meet emerging industry needs, engineering students must learn different software packages and enhance their computational skills. Traditionally, face-to-face delivery has been the preferred approach to teaching software packages. Face-to-face tutorials and workshops provide an interactive environment for learning software packages, where students can communicate with the teacher, interact with other students, evaluate their skills, and receive feedback. However, COVID-19 significantly limited face-to-face learning and teaching activities at universities. Worldwide lockdowns and the shift to online and remote learning and teaching provided the opportunity to introduce different strategies to enhance the interaction among students and teachers in online and virtual environments and to improve the learning and teaching of software packages in online and blended teaching methods. This paper introduces a blended strategy to teach engineering software packages to undergraduate students and evaluates the effectiveness of the proposed strategy by comparing the impact of face-to-face, online, and the proposed blended environments on students’ software skills. The paper evaluates the students’ software skills and their software learning through an authentic assignment. According to the results, the proposed blended teaching strategy successfully improves the software learning experience among undergraduate engineering students.

Keywords: teaching software packages, undergraduate students, blended learning and teaching, authentic assessment

Procedia PDF Downloads 119
6156 Analysis of Advanced Modulation Format Using Gain and Loss Spectrum for Long Range Radio over Fiber System

Authors: Shaina Nagpal, Amit Gupta

Abstract:

In this work, all-optical Stimulated Brillouin Scattering (SBS) is used to generate a single sideband with a suppressed carrier to provide better efficiency. The generation of a single-sideband, enhanced-carrier-power signal using the SBS technique is further used to strengthen the lower shifted sideband and to suppress the upshifted sideband. The generated single-sideband signals are able to work at high frequency ranges, and the generated single sideband is validated over a 90 km transmission using single-mode fiber with an acceptable bit error rate. The results for the equivalent techniques are then compared so that the most acceptable technique is chosen, and the quality required for optimum performance of the system is reported.

Keywords: stimulated Brillouin scattering, radio over fiber, upper sideband, quality factor

Procedia PDF Downloads 238
6155 Radiation Effects in the PVDF/Graphene Oxide Nanocomposites

Authors: Juliana V. Pereira, Adriana S. M. Batista, Jefferson P. Nascimento, Clascídia A. Furtado, Luiz O. Faria

Abstract:

Exposure to ionizing radiation has been found to induce changes in poly(vinylidene fluoride) (PVDF) homopolymers. The high dose gamma irradiation process induces the formation of C=C and C=O bonds in its [CH2-CF2]n main chain. The irradiation also provokes crosslinking and chain scission. All these radio-induced defects lead to changes in the PVDF crystalline structure. As a consequence, it is common to observe a decrease in the melting temperature (TM) and melting latent heat (LM) and some changes in its ferroelectric features. We have investigated the possibility of preparing nanocomposites of PVDF with graphene oxide (GO) through the radio-induction of molecular bonds. In this work, we discuss how the gamma radiation interacts with the nanocomposite crystalline structure.

Keywords: gamma irradiation, graphene oxide, nanocomposites, PVDF

Procedia PDF Downloads 289
6154 Influence of Thickness on Optical Properties of ZnO Thin Films Prepared by Radio Frequency (RF) Sputtering Technique

Authors: S. Abdullahi, M. Momoh, K. U. Isah

Abstract:

Zinc oxide (ZnO) thin films of 75.5 nm and 130.5 nm were deposited at room temperature onto chemically and ultrasonically cleaned Corning glass substrates by the radio frequency sputtering technique and annealed at 150°C under a nitrogen atmosphere for 60 minutes. The optical properties of the films were ascertained by UV-VIS-NIR spectrophotometry. The influence of the thickness of the films on their optical properties was studied, keeping the other deposition parameters constant. The optical transmittance spectra reveal maximum transmittances of 81.49% and 84.26%, respectively. The band gap of the films is found to correspond to a direct allowed transition and decreases with increasing film thickness, with band gap energies (Eg) in the range of 3.28 eV to 3.31 eV. These thin films are suitable for solar cell applications.

Keywords: optical constants, RF sputtering, Urbach energy, zinc oxide thin film

Procedia PDF Downloads 461
6153 Automated Detection of Related Software Changes by Probabilistic Neural Networks Model

Authors: Yuan Huang, Xiangping Chen, Xiaonan Luo

Abstract:

Modern software is continuously updated. The change between two versions usually involves multiple program entities (e.g., packages, classes, methods, attributes) and multiple purposes (e.g., changed requirements, bug fixing). It is hard for developers to understand which changes are made for the same purpose, and whether two changes are related is not decided solely by the relationship between the two entities in the program. In this paper, we summarize 4 coupling rules (16 instances) and 4 state-combination types at the class, method, and attribute levels for software change. A Related Change Vector (RCV) is defined based on the coupling rules and state-combination types and applied to classify related software changes using a Probabilistic Neural Network (PNN) during software updating.
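
The classification step could be sketched with a textbook PNN, i.e., a Parzen-window density estimate per class with a Gaussian kernel. This is a generic illustration only: the feature vectors, class labels, and kernel width below are hypothetical stand-ins for the paper's Related Change Vectors and training data.

```python
import math

# Hypothetical sketch of classifying change pairs as related/unrelated with
# a Probabilistic Neural Network: each class's score is the average Gaussian
# kernel density of the training samples at the query point.

def pnn_classify(x, training, sigma=0.5):
    """Return the class whose kernel-density estimate at x is highest."""
    scores = {}
    for label, samples in training.items():
        scores[label] = sum(
            math.exp(-sum((a - b) ** 2 for a, b in zip(x, s)) / (2 * sigma ** 2))
            for s in samples
        ) / len(samples)
    return max(scores, key=scores.get)

training = {
    "related":   [(1.0, 1.0), (0.9, 0.8)],   # e.g. strong logical coupling
    "unrelated": [(0.0, 0.1), (0.1, 0.0)],
}
label = pnn_classify((0.95, 0.9), training)
print(label)
```

In the paper's setting, each vector component would be derived from the coupling rules and state-combination types rather than chosen by hand.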

Keywords: PNN, related change, state-combination, logical coupling, software entity

Procedia PDF Downloads 441
6152 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural language phenomena. In this paper, we study the positional alteration of phrases, by which the impact of a natural language proposition gets affected and/or modified. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations, derive the classification and modification of the meaning of words due to positional alteration, and present the results with reference to set-theoretic interpretations.

Keywords: appositive, computing with words, possibilistic relational universal fuzzy (PRUF), semantic sentiment analysis, set-theoretic interpretations

Procedia PDF Downloads 167
6151 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Prediction

Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez

Abstract:

This paper approaches cognitive radio techniques and applies a pure proactive handoff model to decrease interference between the primary user (PU) and the secondary user (SU), comparing it with a reactive handoff model. The study analyzes the multi-criteria decision models SAW and TOPSIS combined with three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are used to evaluate the best model: number of failed handoffs, number of handoffs, number of predictions, and number of interference events. The results show the advantage of using this type of pure proactive model to predict changes in the PU according to the selected channel and to reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in the interference reduction.
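
The TOPSIS half of the selection step is a standard, easily computable procedure. The sketch below ranks candidate channels; the criteria, weights, and scores are illustrative assumptions (all criteria are treated as "higher is better" for brevity), not values from the paper.

```python
import math

# Hypothetical sketch of TOPSIS channel ranking for spectrum handoff.
# Each row scores a candidate channel on benefit criteria; weights are assumed.

def topsis(matrix, weights):
    # 1. Vector-normalize each criterion column, then apply the weights.
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    weighted = [
        [w * v / n for v, w, n in zip(row, weights, norms)] for row in matrix
    ]
    # 2. Ideal and anti-ideal solutions (all criteria treated as benefits).
    ideal = [max(col) for col in zip(*weighted)]
    anti = [min(col) for col in zip(*weighted)]
    # 3. Relative closeness to the ideal solution.
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

channels = [
    [0.9, 0.7, 0.8],   # channel A: availability, SNR, bandwidth (assumed)
    [0.5, 0.9, 0.6],   # channel B
    [0.3, 0.4, 0.9],   # channel C
]
scores = topsis(channels, weights=[0.5, 0.3, 0.2])
best = max(range(len(scores)), key=scores.__getitem__)
print(scores, best)
```

In the proactive model, the time-series predictor (AR, MA, or ARMA) would supply forecasts of these criteria before the PU returns, and TOPSIS (or SAW) would then pick the target channel.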

Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks

Procedia PDF Downloads 495
6150 Network Conditioning and Transfer Learning for Peripheral Nerve Segmentation in Ultrasound Images

Authors: Harold Mauricio Díaz-Vargas, Cristian Alfonso Jimenez-Castaño, David Augusto Cárdenas-Peña, Guillermo Alberto Ortiz-Gómez, Alvaro Angel Orozco-Gutierrez

Abstract:

Precise identification of nerves is a crucial task performed by anesthesiologists for effective Peripheral Nerve Blocking (PNB). Anesthesiologists now use ultrasound imaging equipment to guide the PNB and detect nervous structures. However, visual identification of nerves from ultrasound images is difficult, even for trained specialists, due to artifacts and low contrast. Recent advances in deep learning make neural networks a potential tool for accurate nerve segmentation systems, addressing the above issues from raw data. The widely used U-Net network yields pixel-by-pixel segmentation by encoding the input image and decoding the attained feature vector into a semantic image. This work proposes a conditioning approach and encoder pre-training to enhance the nerve segmentation of traditional U-Nets. Conditioning is achieved by one-hot encoding the kind of target nerve at the network input, while the pre-training considers five well-known deep networks for image classification. The proposed approach is tested on a collection of 619 ultrasound images, where the best C-UNet architecture yields an 81% Dice coefficient, outperforming the 74% of the best traditional U-Net. The results prove that pre-trained models with the conditioning approach outperform their equivalent baselines by supporting the learning of new features and enriching the discriminant capability of the tested networks.
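
Two of the building blocks above are simple enough to sketch directly: appending a one-hot "target nerve" conditioning channel to the input, and the Dice coefficient used for evaluation. This is a toy illustration in plain Python; the nerve names, image, and masks are hypothetical, and a real pipeline would use tensors.

```python
# Hypothetical sketch of (a) one-hot conditioning at the network input and
# (b) the Dice coefficient used to score a segmentation.

NERVE_TYPES = ["ulnar", "median", "sciatic"]  # assumed target nerve classes

def condition_input(image, nerve):
    """Append one constant-valued channel per nerve type (one-hot encoding)."""
    h, w = len(image), len(image[0])
    onehot = [1.0 if n == nerve else 0.0 for n in NERVE_TYPES]
    return [image] + [[[v] * w for _ in range(h)] for v in onehot]

def dice(pred, truth):
    """Dice coefficient between two binary masks given as flat lists."""
    inter = sum(p * t for p, t in zip(pred, truth))
    return 2 * inter / (sum(pred) + sum(truth))

image = [[0.2, 0.8], [0.5, 0.1]]          # toy 2x2 ultrasound patch
x = condition_input(image, "median")
print(len(x))                              # image channel + 3 one-hot channels
print(dice([1, 1, 0, 0], [1, 0, 0, 0]))
```

The conditioning channels let a single network specialize its predictions per nerve type without training one model per nerve.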

Keywords: nerve segmentation, U-Net, deep learning, ultrasound imaging, peripheral nerve blocking

Procedia PDF Downloads 112
6149 Code Refactoring Using Slice-Based Cohesion Metrics and AOP

Authors: Jagannath Singh, Durga Prasad Mohapatra

Abstract:

Software refactoring is essential for maintaining software quality. The usual practice is to design the software first and then code it. But after coding is completed, if the requirements change slightly or the expected output is not achieved, the code is changed; for each small code change, we cannot change the design, and over time these small changes cause the software design to decay. Software refactoring restructures the code in order to improve the design and quality of the software. In this paper, we propose an approach for performing code refactoring. We use slice-based cohesion metrics to identify the target methods that require refactoring. After identifying the target methods, we use program slicing to divide each target method into two parts. Finally, we use concepts from aspect-oriented programming (AOP) to adjust the code structure so that the external behaviour of the original module does not change.
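
The identification step can be illustrated with two classic slice-based cohesion metrics, tightness and overlap. This is a generic sketch, not the paper's exact metric suite: a method is modeled as statements 1..n, and each output variable contributes a slice given here as a hand-picked set of statement numbers.

```python
# Hypothetical sketch of slice-based cohesion metrics used to flag methods
# for refactoring. Each slice is the set of statements contributing to one
# output variable of the method.

def tightness(slices, n):
    """Fraction of the method's n statements common to every slice."""
    common = set.intersection(*slices)
    return len(common) / n

def overlap(slices):
    """Average fraction of each slice lying in the slices' intersection."""
    common = set.intersection(*slices)
    return sum(len(common) / len(s) for s in slices) / len(slices)

# A method with 8 statements and two output variables whose slices barely
# intersect: a low-cohesion candidate to split in two via program slicing.
slices = [{1, 2, 3, 4}, {1, 5, 6, 7, 8}]
t, o = tightness(slices, 8), overlap(slices)
print(t, o)   # low values suggest the method mixes two separable concerns
```

A method scoring below some threshold on these metrics would be selected as a refactoring target, with the two slices indicating the natural split.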

Keywords: software refactoring, program slicing, AOP, cohesion metrics, code restructure, AspectJ

Procedia PDF Downloads 517
6148 Financial Information and Collective Bargaining: Conflicting or Complementing

Authors: Humayun Murshed, Shibly Abdullah

Abstract:

The research conducted in the early seventies apparently assumed the existence of a universal decision model for union negotiators and, furthermore, tended to regard financial information as a ‘neutral’ input into a rational decision-making process. However, research in the eighties began to question the neutrality of financial information as an input in collective bargaining, viewing it instead as a potentially effective means of controlling the labour force. This later research also started challenging the simplistic assumptions, particularly those relating to union objectives, which had underpinned the earlier search for universal union decision models. Despite the above developments, there seems to be a dearth of studies in developing countries concerning the use of financial information in collective bargaining. This paper seeks to begin to remedy this deficiency. Utilising a case study approach based on two enterprises, one in the public sector and the other a multinational, the universal decision model is rejected, and it is argued that the decision whether or not to use financial information is a contingent one, largely defined by the context and environment in which both union and management negotiators work. An attempt is also made to identify the factors constraining as well as promoting the use of financial information in collective bargaining, these being regarded as unique to the organizations within which the case studies were conducted.

Keywords: collective bargaining, developing countries, disclosures, financial information

Procedia PDF Downloads 475
6147 User-Perceived Quality Factors for Certification Model of Web-Based System

Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh

Abstract:

One of the most essential issues for software products is maintaining their relevancy to the dynamics of users’ requirements and expectations. Many studies of the quality aspects of software products have been carried out to address these problems, and previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and credibility of software products, certification models have been introduced and developed. From our previous experiences in certification exercises and case studies, conducted in collaboration with several agencies in Malaysia, the requirements for a user-based software certification approach were identified and found to be in demand. The emergence of social network applications, new development approaches such as agile methods, and the wider variety of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through Internet applications, users are becoming more critical of the quality of the services the software provides. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics are identified through a brainstorming approach that includes researchers, users, and experts in this area. This new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints and improving the application of the software certification model in the future.

Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system

Procedia PDF Downloads 408
6146 Archetypes in the Rorschach Inkblots: Imparting Universal Meaning in the Face of Ambiguity

Authors: Donna L. Roberts

Abstract:

The theory of archetypes contends that themes based on universal foundational images reside in and are transmitted generationally through the collective unconscious, which is referenced throughout an individual’s experience in order to make sense of that experience. There is, then, a profoundly visceral and instinctive agreement on the gestalt of these universal themes and how they apply to the human condition throughout space and time. The inherent nature of projective tests, such as the Rorschach Inkblot, necessitates that the stimulus be ambiguous, thus eliciting responses that reflect the unconscious inner psyche of the respondent. As the development of the Rorschach inkblots was relatively random and serendipitous - i.e., the inkblots were not engineered to elicit a specifically defined response - it would stand to reason that without a collective unconscious, every individual would interpret the inkblots in an individualized and unique way. Yet this is not the case. Instead, common themes appear in the images of the inkblots and their interpretation that reflect this deeper iconic understanding. This study analyzed the ten Rorschach inkblots in terms of Jungian archetypes, both with respect to the form of images on each plate and the commonly observed themes in responses. Examples of the archetypes were compared to each of the inkblots, with subsequent descriptions matched to the standard responses. The findings yielded clear and distinct instances of the universal symbolism intrinsic in the inkblot images as well as ubiquitous throughout the responses. This project illustrates the influence of the theories of psychologist Carl Gustav Jung on the interpretation of ambiguous stimuli. It further serves to demonstrate the merit of Jungian psychology as a valuable tool with which to understand the nature of projective tests in general, Rorschach’s work specifically, and ultimately the broader implications for our collective unconscious and common humanity.

Keywords: archetypes, inkblots, projective tests, Rorschach

Procedia PDF Downloads 111
6145 Effects of Handgrip Isometric Training in Blood Pressure of Patients with Peripheral Artery Disease

Authors: Raphael M. Ritti-Dias, Marilia A. Correia, Wagner J. R. Domingues, Aline C. Palmeira, Paulo Longano, Nelson Wolosker, Lauro C. Vianna, Gabriel G. Cucato

Abstract:

Patients with peripheral arterial disease (PAD) have a high prevalence of hypertension, which contributes to a high risk of acute cardiovascular events and cardiovascular mortality. Strategies to reduce the cardiovascular risk of these patients are needed. Meta-analysis studies have shown that isometric handgrip training promotes reductions in clinical blood pressure in normotensive, pre-hypertensive, and hypertensive individuals. However, the effect of this exercise training on other cardiovascular function indicators in PAD patients remains unknown. Thus, the aim of this study was to analyze the effects of isometric handgrip training on blood pressure in patients with PAD. In this clinical trial, 28 patients were randomly allocated into two groups: isometric handgrip training (HG) and control (CG). The HG group conducted unilateral handgrip training three days per week (four sets of two minutes at 30% of maximum voluntary contraction, with an interval of four minutes between sets). The CG was encouraged to increase their physical activity levels. At baseline and after eight weeks, blood pressure and heart rate were obtained. A two-way repeated measures ANOVA with group (HG and CG) and time (pre- and post-intervention) as factors was performed. After 8 weeks of training, there were no significant changes in systolic blood pressure (HG pre 141 ± 24.0 mmHg vs. HG post 142 ± 22.0 mmHg; CG pre 140 ± 22.1 mmHg vs. CG post 146 ± 16.2 mmHg; P=0.18), diastolic blood pressure (HG pre 74 ± 10.4 mmHg vs. HG post 74 ± 11.9 mmHg; CG pre 72 ± 6.9 mmHg vs. CG post 74 ± 8.0 mmHg; P=0.22) or heart rate (HG pre 61 ± 10.5 bpm vs. HG post 62 ± 8.0 bpm; CG pre 64 ± 11.8 bpm vs. CG post 65 ± 13.6 bpm; P=0.81). In conclusion, our preliminary data indicate that isometric handgrip training did not modify blood pressure or heart rate in patients with PAD.

Keywords: blood pressure, exercise, isometric, peripheral artery disease

Procedia PDF Downloads 336
6144 Hybrid Approach for the Min-Interference Frequency Assignment

Authors: F. Debbat, F. T. Bendimerad

Abstract:

Efficient frequency assignment for radio communications becomes more and more crucial as new information technologies and their applications are developed. It consists in defining an assignment of frequencies to the radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference; however, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. The problem is NP-hard and cannot be solved by conventional optimization algorithms; it is therefore necessary to use metaheuristic methods to solve it. This paper proposes a hybrid approach based on simulated annealing (SA) and Tabu Search (TS) to solve this problem. Computational results, obtained on a number of standard problem instances, testify to the effectiveness of the proposed approach.
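
The simulated-annealing half of such a hybrid could be sketched as follows. This is a minimal, generic SA for min-interference assignment, not the paper's algorithm: the interference graph, the required separation, the cooling schedule, and the frequency set are all assumptions, and the tabu-search component is omitted.

```python
import math
import random

# Hypothetical sketch of simulated annealing for min-interference frequency
# assignment: links joined by an interference edge must receive frequencies
# at least `sep + 1` apart; the cost counts violated separation constraints.

def cost(assign, edges, sep=1):
    return sum(1 for a, b in edges if abs(assign[a] - assign[b]) < sep + 1)

def anneal(n_links, edges, freqs, steps=5000, t0=2.0, seed=0):
    rng = random.Random(seed)
    assign = [rng.choice(freqs) for _ in range(n_links)]
    best, best_cost = assign[:], cost(assign, edges)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9           # linear cooling schedule
        cand = assign[:]
        cand[rng.randrange(n_links)] = rng.choice(freqs)   # random reassignment
        delta = cost(cand, edges) - cost(assign, edges)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            assign = cand
            if cost(assign, edges) < best_cost:
                best, best_cost = assign[:], cost(assign, edges)
    return best, best_cost

# 4 mutually interfering links; 4 well-separated frequencies are available,
# so a zero-violation assignment exists.
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)]
best, violations = anneal(4, edges, freqs=[0, 2, 4, 6])
print(best, violations)
```

In the hybrid, a tabu list would forbid recently reversed moves, helping the search escape the local optima that plain SA can cycle around at low temperature.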

Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing

Procedia PDF Downloads 392
6143 New Standardized Framework for Developing Mobile Applications (Based On Real Case Studies and CMMI)

Authors: Ammar Khader Almasri

Abstract:

Software processes play a vital role in delivering a high-quality software system that meets users’ needs. There are many software development models used by system developers, which can be categorized into two groups: traditional and new methodologies. Mobile applications, like desktop applications, need an appropriate and well-functioning software development process. Nevertheless, mobile applications have distinctive characteristics that limit their performance and efficiency, such as application size and mobile hardware constraints. This research aims to help developers use a standardized model for developing mobile applications.

Keywords: software development process, agile methods, mobile application development, traditional methods

Procedia PDF Downloads 390
6142 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications are widely adopted for business purposes. For any software industry, developing reliable software is a challenging task, because a faulty software module may harm the growth of the industry and the business. Hence there is a need for techniques that allow early prediction of software defects. Due to the complexity of manual prediction, automated software defect prediction techniques have been introduced. These techniques learn patterns from previous software versions and find the defects in the current version, and they have attracted researchers due to their significant impact on industrial growth through identifying bugs in software. Several studies have been carried out on this basis, but achieving desirable defect prediction performance is still a challenging task. To address this issue, we present a machine-learning-based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented, in which an improved fitness function is used for better optimization of the features in the data sets. These features are then processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT hybrid approach are compared with those from the DT classification technique alone. The results show that the proposed hybrid approach achieves better classification accuracy.
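
The GA-plus-tree pipeline could be sketched in miniature as follows. Everything here is a hypothetical stand-in: the dataset is synthetic, the "tree" is reduced to a one-level decision stump, and the fitness function (accuracy minus a small per-feature penalty) is an illustrative guess, since the paper does not give its improved fitness function.

```python
import random

# Hypothetical sketch of GA feature selection with a decision stump as the
# fitness evaluator, standing in for the paper's GA + Decision Tree pipeline.

DATA = [  # (features f0..f3, defect label); f1 is the informative feature
    ((0.1, 0.9, 0.3, 0.2), 1), ((0.7, 0.8, 0.1, 0.9), 1),
    ((0.4, 0.7, 0.6, 0.1), 1), ((0.9, 0.2, 0.4, 0.8), 0),
    ((0.3, 0.1, 0.8, 0.5), 0), ((0.6, 0.3, 0.2, 0.7), 0),
]

def stump_accuracy(feature):
    """Accuracy of the best threshold split on one feature."""
    best = 0.0
    for x, _ in DATA:
        thr = x[feature]
        acc = sum((xi[feature] >= thr) == bool(y) for xi, y in DATA) / len(DATA)
        best = max(best, acc)
    return best

def fitness(mask):
    chosen = [i for i, bit in enumerate(mask) if bit]
    if not chosen:
        return 0.0
    # Assumed fitness: best chosen feature's accuracy minus a size penalty.
    return max(stump_accuracy(i) for i in chosen) - 0.01 * len(chosen)

def ga(n_features=4, pop=8, gens=20, seed=1):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_features)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]            # elitist selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]               # one-point crossover
            if rng.random() < 0.2:                  # bit-flip mutation
                j = rng.randrange(n_features)
                child[j] ^= 1
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best_mask = ga()
print(best_mask)
```

In the full pipeline, the selected feature subset would train a real decision tree (e.g., CART), and fitness would come from cross-validated accuracy rather than a single stump.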

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 332
6141 Assessing the Current State of Software Engineering and Information Technology in Ghana

Authors: David Yartel

Abstract:

Drawing on the current state of software engineering and information technology in Ghana, the study documents their significant contribution to the development of Ghanaian industries. The study focuses on the application of modern technology trends and the barriers faced in software engineering and information technology. A thorough analysis of a dozen interviews with stakeholders in software engineering and information technology reveals how modern trends in software engineering pose challenges to the industry in Ghana. Results show that to meet the expectations raised by these trends, stakeholders need skilled professionals, adequate infrastructure, and stronger support for technology startups. Individuals should also be encouraged to pursue careers in software engineering and information technology, as these fields can increase the efficiency and effectiveness of work-related activities. The study recommends that stakeholders in the software engineering and technology industries invest sufficiently in training more professionals by collaborating with experienced international institutions and organizing frequent training sessions and seminars. The government should also provide funding opportunities for small businesses in the technology sector to drive creativity and development, and thereby growth.

Keywords: software engineering, information technology, Ghana, development

Procedia PDF Downloads 98
6140 The Evaluation Model for the Quality of Software Based on Open Source Code

Authors: Li Donghong, Peng Fuyang, Yang Guanghua, Su Xiaoyan

Abstract:

Using open source code is a popular method of software development, which makes evaluating software quality increasingly important. This paper introduces an evaluation model that assesses quality along four dimensions: technology, production, management, and development. Each dimension comprises many indicators, and the weight of each indicator can be modified to match the purpose of the evaluation. The paper also describes how to apply the model; the evaluation results can inform decisions about adopting or purchasing the software.
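A weighted-indicator aggregation of the kind the model describes can be sketched as follows. The four dimensions match the paper, but the indicator scores, the weights, and the `weighted_score` helper are illustrative assumptions rather than the paper's actual indicator set.

```python
def weighted_score(indicators):
    # indicators: list of (score, weight) pairs; weights are normalized here,
    # so they can be modified freely to match the purpose of the evaluation.
    total = sum(w for _, w in indicators)
    return sum(s * w for s, w in indicators) / total

def evaluate_software(model):
    # model: {dimension: {"weight": w, "indicators": [(score, weight), ...]}}
    per_dim = {d: weighted_score(v["indicators"]) for d, v in model.items()}
    total_w = sum(v["weight"] for v in model.values())
    overall = sum(per_dim[d] * model[d]["weight"] for d in model) / total_w
    return overall, per_dim

# Illustrative scores on a 0-10 scale across the paper's four dimensions
model = {
    "technology":  {"weight": 0.3, "indicators": [(8, 2), (6, 1)]},
    "production":  {"weight": 0.2, "indicators": [(7, 1)]},
    "management":  {"weight": 0.2, "indicators": [(5, 1), (9, 1)]},
    "development": {"weight": 0.3, "indicators": [(6, 1)]},
}
overall, per_dim = evaluate_software(model)
```

Because the weights are normalized at both levels, an evaluator can re-weight indicators for a purchasing decision versus a contribution decision without changing the rest of the model.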

Keywords: evaluation model, software quality, open source code, evaluation indicator

Procedia PDF Downloads 394
6139 Fabrication of Textile-Based Radio Frequency Metasurfaces

Authors: Adria Kajenski, Guinevere Strack, Edward Kingsley, Shahriar Khushrushahi, Alkim Akyurtlu

Abstract:

Radio Frequency (RF) metasurfaces are arrangements of subwavelength elements interacting with electromagnetic radiation. These arrangements affect the polarization state, amplitude, and phase of impinging radio waves; for example, metasurface designs are used to produce functional passband and stopband filters. Recent advances in additive manufacturing techniques have enabled the low-cost, rapid fabrication of ultra-thin metasurface elements on flexible substrates such as plastic films, paper, and textiles. Furthermore, scalable manufacturing processes promote the integration of fabric-based RF metasurfaces into the market of sensors and devices within the Internet of Things (IoT). The design and fabrication of metasurfaces on textiles require a multidisciplinary team with expertise in i) textile and materials science, ii) metasurface design and simulation, and iii) metasurface fabrication and testing. In this presentation, we will discuss RF metasurfaces on fabric with an emphasis on how the materials, including fabric and inks, along with the fabrication techniques, affect RF performance. We printed metasurfaces using a direct-write approach onto various woven and non-woven fabrics, as well as onto fabrics coated with either thermoplastic or thermoset coatings. Our team also evaluated the printed structures across different inks and their curing parameters, testing wash durability, abrasion resistance, and RF performance over time.

Keywords: electronic textiles, metasurface, printed electronics, flexible

Procedia PDF Downloads 198
6138 A Study on the Implementation of Differentiating Instruction Based on Universal Design for Learning

Authors: Yong Wook Kim

Abstract:

The diversity of students in regular classrooms in South Korea is increasing due to the expansion of inclusive education and the growing number of multicultural students. In this diverse classroom environment, universal design for learning (UDL) has been proposed as a way to meet both the educational needs of students and social expectations for student achievement. UDL offers a variety of practical teaching methods, one of which is differentiating instruction. Differentiating instruction has been criticized for resource limitations, organizational resistance, and the lack of an easy-to-implement framework; however, the framework provided by UDL allows differentiating instruction to be implemented flexibly. In practice, UDL and differentiating instruction are complementary, but there is still a lack of research suggesting specific methods that apply both concepts at the same time. This study investigates the effects of differentiating instruction strategies according to learner characteristics (readiness, interest, learning profile), the components of differentiating instruction (content, process, performance, learning environment), the UDL principles (representation, action and expression, engagement) embedded in differentiating instruction, and the implementation of UDL-based differentiating instruction through the Planning for All Learners (PAL) process and the UDL Lesson Plan Cycle. Such a series of studies can enhance the possibility of more concrete and realistic UDL-based teaching and learning strategies in the classroom, especially in inclusive settings.

Keywords: universal design for learning, differentiating instruction, UDL lesson plan, PAL

Procedia PDF Downloads 200
6137 TRAC: A Software Based New Track Circuit for Traffic Regulation

Authors: Jérôme de Reffye, Marc Antoni

Abstract:

Following the development of the ERTMS system, we think it is worthwhile to develop another software-based track circuit system that fits secondary railway lines, with an easy implementation and low sensitivity to rail-wheel impedance variations. We call this track circuit 'Track Railway by Automatic Circuits' (TRAC). To be implemented internationally, the system must not rely on any mechanical component and must be compatible with existing track circuit systems. For example, it is independent of the French 'Joints Isolants Collés' that isolate track sections from one another, and equally independent of the axle counters used in Germany ('compteur d'essieux' in French). This track circuit is therefore fully interoperable. Such universality is obtained by replacing the mechanical train-detection system with space-time filtering of the train position. Each track section is defined by the frequency of a continuous signal, and the set of frequencies assigned to the track sections forms a set of orthogonal functions in a Hilbert space. The failure probability of track-section separation can thus be calculated precisely from the signal-to-noise ratio (SNR), which is a function of the traction current conducted by the rails. For this reason, we developed a very powerful algorithm to reject noise and jamming and obtain an SNR compatible with the precision required for the track circuit and with the SIL 4 safety level, which is reachable by adjusting the set of orthogonal functions. Our major contributions to railway signalling engineering are: i) train localization in space is precisely defined by a calibration system; the operation bypasses the GSM-R radio of the ERTMS system, the track circuit is naturally protected against radio-type jammers, and after calibration the track circuit is autonomous; ii) a mathematical topology adapted to train localization follows the train through linear time filtering of the received signal; track sections are defined numerically and can be modified with a software update. The system was numerically simulated, and the results exceeded our expectations: we achieved a precision of one meter, and the rail-ground and rail-wheel impedance sensitivity analyses gave excellent results. The results are now complete and ready for publication. This work was initiated as a research project of the French Railways, developed by the Pi-Ramses Company under SNCF contract, and required five years to obtain these results. The track circuit already reaches Level 3 of the ERTMS system and will be much cheaper to implement and operate. Traffic regulation is based on variable-length track sections: as traffic grows, the maximum speed is reduced and the track-section lengths decrease. This is possible if the elementary track section is correctly defined for the minimum speed and if every track section can emit at variable frequencies.
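The frequency-based section separation can be illustrated with a small sketch: tones at distinct integer frequencies are orthogonal over the integration window, so correlating the received signal against each section's reference tone identifies the occupied section even in noise. The sample rate, section frequencies, and noise level below are illustrative assumptions, not the paper's parameters.

```python
import math
import random

FS = 8000                      # sample rate in Hz (illustrative)
N = FS                         # one-second integration window
SECTION_FREQS = {"S1": 100.0, "S2": 200.0, "S3": 300.0}  # orthogonal over 1 s

def correlate(signal, freq):
    # Inner product with a section's reference tone: near zero for the
    # orthogonal (unoccupied) sections, ~ the tone amplitude for the match.
    c = sum(signal[i] * math.sin(2 * math.pi * freq * i / FS)
            for i in range(len(signal)))
    return 2 * c / len(signal)

# Simulated received signal: section S2 is occupied, plus additive noise
rng = random.Random(0)
sig = [math.sin(2 * math.pi * SECTION_FREQS["S2"] * i / FS) + rng.gauss(0, 0.5)
       for i in range(N)]

detected = max(SECTION_FREQS, key=lambda s: abs(correlate(sig, SECTION_FREQS[s])))
```

Lengthening the integration window improves the SNR of each correlation, which is the lever the paper adjusts to reach the SIL 4 failure-probability target.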

Keywords: track section, track circuits, space-time crossing, adaptive track section, automatic railway signalling

Procedia PDF Downloads 337
6136 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels

Authors: Ahmed Mahmoud Ahmed Abouelmagd

Abstract:

Diversity is the usual remedy for transmitted signal level variations (fading) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine them to combat fading. The basic concept is to transmit the signal via several independent diversity branches, obtaining independent signal replicas in the time, frequency, space, and polarization diversity domains. Coding and decoding can be an alternative remedy for fading: it cannot increase the channel capacity, but it can improve the error performance. In this paper, we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared with those obtained from two optimized selection space diversity techniques. The performance over a Rayleigh fading channel, the model considered for radio frequency channels, is evaluated for each case. The evaluation shows that the coding and decoding approaches, especially BCH coding with the replication decoding scheme, outperform the optimized selection space diversity approaches. An approach combining the coding-and-decoding diversity with space diversity is also considered; its main disadvantage is complexity, but it yields good performance.
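The baseline benefit of selection space diversity over a Rayleigh fading channel can be sketched with a Monte Carlo estimate of outage probability. The threshold and trial count below are illustrative, and this is a plain baseline rather than the optimized selection techniques the paper compares against.

```python
import math
import random

def rayleigh_power(rng):
    # Power gain of a Rayleigh-fading branch: |h|^2 with h = x + jy and
    # x, y ~ N(0, 1/2); exponentially distributed with mean 1.
    x = rng.gauss(0, math.sqrt(0.5))
    y = rng.gauss(0, math.sqrt(0.5))
    return x * x + y * y

def outage_prob(n_branches, threshold=0.1, trials=20000, seed=7):
    # Selection diversity: an outage occurs only when *every* independent
    # branch fades below the threshold simultaneously.
    rng = random.Random(seed)
    outages = sum(
        max(rayleigh_power(rng) for _ in range(n_branches)) < threshold
        for _ in range(trials)
    )
    return outages / trials
```

For a single branch the outage probability is 1 - exp(-threshold) ≈ 0.095 at a threshold of 0.1; adding a second independent branch roughly squares that probability, which is the gap that coded transmission must beat to be the better remedy.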

Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity

Procedia PDF Downloads 445
6135 Extending the AOP Joinpoint Model for Memory and Type Safety

Authors: Amjad Nusayr

Abstract:

Software security is a general term for any software architecture or model in which security aspects are incorporated into the architecture without being part of the main logic of the underlying program. Software security can be achieved using a combination of approaches, including but not limited to secure software designs, third-party component validation, and secure coding practices. Memory safety is one feature of software security that ensures every object in memory is accessed through a valid pointer or a reference with a valid type. Aspect-Oriented Programming (AOP) is a paradigm concerned with capturing cross-cutting concerns in code development; it is generally used for common cross-cutting concerns like logging and database transaction management. In this paper, we introduce the concepts that enable AOP to be used for memory and type safety, and we present ideas for extending AOP in software security practices.
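A decorator-based sketch can illustrate the idea (Python is not an AOP language, so the decorator stands in for advice woven at a call joinpoint): the memory- and type-safety checks live entirely outside the function's main logic. The `store` function and its checks are hypothetical examples, not the paper's joinpoint model.

```python
import functools
import inspect

def type_safety_advice(func):
    # "Around" advice applied at the call joinpoint: every argument is
    # checked against its annotation and null references are rejected,
    # keeping the safety concern out of the function's main logic.
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            if value is None:
                raise TypeError(f"null reference passed for '{name}'")
            ann = sig.parameters[name].annotation
            if ann is not inspect.Parameter.empty and not isinstance(value, ann):
                raise TypeError(
                    f"'{name}' expects {ann.__name__}, got {type(value).__name__}"
                )
        return func(*args, **kwargs)

    return wrapper

@type_safety_advice
def store(buffer: list, index: int, value: int) -> None:
    # The main logic contains no safety checks; the aspect supplies them.
    buffer[index] = value
```

Calling `store` with a mistyped or null argument raises before the body runs, which mirrors how woven advice would enforce memory and type safety at every matching joinpoint.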

Keywords: aspect oriented programming, programming languages, software security, memory and type safety

Procedia PDF Downloads 135
6134 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques hold prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of software lifecycle models, it is crucial to develop a flexible system that can account for the factors that ultimately affect the quality of the end product. These factors include properties of the software development process and the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. With this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent eliminating design errors and on costly maintenance. The technique can be brought into practical use through successful training.
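A single perceptron trained on toy quality metrics can illustrate the proposed prediction technique. The features (defect density and complexity, normalized to [0, 1]) and the training data below are hypothetical stand-ins for the process and product properties the paper describes.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    # samples: list of (feature_vector, label) with label 1 = low quality
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                       # classic perceptron update
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical training data: (defect density, complexity), both in [0, 1];
# label 1 marks a low-quality module, 0 an acceptable one.
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
w, b = train_perceptron(data)
```

Once trained on historical projects, the same `predict` call could flag a module as high-risk during early lifecycle phases, before design errors become expensive to fix.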

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 321