Search results for: component based software engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 33200

32720 Software Development to Empower Digital Libraries with Effortless Digital Cataloging and Access

Authors: Abdul Basit Kiani

Abstract:

The software for the digital library system is a cutting-edge solution designed to revolutionize the way libraries manage and provide access to their vast collections of digital content. This advanced software leverages the power of technology to offer a seamless and user-friendly experience for both library staff and patrons. By implementing this software, libraries can efficiently organize, store, and retrieve digital resources, including e-books, audiobooks, journals, articles, and multimedia content. Its intuitive interface allows library staff to effortlessly manage cataloging, metadata extraction, and content enrichment, ensuring accurate and comprehensive access to digital materials. For patrons, the software offers a personalized and immersive digital library experience. They can easily browse the digital catalog, search for specific items, and explore related content through intelligent recommendation algorithms. The software also facilitates seamless borrowing, lending, and preservation of digital items, enabling users to access their favorite resources anytime, anywhere, on multiple devices. With robust security features, the software ensures the protection of intellectual property rights and enforces access controls to safeguard sensitive content. Integration with external authentication systems and user management tools streamlines the library's administration processes, while advanced analytics provide valuable insights into patron behavior and content usage. Overall, this software for the digital library system empowers libraries to embrace the digital era, offering enhanced access, convenience, and discoverability of their vast collections. It paves the way for a more inclusive and engaging library experience, catering to the evolving needs of tech-savvy patrons.

Keywords: software development, empowering digital libraries, digital cataloging and access, management system

Procedia PDF Downloads 51
32719 Phosphorus Reduction in Plain and Fully Formulated Oils Using Fluorinated Additives

Authors: Gabi N. Nehme

Abstract:

The reduction of phosphorus and sulfur in engine oil is the main topic of this paper. Very reproducible boundary lubrication tests were conducted as part of a Design of Experiments (DOE) software study to examine the behavior of the fluorinated catalyst iron fluoride (FeF3) and polytetrafluoroethylene, or Teflon (PTFE), in developing environmentally friendly (reduced P and S) anti-wear additives for future engine oil formulations. Multi-component Chevron fully formulated oil (GF3) and Chevron plain oil were used with the addition of PTFE and the catalyst to characterize and analyze their performance. Lower-phosphorus blends were the goal of the model solution. Experiments indicated that the new sub-micron FeF3 catalyst played an important role in preventing breakdown of the tribofilm.

Keywords: wear, SEM, EDS, friction, lubricants

Procedia PDF Downloads 266
32718 An Experience of Translating an Excerpt from Sophie Adonon’s Echos de Femmes from French to English, Using Reverso

Authors: Michael Ngongeh Mombe

Abstract:

This paper seeks to investigate an assertion made by some colleagues that there is no need to pay a human translator to translate their literary texts, since software packages such as Reverso can be used to do the translation. The main objective of this study is to examine the veracity of this assertion by using Reverso to translate a literary text without any post-editing by a human translator. The work is based on two theories of translation: the Skopos and Communicative theories. It is a documentary research study in which data were collected from published documents in libraries and on the internet, and from the translation produced by Reverso. We performed a comparative analysis of the source and target texts in a bid to highlight the weaknesses and strengths of the software. The findings revealed that those who advocate the use of machine translation alone do so in ignorance of the translation mistakes usually made by the software. From the review of all 268 translation segments, we found that the translation produced by Reverso is fraught with errors. We therefore recommend that human translators either translate literary texts themselves or revise machine-produced translations to conform to the skopos of the work. This paper is based on Reverso; similar works in the near future will be based on other translation software packages to determine their weaknesses and strengths.

Keywords: machine translation, human translator, Reverso, literary text

Procedia PDF Downloads 72
32717 Basic Need Satisfaction and Students’ Willingness to Use Spreadsheet Software

Authors: Anne Sørebø

Abstract:

The present study was designed to test how the fulfilment of three basic psychological needs influences students' development of perceived usefulness (PU) and ease of use (EOU) in connection with the use of a spreadsheet. Both PU and EOU are assumed to be critical for the development of students' willingness to utilize spreadsheets in future work within business administration. A questionnaire was completed by 196 business students in Norway. We found that satisfying the needs for competence and autonomy is most critical for willingness to utilize the software package. The results also indicate that satisfying the need for relatedness, surprisingly, has no influence on students' willingness to utilize the software package. A key implication of the present research is that teachers should mainly focus on fulfilling students' needs for competence and self-determination when the purpose is to motivate them to utilize new software. That students should develop their own competence when using a new technology is somewhat obvious, but that the feeling of being self-determined needs to be a complementary element in this connection is not necessarily seen as obvious.

Keywords: spreadsheet, business students, technology acceptance, basic psychological needs

Procedia PDF Downloads 369
32716 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered while estimating and modeling ETₒ. This study therefore presents a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ was the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performances, two principal components explaining 82.7% of the variance were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature, and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables, such as rainfall and relative humidity, are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modeling at VIS, South Africa.
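
As an illustration of the dimensionality-reduction step described here, a minimal PCA sketch in Python; the synthetic monthly inputs stand in for the SAWS/ARC records, and the shapes and seed are assumptions:

```python
# Minimal sketch of the PCA step: standardize the five inputs, retain two
# components, and inspect loadings (synthetic data, not the VIS records).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: Tmin (deg C), Tmax (deg C), rainfall (mm), RH (%), wind speed (m/s)
X = rng.normal(size=(240, 5))  # e.g., 20 years x 12 months of averages

X_std = StandardScaler().fit_transform(X)  # PCA is scale-sensitive
pca = PCA(n_components=2)                  # two PCs retained in the study
scores = pca.fit_transform(X_std)

print("explained variance:", pca.explained_variance_ratio_.sum())
# Loadings show which variables dominate each retained component
print("loadings:\n", pca.components_)
```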

Keywords: irrigation, principal component analysis, reference evapotranspiration, Vaalharts

Procedia PDF Downloads 228
32715 Assessment of Social Vulnerability of Urban Population to Floods: A Case Study of Mumbai

Authors: Sherly M. A., Varsha Vijaykumar, Subhankar Karmakar, Terence Chan, Christian Rau

Abstract:

This study proposes an indicator-based framework for assessing the social vulnerability of any coastal megacity to floods. The final set of social vulnerability indicators is chosen from a set of feasible and available indicators, prepared within a Geographic Information System (GIS) framework on a fine scale, considering a 1-km grid cell, to provide insight into the spatial variability of vulnerability. The optimal weight for each individual indicator is assigned using data envelopment analysis (DEA), as it avoids subjective weights and improves confidence in the results obtained. In order to de-correlate and reduce the dimension of the multivariate data, principal component analysis (PCA) has been applied. The proposed methodology is demonstrated on twenty-four wards of Mumbai under the jurisdiction of the Municipal Corporation of Greater Mumbai (MCGM). This framework of vulnerability assessment is not limited to the present study area and may be applied to other urban damage centers.
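
The DEA weighting step can be sketched as one small linear program per ward; the sketch below uses a common "benefit of the doubt" formulation with synthetic indicator values, not the Mumbai data or the authors' exact model:

```python
# Hedged sketch of DEA-style "benefit of the doubt" weighting for a
# composite vulnerability index (illustrative data, not the MCGM wards).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
I = rng.uniform(0.1, 1.0, size=(24, 4))  # 24 wards x 4 normalized indicators

def dea_score(j, indicators):
    """Max composite score for ward j with the weights most favorable to it,
    subject to every ward scoring <= 1 under the same weights."""
    c = -indicators[j]                  # linprog minimizes, so negate
    A_ub = indicators                   # one constraint row per ward
    b_ub = np.ones(indicators.shape[0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * indicators.shape[1])
    return -res.fun

scores = [dea_score(j, I) for j in range(I.shape[0])]
print(np.round(scores, 3))  # wards scoring 1.0 form the best-practice frontier
```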

Keywords: urban floods, vulnerability, data envelopment analysis, principal component analysis

Procedia PDF Downloads 339
32714 Software Architecture Optimization Using Swarm Intelligence Techniques

Authors: Arslan Ellahi, Syed Amjad Hussain, Fawaz Saleem Bokhari

Abstract:

Optimization of software architecture can be done with respect to quality attributes (QAs). In this paper, we analyze multiple research papers from different dimensions that have been used to classify those attributes. As a contribution to solving this critical optimization problem of software architecture, we propose a swarm intelligence metaheuristic, the ant colony optimization (ACO) algorithm. We rank the quality attributes, run our algorithm on every QA, and then rank them on the basis of accuracy. At the end, the most accurate quality attributes are selected. The ant colony algorithm is effective and performs well in optimizing the QAs and ranking them.
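
A minimal pheromone-update sketch of the ant-colony idea follows; it is a simplified selection loop, not the authors' full algorithm, and the accuracy() oracle is a hypothetical stand-in for the per-QA optimization results:

```python
# Pheromone-biased ranking sketch: ants sample a QA, evaluate it, and
# reinforce pheromone; the final ranking orders QAs by accumulated pheromone.
import numpy as np

rng = np.random.default_rng(2)
qas = ["performance", "reliability", "security", "maintainability", "usability"]
true_acc = rng.uniform(0.5, 1.0, len(qas))   # assumed hidden accuracies

def accuracy(i):
    # Hypothetical noisy evaluation of running the optimizer on QA i
    return true_acc[i] + rng.normal(0, 0.05)

pheromone = np.ones(len(qas))
rho = 0.1                                    # evaporation rate

for _ in range(200):                         # one ant per iteration
    p = pheromone / pheromone.sum()
    i = rng.choice(len(qas), p=p)            # pheromone-biased selection
    pheromone *= (1 - rho)                   # evaporation
    pheromone[i] += accuracy(i)              # reinforcement by observed score

ranking = np.argsort(-pheromone)
print([qas[i] for i in ranking])
```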

Keywords: complexity, rapid evolution, swarm intelligence, dimensions

Procedia PDF Downloads 234
32713 Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral

Authors: Ademuyiwa Agbonyin, Stamatis Zoras, Mohammad Zandi

Abstract:

This paper investigates the extent to which building energy modelling can be informed by preliminary information provided by infrared thermography, using a thermal imaging camera in a walk-through audit. The case-study building is Sheffield Cathedral, built in the early 1400s. Based on an informative qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data, such as material thermal properties and building plans, are provided by the architects, Thomas Ford and Partners Ltd. The results of the modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered; however, they may not see implementation due to a desire to conserve the architectural heritage of the building. Results show that thermal imaging in a walk-through audit serves as a useful guide for the energy modelling process. Hand calculations were also performed to serve as a 'control' to estimate losses, providing a second set of data points for comparison.
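
The 'control' hand calculation is essentially the steady-state fabric loss Q = U × A × ΔT summed over envelope elements; here is a sketch with assumed U-values, areas, and design temperatures, not the cathedral's survey data:

```python
# Steady-state fabric heat loss per envelope element, Q = U * A * dT.
# All U-values, areas, and temperatures below are illustrative assumptions.
elements = {
    # name: (U-value in W/m^2K, area in m^2)
    "stone wall":     (2.1, 850.0),
    "single glazing": (5.7, 120.0),
    "roof":           (2.3, 640.0),
}
dT = 21.0 - 5.0  # indoor minus outdoor design temperature (K), assumed

total = 0.0
for name, (U, A) in elements.items():
    q = U * A * dT          # heat loss through this element (W)
    total += q
    print(f"{name:14s} {q / 1000:6.1f} kW")
print(f"{'total':14s} {total / 1000:6.1f} kW")
```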

Keywords: historic buildings, energy retrofit, thermal comfort, software modelling, energy modelling

Procedia PDF Downloads 145
32712 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision-making from a global perspective of the system under study has become essential. This article shows how a concept-based model makes it possible to organize and represent reality in an appropriate way, presenting accurate and timely information; these features make this kind of model an ideal component to support an information system, recognizing that such information is relevant for establishing the particularities that allow the evaluated sector to achieve better performance.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 470
32711 Some Trends in Analysis of Two-Way Solid Slabs

Authors: Reem I. Al-Ya' Goub, Nasim Shatarat

Abstract:

This paper presents the results of an analytical and comparative study of software programs' outputs in the analysis of two-way solid slabs: flat plate, flat slab with beams, and flat slab with drop panels problems that have already been analyzed using the Classical Equivalent Frame Method (CEFM) by several reinforced concrete book authors. The primary objective of this research is to determine the moment results using various software programs. A summary of the results and the percentage differences were then obtained to show how the analysis procedure affects the outputs of calculations, which vary from one software program to another, when comparing them with the results of the CEFM. Moment values were obtained using either the Equivalent Frame Method (EFM) or the Finite Element Method (FEM), as used in many software programs. The results of the analyses demonstrate that software programs vary markedly in terms of the information they provide to the structural designer regarding the values of model insertion, stiffness, the effective moment of inertia used, and especially the moment values.

Keywords: two-way solid slabs, flat plate, flat slab with beams, flat slab with drop panels, analysis, modeling, EFM, CEFM, FEM

Procedia PDF Downloads 395
32710 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi-Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic shaping functionality, agents shape the forwarded traffic by dynamic, real-time allocation of the token generation rate in a token bucket algorithm; with the buffer allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under burst and busier conditions. The agents act intelligently based on a Reinforcement Learning (RL) algorithm and consider effective parameters in their decision process. As RL is limited in how many parameters it can consider due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL algorithm and gives it the ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop in the whole network, specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to enable better comparison of the performance metrics. The results obtained from this simulation environment show efficient and dynamic utilization of resources in terms of the bandwidth and buffer capacities pre-allocated to each port.
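
The token bucket at the core of the shaping functionality can be sketched as follows; here the rate is a static placeholder for the value an RL agent would set dynamically:

```python
# Minimal token-bucket shaper: tokens accrue at `rate` up to `capacity`;
# a packet is forwarded only if enough tokens remain (else shaped).
import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate          # tokens per second (the agent's knob)
        self.capacity = capacity  # burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_cost=1.0):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_cost:
            self.tokens -= packet_cost
            return True   # forward the packet
        return False      # shape (queue or drop)

bucket = TokenBucket(rate=1000.0, capacity=1500.0)
print(bucket.allow())  # True while tokens remain
```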

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 488
32709 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend some linear feature extraction methods, such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE), to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect the nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest-neighbor technique. For each sample, there are two corresponding nearest proportions of samples: the self-class nearest proportion and the other-class nearest proportion. The term 'nearest proportion' used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small data-size situations. Hence, an improved estimator based on shrinkage estimation (regularization) is proposed; based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
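
KDNP itself is not available in standard libraries, but the kernel-extension idea it builds on can be illustrated by contrasting kernel PCA with linear PCA on a toy nonlinear data set:

```python
# Kernel vs. linear feature extraction on data that is not linearly
# separable; KPCA with an RBF kernel finds the nonlinear direction.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.datasets import make_circles

X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

linear = PCA(n_components=2).fit_transform(X)
kernel = KernelPCA(n_components=2, kernel="rbf", gamma=10).fit_transform(X)

# Class separation along the first component: typically poor for PCA,
# clear for KPCA on this data set
for name, Z in [("PCA", linear), ("KPCA", kernel)]:
    gap = abs(Z[y == 0, 0].mean() - Z[y == 1, 0].mean())
    print(f"{name}: class-mean gap on component 1 = {gap:.3f}")
```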

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction

Procedia PDF Downloads 321
32708 Evaluation of Internal Friction Angle in Overconsolidated Granular Soil Deposits Using P- and S-Wave Seismic Velocities

Authors: Ehsan Pegah, Huabei Liu

Abstract:

Determination of the internal friction angle (φ) in natural soil deposits is an important issue in geotechnical engineering. The main objective of this study was to examine the evaluation of this parameter in overconsolidated granular soil deposits by using the P-wave velocity and the anisotropic components of the S-wave velocity (i.e., both the vertical component (SV) and the horizontal component (SH) of the S-wave). To this end, seventeen pairs of P-wave and S-wave seismic refraction profiles were carried out at three different granular sites in Iran using non-invasive seismic wave methods. The acquired shot gathers were processed, from which the P-wave, SV-wave, and SH-wave velocities were derived. The reference values of φ and the overconsolidation ratio (OCR) in the soil deposits were measured through laboratory tests. By assuming cross-anisotropy of the soils, the P-wave and S-wave velocities were utilized to develop an equation for calculating the coefficient of lateral earth pressure at rest (K₀), based on the theory of elasticity for a cross-anisotropic medium. In addition, to develop an equation for OCR estimation in granular geomaterials in terms of SH/SV velocity ratios, a general regression analysis was performed on the resulting information from this research together with the respective data published in the literature. The calculated K₀ values, coupled with the estimated OCR values, were finally employed in the Mayne and Kulhawy formula to evaluate φ in granular soil deposits. The results showed that reliable values of φ could be estimated based on the seismic wave velocities. The findings of this study may serve as an appropriate approach for the economical and non-invasive determination of in-situ φ in granular soil deposits using surface seismic surveys.
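
The final step, evaluating φ from K₀ and OCR, uses the Mayne and Kulhawy relation K₀ = (1 − sin φ)·OCR^(sin φ); a sketch that inverts it numerically, where the K₀ and OCR inputs are illustrative rather than the field values:

```python
# Invert K0 = (1 - sin(phi)) * OCR**sin(phi) for phi by root-finding.
import numpy as np
from scipy.optimize import brentq

def mayne_kulhawy(phi_deg, OCR):
    s = np.sin(np.radians(phi_deg))
    return (1.0 - s) * OCR ** s

def friction_angle(K0, OCR):
    # Root of K0(phi) - K0_measured on a physically sensible bracket;
    # K0(phi) decreases over this range, so the root is unique.
    return brentq(lambda p: mayne_kulhawy(p, OCR) - K0, 1.0, 60.0)

# Illustrative inputs: K0 from the velocity analysis, OCR from the regression
print(f"phi = {friction_angle(K0=0.55, OCR=2.5):.1f} deg")
```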

Keywords: angle of internal friction, overconsolidation ratio, granular soils, P-wave velocity, SV-wave velocity, SH-wave velocity

Procedia PDF Downloads 134
32707 Working Improvement of Modern Finance in Millennium World

Authors: Saeed Mohammadirad

Abstract:

Financing activities involve long-term liabilities, stockholders' equity (or owner's equity), and changes to short-term borrowings. Finance is very important for all business activities, and to manage it we have to follow accounting languages based on the nature of the business. If everything is combined in one software package, it is easy to handle, monitor, control, plan, organize, direct, and budget the finances. Let us therefore take up the challenge of building computer software covering the whole finance package for every business-related activity. This article discusses the finance functions at the various levels of business activity and how they should be maintained properly to avoid unethical events.

Keywords: financing activities, business activities, computer software, unethical events

Procedia PDF Downloads 333
32706 Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems

Authors: Baris Can Yalcin

Abstract:

Motion sensors have been commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. Designing software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU 6050 motion sensor (gyroscope and accelerometer in 3 axes) and an Arduino Mega2560 microcontroller. The design parameters are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
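
A hedged sketch of the PC-side logging for such a setup, assuming the Arduino streams comma-separated accelerometer/gyroscope readings over USB serial; the port name, baud rate, and line framing are assumptions, not details from the paper:

```python
# Read 'ax,ay,az,gx,gy,gz' lines streamed by the microcontroller (assumed
# framing) and print parsed samples; requires the pyserial package.
import serial  # pyserial

PORT, BAUD = "/dev/ttyACM0", 115200  # assumed port and baud rate

with serial.Serial(PORT, BAUD, timeout=1.0) as link:
    for _ in range(100):                       # log 100 samples
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            ax, ay, az, gx, gy, gz = map(float, line.split(","))
        except ValueError:
            continue                           # skip malformed frames
        print(f"accel=({ax:.2f},{ay:.2f},{az:.2f}) "
              f"gyro=({gx:.2f},{gy:.2f},{gz:.2f})")
```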

Keywords: design, mechatronics, motion sensor, data acquisition

Procedia PDF Downloads 562
32705 Principal Component Regression of Amylose Content in Malaysian Market Rice Grains Using Near Infrared Reflectance Spectroscopy

Authors: Syahira Ibrahim, Herlina Abdul Rahim

Abstract:

Amylose content is an essential element in determining the texture and taste of rice grains. This paper evaluates the use of VIS-SWNIRS in estimating the amylose content of seven varieties of rice grains available on the Malaysian market. Each variety consists of 30 samples, and all samples were scanned using the spectroscope over the 680-1000 nm range. A Savitzky-Golay (SG) smoothing filter is applied to each sample's data before the Principal Component Regression (PCR) technique is used to examine the data and produce a single value for each sample. This value is then compared with reference values obtained from the standard iodine colorimetric test in terms of the coefficient of determination, R². Results show that this technique produced low R² values of less than 0.50. In order to improve the result, the scanned range should be extended to 1100-2500 nm, and the number of samples processed should be increased.
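
The SG-plus-PCR chain can be sketched as follows with synthetic spectra; the array shapes, filter window, and component count are assumptions:

```python
# Savitzky-Golay smoothing followed by principal component regression:
# compress spectra to PC scores, then regress the reference values on them.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n, n_wavelengths = 210, 160        # 7 varieties x 30 samples; 680-1000 nm grid
X = rng.normal(size=(n, n_wavelengths))          # synthetic spectra
y = 0.8 * X[:, 40] + rng.normal(0, 0.3, n)       # stand-in for iodine values

X_smooth = savgol_filter(X, window_length=11, polyorder=2, axis=1)

pca = PCA(n_components=5)
scores = pca.fit_transform(X_smooth)             # PCR step 1: compress
model = LinearRegression().fit(scores, y)        # PCR step 2: regress

print("R^2 =", r2_score(y, model.predict(scores)))
```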

Keywords: amylose content, diffuse reflectance, Malaysian rice grain, principal component regression (PCR), visible and shortwave near-infrared spectroscopy (VIS-SWNIRS)

Procedia PDF Downloads 361
32704 Aberrant Consumer Behavior in Sellers’ and Consumers’ Eyes: A Newly Developed Classification

Authors: Amal Abdelhadi

Abstract:

Consumer misbehavior evaluations can differ markedly based on a number of variables and from one environment to another. Using three aberrant consumer behavior (ACB) scenarios (shoplifting, stealing from hotel rooms, and software piracy), this study aimed to explore Libyan sellers' and consumers' evaluations of ACB. Data were collected using a multi-method approach (qualitative and quantitative) in two fieldwork phases. In the first phase, qualitative data were collected from 26 Libyan sellers through face-to-face interviews. In the second phase, a consumer survey was used to collect quantitative data from 679 Libyan consumers. This study found that consumers' and sellers' evaluations of ACB are not always consistent. Further, ACB evaluations differed based on the form of ACB. Furthermore, the study found that not all consumer behaviors considered bad behavior in other countries receive the same evaluation in Libya; software piracy, for example. Therefore, this study suggests a newly developed classification of ACB based on marketers' and consumers' views. This classification provides nine ACB types within two dimensions (marketers' and consumers' views) and three degrees of behavior evaluation (good, acceptable, and misbehavior).

Keywords: aberrant consumer behavior, Libya, multi-method approach, planned behavior theory

Procedia PDF Downloads 550
32703 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant

Authors: Michael Smalenberger

Abstract:

Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate parameters necessary for the implementation of the artificial intelligence component, and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage while transaction level data are scant. Once a user’s transaction level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to lead to mastery of a knowledge component, the associated implications on learning curves, and its relevance to instruction time.
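
The standard BKT update that these parameters feed can be written in a few lines; the guess/slip/learn values below are illustrative priors, with the initial mastery seeded, for example, from a student's self-assessment while transaction data are scant:

```python
# Standard Bayesian Knowledge Tracing update (L = P(mastery), g = guess,
# s = slip, t = learn rate); parameter values are illustrative only.
def bkt_update(L, correct, g=0.2, s=0.1, t=0.15):
    # Bayes step: posterior mastery given the observed response
    if correct:
        post = L * (1 - s) / (L * (1 - s) + (1 - L) * g)
    else:
        post = L * s / (L * s + (1 - L) * (1 - g))
    # Learning step: chance of transitioning to mastery on this opportunity
    return post + (1 - post) * t

L = 0.3  # e.g., seeded from a self-assessment before transaction data exist
for obs in [True, False, True, True]:
    L = bkt_update(L, obs)
    print(f"P(mastery) = {L:.3f}")
```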

Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation

Procedia PDF Downloads 151
32702 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services

Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen

Abstract:

Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software, encompassing a range of business and technical opportunities, issues, and challenges. Trust in cloud services regarding the security and privacy of the delivered data is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source-code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it runs on its platform. The proposed design prevents insiders from tampering with the application service before and after it is launched on a cloud provider platform.
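
A minimal sketch of the signing idea, using independent Ed25519 signatures over the source-code digest; this is a naive multi-party check rather than a true aggregated multisignature scheme, and the parties shown are illustrative:

```python
# Both the certifying authority and the cloud provider sign the code digest;
# verification requires every signature to hold.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

code_digest = hashlib.sha256(b"def service(): ...").digest()

authority = Ed25519PrivateKey.generate()   # neutral code-certifying authority
provider = Ed25519PrivateKey.generate()    # cloud provider as root of trust
signatures = [k.sign(code_digest) for k in (authority, provider)]

def verify_all(digest, sigs, pubkeys):
    try:
        for sig, pub in zip(sigs, pubkeys):
            pub.verify(sig, digest)        # raises InvalidSignature on tamper
        return True
    except InvalidSignature:
        return False

pubs = [authority.public_key(), provider.public_key()]
print(verify_all(code_digest, signatures, pubs))                            # True
print(verify_all(hashlib.sha256(b"tampered").digest(), signatures, pubs))   # False
```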

Keywords: cloud computing, SaaS Platform, TPM, trustiness, code source certification, multi-signature schemes

Procedia PDF Downloads 253
32701 Determining Components of Deflection of the Vertical in Owerri West Local Government, Imo State, Nigeria Using Least Square Method

Authors: Chukwu Fidelis Ndubuisi, Madufor Michael Ozims, Asogwa Vivian Ndidiamaka, Egenamba Juliet Ngozi, Okonkwo Stephen C., Kamah Chukwudi David

Abstract:

Deflection of the vertical is a quantity used in reducing geodetic measurements related to geoidal networks to the ellipsoidal plane, and it is essential in geoid modeling processes. Computing the deflection-of-the-vertical components of a point in a given area is necessary for evaluating the standard errors along the north-south and east-west directions. Using a combined approach for the determination of the deflection-of-the-vertical components provides improved results, but it is labor intensive without an appropriate method. The least squares method makes use of redundant observations in modeling a given set of problems that obey certain geometric conditions. This research work aims at computing the deflection-of-the-vertical components for the Owerri West local government area of Imo State using the geometric method as the field technique. In this method, a combination of Global Positioning System observations in static mode and precise leveling was utilized: the geodetic coordinates of points established within the study area were determined by GPS observation, and the orthometric heights through precise leveling. Using least squares in a MATLAB program, the estimated deflection-of-the-vertical parameters for the common station were -0.0286 and -0.0001 arc seconds for the north-south and east-west components, respectively. The associated standard errors of the processed vectors of the network were computed. The computed standard errors of the north-south and east-west components were 5.5911e-005 and 1.4965e-004 arc seconds, respectively. Therefore, including the derived deflection-of-the-vertical components in the ellipsoidal model will yield higher observational accuracy, since an ellipsoidal model alone is not tenable for high-quality work owing to its large observational error. It is therefore important to include the determined deflection-of-the-vertical components for Owerri West Local Government in Imo State, Nigeria.
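
A sketch of the least squares adjustment: geoid height differences over each baseline constrain the deflection components (ξ, η) through ΔN ≈ -(ξ cos α + η sin α)·d. The observation model and numbers below are synthetic, with only the target (ξ, η) values taken from this abstract:

```python
# Least squares estimation of deflection components from GPS/levelling
# geoid-height differences; synthetic baselines, not the Owerri West data.
import numpy as np

rng = np.random.default_rng(4)
xi_true, eta_true = -0.0286, -0.0001          # arc seconds (from the abstract)
arc2rad = np.pi / (180 * 3600)

az = rng.uniform(0, 2 * np.pi, 12)            # baseline azimuths (rad)
d = rng.uniform(500, 3000, 12)                # baseline lengths (m)
A = -np.column_stack([np.cos(az) * d, np.sin(az) * d])   # design matrix
l = A @ (np.array([xi_true, eta_true]) * arc2rad) \
    + rng.normal(0, 1e-5, 12)                 # observed dN with noise (m)

x, res, *_ = np.linalg.lstsq(A, l, rcond=None)
sigma2 = res[0] / (len(l) - 2)                # a posteriori variance factor
cov = sigma2 * np.linalg.inv(A.T @ A)         # covariance from normal equations
print("xi, eta (arcsec):", x / arc2rad)
print("std errors (arcsec):", np.sqrt(np.diag(cov)) / arc2rad)
```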

Keywords: deflection of vertical, ellipsoidal height, least square, orthometric height

Procedia PDF Downloads 184
32700 Cluster-Based Exploration of System Readiness Levels: Mathematical Properties of Interfaces

Authors: Justin Fu, Thomas Mazzuchi, Shahram Sarkani

Abstract:

A key factor in technological immaturity in defense weapons acquisition is a lack of understanding of critical integrations at the subsystem and component level. To address this shortfall, recent research combines integration readiness level (IRL) with technology readiness level (TRL) to form a system readiness level (SRL). SRL can be enriched with more robust quantitative methods to provide the program manager with a useful tool prior to committing to major weapons acquisition programs. This research harnesses previous mathematical models based on graph theory, Petri nets, and tropical algebra and proposes a modification of the desirable SRL mathematical properties such that a tightly integrated (multitude of interfaces) subsystem can display a lower SRL than an inherently less coupled subsystem. The synthesis of these methods informs an improved decision tool for the program manager considering expensive technology development. This research also ties the separately developed manufacturing readiness level (MRL) into the network representation of the system and addresses shortfalls in previous frameworks, including the lack of integration weighting and the over-importance of a single extremely immature component. Tropical algebra (based on the minimum of a set of TRLs or IRLs) allows one low IRL or TRL value to diminish the SRL of the entire system, which may not reflect actuality if that component is not critical or tightly coupled. Integration connections can therefore be weighted according to importance, and readiness levels are modified to a cardinal scale (based on an analytic hierarchy process). The importance (weight) of an integration arc depends on the connected nodes and on the additional integration arcs connected to those nodes. Lack of integration is not represented by zero but by a perfect integration maturity value; naturally, the weight of such an arc would be zero. To further explore the impact of grouping subsystems, a multi-objective genetic algorithm is then used to find clusters or communities that can be optimized for the most representative subsystem SRL. This novel calculation is then benchmarked through simulation and against past defense acquisition program data, focusing on the newly introduced Middle Tier of Acquisition (rapidly fielded prototypes). The model remains a relatively simple, accessible tool, but at higher fidelity and validated with past data, for the program manager to decide major defense acquisition program milestones.
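
The contrast between a pure tropical (minimum-based) SRL and an importance-weighted alternative can be sketched as follows; the weighting rule here is one hypothetical choice for illustration, not the model developed in this research:

```python
# Tropical SRL (weakest element dominates) vs. a hypothetical weighted SRL
# in which a loosely coupled immature integration penalizes the system less.
import numpy as np

trl = {"sensor": 7, "bus": 8, "payload": 3}        # component TRLs (1-9 scale)
# (a, b): (IRL of the integration, importance weight) -- illustrative values
irl = {("sensor", "bus"): (6, 1.0), ("bus", "payload"): (5, 0.2)}

def srl_min(trl, irl):
    # Pure tropical form: the system is only as ready as its weakest element
    return min(list(trl.values()) + [lvl for lvl, _ in irl.values()])

def srl_weighted(trl, irl):
    # Hypothetical modification: weight each integration's shortfall from 9
    # by its importance, so weakly coupled arcs drag the average down less
    base = np.mean(list(trl.values()))
    penalty = sum(w * (9 - lvl) for lvl, w in irl.values()) \
        / sum(w for _, w in irl.values())
    return base - penalty / len(trl)

print("tropical SRL :", srl_min(trl, irl))
print("weighted SRL :", round(srl_weighted(trl, irl), 2))
```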

Keywords: readiness, maturity, system, integration

Procedia PDF Downloads 66
32699 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro

Authors: Rafael Zhindon Almeida

Abstract:

Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology employed in this study is of a basic type that involves a thorough search for theoretical foundations to improve the understanding of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron and dissolved oxygen exceed the allowable limits. The water of the El Macho estuary is determined to be below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study successfully developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationship between the various water quality parameters. The findings of the study emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.
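
The reported model form, a multiple linear regression of the water quality response on chemical oxygen demand and total dissolved solids, can be sketched with synthetic stand-in data:

```python
# MLR of a water-quality response on COD and TDS; values are synthetic
# stand-ins for the Estero El Macho measurements, not the study's data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
cod = rng.uniform(20, 300, 40)          # chemical oxygen demand (mg/L)
tds = rng.uniform(100, 1500, 40)        # total dissolved solids (mg/L)
wq = 0.15 * cod + 0.02 * tds + rng.normal(0, 1.0, 40)  # assumed response

X = np.column_stack([cod, tds])
model = LinearRegression().fit(X, wq)
print("R^2 =", model.score(X, wq))      # the abstract reports ~0.999
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```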

Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models

Procedia PDF Downloads 65
32698 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automate mutation introduction into the HIV-1 Integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in energy of the protein fold. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open-source set of scripts designed to introduce and analyse the effects of mutations on the static protein structure as well as the results of the multi-conformational states from molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 54
32697 Amplitude and Latency of P300 Component from Auditory Stimulus in Different Types of Personality: An Event Related Potential Study

Authors: Nasir Yusoff, Ahmad Adamu Adamu, Tahamina Begum, Faruque Reza

Abstract:

The P300 component of the event-related potential (ERP) reflects psycho-physiological phenomena in the human body. The present study aims to identify differences in the amplitude and latency of the P300 component elicited by auditory stimuli between the ambiversion and extraversion personality types. Ambiverts (N=20) and extraverts (N=20) underwent ERP recording at the Hospital Universiti Sains Malaysia (HUSM) laboratory. Electroencephalogram data were recorded with an oddball paradigm, counting auditory standard and target tones, from nine electrode sites (Fz, Cz, Pz, T3, T4, T5, T6, P3 and P4) using the 128-channel HydroCel Geodesic Sensor Net. Differences in the P300 latency of the target tones were insignificant at all electrodes. Similarly, differences in the P300 latency of the standard tones were insignificant, except at the Fz and T3 electrodes. Likewise, differences in the P300 amplitude of the target and standard tones were insignificant at all electrode sites. Extraverts and ambiverts thus show similar characteristics in the cognitive processing of auditory tasks.
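
Amplitude and latency of the P300 are typically read as the largest positive peak of the averaged ERP in a post-stimulus window; a sketch with a synthetic average, where the window bounds and sampling rate are assumptions:

```python
# Peak-picking for P300 amplitude/latency on a synthetic averaged ERP.
import numpy as np

fs = 250                                  # sampling rate (Hz), assumed
t = np.arange(-0.2, 0.8, 1 / fs)          # epoch from -200 to 800 ms
rng = np.random.default_rng(6)
erp = 8 * np.exp(-((t - 0.35) ** 2) / 0.004) + rng.normal(0, 0.5, t.size)  # uV

idx = np.where((t >= 0.25) & (t <= 0.50))[0]   # typical P300 search window
i = idx[np.argmax(erp[idx])]                   # largest positive deflection
print(f"P300 amplitude = {erp[i]:.1f} uV at latency {t[i] * 1000:.0f} ms")
```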

Keywords: amplitude, event-related potential, P300 component, latency

Procedia PDF Downloads 347
32696 Frontal Oscillatory Activity and Phase–Amplitude Coupling during Chan Meditation

Authors: Arthur C. Tsai, Chii-Shyang Kuo, Vincent S. C. Chien, Michelle Liou, Philip E. Cheng

Abstract:

Meditation enhances mental abilities and is an antidote to anxiety. However, very little is known about the brain mechanisms and cortico-subcortical interactions underlying meditation-induced anxiety relief. In this study, the changes in phase-amplitude coupling (PAC), in which the amplitude of the beta frequency band is modulated in phase with the delta rhythm, were investigated after eight weeks of meditation training. The study hypothesized that, through concentrated but relaxed mental training, delta-beta coupling in the frontal regions is attenuated. The delta-beta coupling analysis was applied within and between maximally independent component sources returned by the extended infomax independent component analysis (ICA) algorithm on the continuous EEG data recorded during meditation. A unique meditative concentration task of relaxing body and mind was used with a constant level of moderate mental effort, so as to approach an 'emptiness' meditative state. A pre-test/post-test control group design was used. To evaluate the cross-frequency phase-amplitude coupling of the component sources, the modulation index (MI) was estimated together with circular phase statistics. Our findings reveal a significant delta-beta decoupling in a set of frontal regions bilaterally. In addition, the beta frequency band of the prefrontal component was amplitude-modulated in phase with the delta rhythm of the medial frontal component.
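
A sketch of a Tort-style modulation index for delta-beta coupling on a synthetic signal; the filter bands, bin count, and signal itself are assumptions, not the study's EEG:

```python
# Delta-beta PAC: bin the delta phase, average the beta envelope per bin,
# and measure the deviation of that distribution from uniform (Tort-style MI).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 256
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(7)
delta = np.sin(2 * np.pi * 2 * t)                       # 2 Hz carrier
beta = (1 + 0.6 * delta) * np.sin(2 * np.pi * 20 * t)   # 20 Hz, phase-coupled
x = delta + beta + rng.normal(0, 0.3, t.size)

def bandpass(sig, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, sig)

phase = np.angle(hilbert(bandpass(x, 1, 4)))            # delta phase
amp = np.abs(hilbert(bandpass(x, 15, 25)))              # beta envelope

nbins = 18
bins = np.digitize(phase, np.linspace(-np.pi, np.pi, nbins + 1)) - 1
m = np.array([amp[bins == k].mean() for k in range(nbins)])
p = m / m.sum()
mi = (np.log(nbins) + np.sum(p * np.log(p))) / np.log(nbins)  # KL to uniform
print(f"modulation index = {mi:.4f}")
```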

Keywords: phase-amplitude coupling, ICA, meditation, EEG

Procedia PDF Downloads 403
32695 Analyzing the Impact of Code Commenting on Software Quality

Authors: Thulya Premathilake, Tharushi Perera, Hansi Thathsarani, Tharushi Nethmini, Dilshan De Silva, Piyumika Samarasekara

Abstract:

One of the most efficient ways to assist developers in grasping source code is to make use of the comments found throughout the code. When working in fields such as software development, having good-quality comments in your code is a fundamental requirement, particularly when tackling software problems using programs that have already been built. It is essential for the intention of the source code to be made crystal clear in the comments added to the code. This assists programmers in better comprehending the programs they are working on and enables them to complete software maintenance jobs in a more timely manner. In spite of the fact that comments and documentation are meant to improve readability and maintainability, the vast majority of programmers place most of their focus on the actual code being written. This study provides a complete and comprehensive overview of the previous research that has been conducted on the topic of code comments. The study focuses on four main topics: automated comment production, comment consistency, comment classification, and comment quality rating. By analyzing the approaches used for each of these topics, one can gain a more complete understanding for use in subsequent inquiries.

Keywords: code commenting, source code, software quality, quality assurance

Procedia PDF Downloads 64
32694 An Agile, Intelligent and Scalable Framework for Global Software Development

Authors: Raja Asad Zaheer, Aisha Tanveer, Hafza Mehreen Fatima

Abstract:

Global Software Development (GSD) is becoming a common norm in the software industry, despite the fact that global distribution of teams presents special issues for their effective communication and coordination. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques and tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been successfully used with distributed teams. We employed an exploratory research method to analyze recent studies related to the challenges of GSD and their proposed solutions. In our study, we looked in depth at six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We established that none of these challenges can be neglected for distributed teams of any kind; they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper, we focus on creating a scalable framework for detecting and overcoming these commonly faced challenges. In the proposed solution, our objective is to suggest agile techniques and tools relevant to a particular problem faced by organizations in the management of distributed teams. We focus mainly on Scrum and XP techniques and tools because they are widely accepted and used in the industry. Our solution identifies the problem and suggests an appropriate technique or tool to help solve it, based on a globally shared knowledge base. A cause-and-effect relationship can be established using a fishbone diagram based on the inputs provided for commonly faced issues; based on the identified cause, the framework suggests a suitable tool. Hence, the proposed scalable, extensible, self-learning, intelligent framework will help implement and assess GSD to achieve the maximum out of it. The globally shared knowledge base will help new organizations easily adapt the best practices set forth by practicing organizations.

Keywords: agile project management, agile tools/techniques, distributed teams, global software development

Procedia PDF Downloads 276
32693 Pantograph-Catenary Contact Force: Features Evaluation for Catenary Diagnostics

Authors: Mehdi Brahimi, Kamal Medjaher, Noureddine Zerhouni, Mohammed Leouatni

Abstract:

Prognostics and Health Management (PHM) is a systems engineering discipline that provides solutions and models for the implementation of predictive maintenance. The approach is based on extracting useful information from monitoring data to assess the "health" state of industrial equipment or an asset. In this paper, we examine multiple features extracted from the pantograph-catenary contact force in order to select the most relevant ones for a diagnostics function. The feature extraction methodology is based on simulation data generated by a pantograph-catenary simulation software package called INPAC, as well as on measurement data. The feature extraction method is based on both statistical and signal-processing analyses, and the feature selection method is based on statistical criteria.
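
The statistical feature extraction step can be sketched as follows on a synthetic contact force record standing in for INPAC or measurement data:

```python
# Common statistical features of a contact-force signal; the synthetic
# record below is an assumption, not INPAC output or measured data.
import numpy as np
from scipy.stats import kurtosis, skew

fs = 200
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(8)
force = 120 + 30 * np.sin(2 * np.pi * 1.5 * t) + rng.normal(0, 8, t.size)  # N

features = {
    "mean": force.mean(),
    "std": force.std(),
    "rms": np.sqrt(np.mean(force ** 2)),
    "peak-to-peak": force.max() - force.min(),
    "skewness": skew(force),
    "kurtosis": kurtosis(force),
    "pct_below_0": np.mean(force < 0) * 100,   # loss-of-contact indicator
}
for name, value in features.items():
    print(f"{name:14s} {value:10.3f}")
```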

Keywords: catenary/pantograph interaction, diagnostics, Prognostics and Health Management (PHM), quality of current collection

Procedia PDF Downloads 273
32692 Evaluation of Joint Contact Forces and Muscle Forces in the Subjects with Non-Specific Low Back Pain

Authors: Mohammad Taghi Karimi, Maryam Hasan Zahraee

Abstract:

Background: Low back pain (LBP) is a common health and socioeconomic problem, especially in its chronic form. Joint contact force is an important parameter during walking, and elevated values increase the incidence of injury and degenerative joint disease. To the best of our knowledge, there is not enough evidence in the literature on the muscular forces and joint contact forces in subjects with low back pain. Purpose: The main hypothesis of this research was that the joint contact force at L4/L5 in non-specific chronic low back pain subjects is the same as that of normal subjects. Therefore, the aim of this study was to determine the difference in joint contact force between non-specific chronic low back pain and normal subjects. Method: This was an experimental-comparative study. 20 normal subjects and 20 non-specific chronic low back pain patients were recruited. A Qualisys motion analysis system and a Kistler force plate were used to collect the motions and the force applied on the leg, respectively. OpenSim software was used to determine joint contact forces and muscle forces. Parameters such as the force applied on the legs (pelvis), the kinematics of the hip and pelvis, muscle force peaks, the force of the trunk musculature, and the joint contact force at L5/S1 were used for further analysis. Differences between mean values were assessed using a two-sample t-test. Results: The forces produced by the semitendinosus, biceps femoris, and adductor muscles were significantly different between low back pain and normal subjects. Moreover, the mean value of the braking component of the knee joint force increased significantly in low back pain subjects, alongside a significant decrease in the mean value of the vertical component of the joint reaction force compared to normal subjects. Conclusions: The forces produced by the trunk and pelvic muscles, and the joint contact forces, differ significantly between low back pain and normal subjects. It seems that those with non-specific chronic low back pain use their trunk muscles more than normal subjects to stabilize the pelvis during walking.

Keywords: low back pain, joint contact force, kinetic, muscle force

Procedia PDF Downloads 221
32691 Architecture-Performance Relationship in GPU Computing: Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential for developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core, whose characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 347