Search results for: quantitative approach.

5169 Service Identification Approach to SOA Development

Authors: Nafise Fareghzadeh

Abstract:

Service identification is one of the main activities in modeling a service-oriented solution, and errors made during identification can therefore flow down through detailed design and implementation activities, which may necessitate multiple iterations, especially when building composite applications. Different strategies exist for identifying candidate services, each with its own benefits and trade-offs. The approach presented in this paper proposes selective identification of services, based on in-depth business process analysis coupled with use case and existing asset analysis and goal service modeling. The article emphasizes the key activities needed for analysis and service identification to build an optimized service-oriented architecture. In contrast to other approaches, it also notes best practices and steps, wherever appropriate, to point out the vagueness involved in service identification.

Keywords: SOA, service identification, service taxonomy, service layer.

5168 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach

Authors: Aladdin Al-Tarawneh

Abstract:

The Quran, the sacred book of Islam and considered the literal word of God (Allah) in Arabic, has been translated into many languages; however, the foreignising, or literal, approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study proposes a different approach that combines several methods within a hybrid model. Indeed, the study challenges the binary adherence that prevails in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper first presents a conceptual part that elucidates the main methods employed in TS and criticises and modifies them to propose the new hybrid approach (the hybrid model) for translating the Quran; that is, the deductive method. To support and validate the outcome of this part, a comparative model is then employed to highlight the differences between the suggested translation and other widely used ones; that is, the inductive method. By applying this methodology, the paper shows the deficiency in communicating the original meaning of the Quran under the foreignising approach. In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers.

Keywords: Quran translation, hybrid approach, domestication, foreignisation, hybrid model.

5167 WLAN Positioning Based on Joint TOA and RSS Characteristics

Authors: Peerapong Uthansakul, Monthippa Uthansakul

Abstract:

WLAN positioning has been addressed by many approaches in the literature using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA) and cell ID. Among these, the RSS approach is the simplest to implement because it requires no modification of either access points or client devices, but its accuracy is poor because of physical environments. For the TOA or TDOA approach, the accuracy is quite acceptable, but most studies have to modify either software or hardware of the existing WLAN infrastructure, with the scale of modification ranging from the access card alone up to changes in the WLAN protocol. Hence, using TOA or TDOA alone is an unattractive approach for a positioning system. In this paper, a new concept of merging the RSS and TOA positioning techniques is proposed. In addition, a method to obtain the TOA characteristic for positioning a WLAN user without any extra modification to the existing system is presented. The measurement results confirm that the proposed technique using both RSS and TOA characteristics provides better accuracy than using either RSS or TOA alone.
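
As an illustration only (not the authors' implementation), the fusion idea can be sketched as follows: RSS readings are converted to ranges with an assumed log-distance path-loss model, TOA readings are converted via the speed of light, and a weighted least-squares solver combines both range sets. The constants, anchor positions and weights below are made-up assumptions.

```python
# Illustrative sketch (not the authors' implementation): joint RSS/TOA positioning.
# Path-loss constants and anchor layout are assumed values for demonstration only.
import numpy as np
from scipy.optimize import least_squares

def rss_to_distance(rss_dbm, p0_dbm=-40.0, n=3.0):
    """Log-distance path-loss model: distance in metres from an RSS reading."""
    return 10 ** ((p0_dbm - rss_dbm) / (10 * n))

def toa_to_distance(toa_s):
    """TOA converted to distance via the speed of light."""
    return 3e8 * toa_s

def locate(anchors, distances, weights):
    """Weighted nonlinear least-squares position estimate from range measurements."""
    def residuals(p):
        return weights * (np.linalg.norm(anchors - p, axis=1) - distances)
    return least_squares(residuals, x0=anchors.mean(axis=0)).x

anchors = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0]])  # AP positions
d_rss = rss_to_distance(np.array([-70.0, -76.0, -74.0, -78.0]))
d_toa = toa_to_distance(np.array([33e-9, 54e-9, 45e-9, 61e-9]))

# Stack both measurement sets; TOA ranges get a higher weight than the noisier RSS ranges.
all_anchors = np.vstack([anchors, anchors])
all_dists = np.concatenate([d_rss, d_toa])
all_weights = np.concatenate([np.full(4, 0.3), np.full(4, 1.0)])
print(locate(all_anchors, all_dists, all_weights))
```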

Keywords: Received signal strength, Time of arrival, Positioning system, WLAN, Measurement.

5166 Role of Director's Philosophical Approach in Cinematographic Expression

Authors: Sedat Cereci

Abstract:

The original idea for a feature film may come from a writer, a director or a producer. The director is the person responsible for the creative aspects, both interpretive and technical, of a motion picture production. The director may be filmed discussing the project with his or her co-writers, members of the production staff and the producer, and may be shown selecting locales or constructing sets. All these activities provide, of course, ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original opinion through his or her philosophical approach. The director tries to find an artistic angle in every scene, turns the screenplay into an effective story and sets the film on a spiritual and philosophical base.

Keywords: Director, role, film, approach, opinion.

5165 Density Estimation using Generalized Linear Model and a Linear Combination of Gaussians

Authors: Aly Farag, Ayman El-Baz, Refaat Mohamed

Abstract:

In this paper we present a novel approach for density estimation. The proposed approach is based on using the logistic regression model to obtain an initial density estimate for the given empirical density. The empirical data do not exactly follow the logistic regression model, so there will be a deviation between the empirical density and the density estimated using the logistic regression model. This deviation may be positive and/or negative. We model this deviation with a linear combination of Gaussians (LCG) with positive and negative components, and use the expectation maximization (EM) algorithm to estimate the parameters of the LCG. Experiments on real images demonstrate the accuracy of our approach.
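
The two-stage idea can be sketched roughly as follows; this is not the authors' EM procedure but a simplified stand-in that fits a logistic-shaped baseline density and then fits the signed residual with a small linear combination of Gaussians by least squares. The data and initial guesses are synthetic.

```python
# Minimal sketch of the two-stage idea (not the authors' EM procedure): a parametric
# logistic-shaped density is fitted first, and the signed residual is then modeled
# with a small linear combination of Gaussians fitted by least squares.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 4000), rng.normal(3, 0.5, 1000)])
hist, edges = np.histogram(data, bins=80, density=True)
x = 0.5 * (edges[:-1] + edges[1:])

def logistic_density(x, mu, s):
    """Density of the logistic distribution used as the baseline estimate."""
    s = abs(s) + 1e-9
    z = np.exp(np.clip(-(x - mu) / s, -50, 50))
    return z / (s * (1 + z) ** 2)

(mu, s), _ = curve_fit(logistic_density, x, hist, p0=[data.mean(), data.std()])
residual = hist - logistic_density(x, mu, s)   # may be positive or negative

def lcg(x, a1, m1, s1, a2, m2, s2):
    """Linear combination of two Gaussians; the weights a1, a2 may be negative."""
    return a1 * norm.pdf(x, m1, abs(s1)) + a2 * norm.pdf(x, m2, abs(s2))

p0 = [0.1, -1.0, 1.0, 0.1, 3.0, 1.0]
params, _ = curve_fit(lcg, x, residual, p0=p0, maxfev=20000)
estimate = logistic_density(x, mu, s) + lcg(x, *params)
print("max abs error vs empirical histogram:", np.abs(estimate - hist).max())
```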

Keywords: Logistic regression model, Expectation maximization, Segmentation.

5164 Evaluation of Hydrogen Particle Volume on Surfaces of Selected Nanocarbons

Authors: M. Ziółkowska, J. T. Duda, J. Milewska-Duda

Abstract:

This paper describes an approach to modeling adsorption phenomena aimed at specifying the adsorption mechanisms on localized or non-localized adsorbent sites when applied to nanocarbons. The concept comes from the fundamental thermodynamic description of adsorption equilibrium and is based on numerical calculations of the volume of hydrogen particles adsorbed on the surface of selected nanocarbons: a single-walled nanotube and a nanocone. This approach makes it possible to obtain information on the adsorption mechanism and, consequently, to choose an appropriate mathematical adsorption model, thus allowing a more reliable identification of the material's porous structure. The theoretical basis of the approach is discussed, and newly derived results of the numerical calculations are presented for the selected nanocarbons.

Keywords: Adsorption, mathematical modeling, nanocarbons, numerical analysis.

5163 A Search Algorithm for Solving the Economic Lot Scheduling Problem with Reworks under the Basic Period Approach

Authors: Yu-Jen Chang, Shih-Chieh Chen, Yu-Wei Kuo

Abstract:

In this study, we are interested in the economic lot scheduling problem (ELSP) that considers manufacturing of serviceable products and remanufacturing of reworked products. We formulate a mathematical model for the ELSP with reworks using the basic period approach. To solve this problem, we propose a search algorithm to find the cyclic multiplier ki of each product, which can then be produced cyclically once every ki basic periods. Two heuristics are used to search for the optimal production sequence of all lots and the optimal length of the basic period so as to minimize the average total cost. A numerical example shows the effectiveness of our approach.
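
A toy sketch of a basic-period search is shown below; the cost expression is the classical ELSP average-cost form without the rework terms, and the data and the power-of-two multiplier grid are illustrative assumptions rather than the authors' heuristics.

```python
# Toy sketch of a basic-period search (illustrative data and a classical ELSP cost
# model; the paper's version additionally handles reworked lots and sequencing).
import itertools
import numpy as np

setup = np.array([120.0, 90.0, 60.0])      # setup cost per lot
hold = np.array([0.4, 0.6, 0.5])           # holding cost per unit per period
demand = np.array([30.0, 20.0, 15.0])      # demand rate
prod = np.array([200.0, 150.0, 120.0])     # production rate

def avg_cost(B, k):
    """Average total cost per period for basic period B and multipliers k_i."""
    rho = demand / prod
    cycle = k * B
    return np.sum(setup / cycle + 0.5 * hold * demand * cycle * (1 - rho))

best = None
for B in np.linspace(1.0, 20.0, 200):                     # candidate basic periods
    for k in itertools.product([1, 2, 4, 8], repeat=3):   # power-of-two multipliers
        c = avg_cost(B, np.array(k, dtype=float))
        if best is None or c < best[0]:
            best = (c, B, k)

print("cost %.2f  basic period %.2f  multipliers %s" % best)
```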

Keywords: Economic lot, reworks, inventory, basic period.

5162 A New Approach for Recoverable Timestamp Ordering Schedule

Authors: Hassan M. Najadat

Abstract:

A new approach to the timestamp ordering problem in serializable schedules is presented. Since the number of database users is increasing rapidly, accuracy and the need for high throughput are main topics in the database area. Strict 2PL does not allow all possible serializable schedules and therefore does not yield high throughput. The main advantages of the proposed approach are the ability to enforce recoverable execution of transactions and the high achievable performance of concurrent execution in centralized databases. Compared to Strict 2PL, the general structure of the algorithm is simple and deadlock-free, and it allows all possible serializable schedules to execute, which results in high throughput. Various examples involving different orders of database operations are discussed.
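
A minimal sketch of basic timestamp-ordering checks, with a small bookkeeping hook for recoverability (a transaction records which uncommitted writers it read from), is given below; it is a simplified illustration, not the paper's algorithm.

```python
# Minimal sketch of basic timestamp-ordering checks with a recoverability bookkeeping
# hook (a simplified illustration, not the paper's algorithm).
class TimestampOrdering:
    def __init__(self):
        self.read_ts = {}     # data item -> largest timestamp that read it
        self.write_ts = {}    # data item -> largest timestamp that wrote it
        self.writer_of = {}   # data item -> transaction that last wrote it (uncommitted)
        self.depends = {}     # transaction -> set of writers it read from

    def read(self, t, ts, item):
        if ts < self.write_ts.get(item, 0):
            return "abort"                      # read too late: a younger write exists
        self.read_ts[item] = max(self.read_ts.get(item, 0), ts)
        writer = self.writer_of.get(item)
        if writer is not None:                  # remember the dependency so that t
            self.depends.setdefault(t, set()).add(writer)   # commits only after writer
        return "ok"

    def write(self, t, ts, item):
        if ts < self.read_ts.get(item, 0) or ts < self.write_ts.get(item, 0):
            return "abort"                      # write too late
        self.write_ts[item] = ts
        self.writer_of[item] = t
        return "ok"

sched = TimestampOrdering()
print(sched.write("T1", 5, "x"), sched.read("T2", 7, "x"), sched.write("T1", 5, "y"))
```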

Keywords: Concurrency control, schedule, timestamp, transaction.

5161 Novel Approach for Wideband VNA by Sixport Principle

Authors: Tomáš Urbanec

Abstract:

This paper presents the simple sixport principle and its frequency bandwidth. The novel multisixport approach is presented together with its possibilities, typical parameters and frequency bandwidth. A practical implementation is shown with its measurement parameters and calibration. A bandwidth of approximately 1:100 is obtained.

Keywords: microwave measurement, sixport, VNA, wideband.

5160 Investigating Interference Errors Made by Azzawia University 1st year Students of English in Learning English Prepositions

Authors: Aimen Mohamed Almaloul

Abstract:

The main focus of this study is investigating the interference of Arabic in the use of English prepositions by Libyan university students. Prepositions in the tests used in the study were categorized, according to their relation to Arabic, into similar Arabic and English prepositions (SAEP), dissimilar Arabic and English prepositions (DAEP), Arabic prepositions with no English counterparts (APEC), and English prepositions with no Arabic counterparts (EPAC).

The subjects of the study were 100 first-year university students, both male and female, of the English department, Sabrata Faculty of Arts, Azzawia University. The basic tool for data collection was a test of English prepositions in which students were instructed to fill in the blanks with the correct prepositions and to put a zero (0) if no preposition was needed. The test was then handed to the subjects of the study.

The test was then scored and quantitative as well as qualitative results were obtained. Quantitative results indicated the number, percentages and rank order of errors in each of the categories and qualitative results indicated the nature and significance of those errors and their possible sources. Based on the obtained results the researcher could detect that students made more errors in the EPAC category than the other three categories and these errors could be attributed to the lack of knowledge of the different meanings of English prepositions. This lack of knowledge forced the students to adopt what is called the strategy of transfer.

Keywords: Foreign language acquisition, foreign language learning, interference system, interlanguage system, mother tongue interference.

5159 Discontinuous Feedback Linearization of an Electrically Driven Fast Robot Manipulator

Authors: A. Izadbakhsh, M. M. Fateh, M. A. Sadrnia

Abstract:

A multivariable discontinuous feedback linearization approach is proposed for position control of an electrically driven fast robot manipulator. The desired performance is achieved by selecting a suitable controller and sampling rate and by considering actuator saturation. The proposed control approach can be applied flexibly to different electrically driven manipulators, and it can guarantee stability and satisfactory tracking performance. A PUMA 560 robot driven by geared permanent magnet DC motors is simulated, and the simulation results show the desired performance of the control system under the technical specifications.
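
The computed-torque core of a feedback-linearizing position controller, with simple actuator saturation, can be sketched as below; the dynamics terms M, C, G are user-supplied placeholders and the gains are illustrative, so this is a generic sketch rather than the authors' discontinuous multivariable design.

```python
# Generic computed-torque (feedback-linearization) control law with actuator
# saturation; the dynamics terms M, C, G are placeholders supplied by the user,
# and the gains are illustrative.
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, G,
                    Kp=np.diag([100.0, 100.0]), Kd=np.diag([20.0, 20.0]),
                    tau_max=50.0):
    """tau = M(q)(qdd_des + Kd*e_dot + Kp*e) + C(q,qd)*qd + G(q), then saturated."""
    e, e_dot = q_des - q, qd_des - qd
    v = qdd_des + Kd @ e_dot + Kp @ e          # linearizing 'virtual' acceleration
    tau = M(q) @ v + C(q, qd) @ qd + G(q)
    return np.clip(tau, -tau_max, tau_max)     # saturation of the motor actuators

# Toy usage with constant placeholder dynamics for a 2-link arm:
M = lambda q: np.diag([2.0, 1.0])
C = lambda q, qd: np.zeros((2, 2))
G = lambda q: np.array([0.0, 0.0])
tau = computed_torque(np.zeros(2), np.zeros(2),
                      np.array([0.5, -0.3]), np.zeros(2), np.zeros(2), M, C, G)
print(tau)
```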

Keywords: Fast robot, feedback linearization, multivariable digital control, PUMA 560.

5158 Enhancing Word Meaning Retrieval Using FastText and NLP Techniques

Authors: Sankalp Devanand, Prateek Agasimani, V. S. Shamith, Rohith Neeraje

Abstract:

Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English-to-Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches, including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. The methodology includes data preparation, part-of-speech tagging, dictionary searches, and transliteration. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval with greater efficacy, accuracy, and adaptability. The model's performance is evaluated through rigorous testing, comparing its output against existing machine translation systems using quantitative metrics such as BLEU, METEOR, and Jaccard similarity scores.
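
A minimal word-similarity sketch with gensim's FastText is shown below; the corpus and hyperparameters are toy assumptions, and the paper's full pipeline (POS tagging, dictionary search, transliteration, interpreter pattern) is not reproduced.

```python
# Minimal word-similarity sketch with gensim's FastText (toy corpus and parameters;
# the paper's full pipeline of POS tagging, dictionary lookup and transliteration
# is not reproduced here).
from gensim.models import FastText

corpus = [
    ["the", "teacher", "reads", "the", "ancient", "text"],
    ["the", "student", "reads", "a", "sanskrit", "verse"],
    ["a", "scholar", "translates", "the", "verse", "into", "english"],
]

model = FastText(sentences=corpus, vector_size=32, window=3, min_count=1, epochs=50)

# Subword embeddings let FastText return vectors even for unseen, morphologically
# related forms, which is useful for a morphologically rich language like Sanskrit.
print(model.wv.most_similar("reads", topn=3))
print(model.wv.similarity("verse", "text"))
```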

Keywords: Machine translation, English to Sanskrit, natural language processing, word meaning retrieval, FastText embeddings.

5157 Danger Theory and Intelligent Data Processing

Authors: Anjum Iqbal, Mohd Aizaini Maarof

Abstract:

The Artificial Immune System (AIS) is a relatively young paradigm for intelligent computation. The inspiration for AIS is derived from the natural immune system (IS). Classically it is believed that the IS strives to discriminate between self and non-self, and most existing AIS research is based on this approach. Danger Theory (DT) argues against this view and proposes that the IS fights against danger-producing elements and tolerates others. As computational researchers, we are not concerned with the arguments among immunologists but try to extract from them novel abstractions for intelligent computation. This paper follows the DT inspiration for intelligent data processing, an approach that may open a new avenue in intelligent processing. The data used are system call data, which are potentially significant in intrusion detection applications.

Keywords: artificial immune system, danger theory, intelligent processing, system calls

5156 An Improved Dynamic Window Approach with Environment Awareness for Local Obstacle Avoidance of Mobile Robots

Authors: Baoshan Wei, Shuai Han, Xing Zhang

Abstract:

Local obstacle avoidance is critical for mobile robot navigation, and ensuring path optimality and safety in cluttered environments is a challenging task. We propose an Environment-Aware Dynamic Window Approach in this paper to cope with the issue. The method integrates environment characterization into the Dynamic Window Approach (DWA) through two strategies. The local goal strategy guides the robot to move through openings before approaching the final goal, which solves the local minima problem in DWA. The adaptive control strategy enables the robot to adjust its state according to the environment, which improves path safety compared with DWA. The evaluation shows that the path generated by the proposed algorithm is safer and smoother than those of state-of-the-art algorithms.
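
The DWA core loop that the proposed strategies build on can be sketched as follows; the constants are illustrative, and the paper's local-goal and adaptive-control strategies are not reproduced.

```python
# Condensed dynamic-window sketch (illustrative constants; only the DWA core loop
# is shown, not the paper's local-goal and adaptive-control strategies).
import numpy as np

def simulate(state, v, w, dt=0.1, steps=10):
    """Roll a constant (v, w) command forward from state = (x, y, theta)."""
    x, y, th = state
    traj = []
    for _ in range(steps):
        th += w * dt
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        traj.append((x, y, th))
    return np.array(traj)

def dwa(state, goal, obstacles, v_range=(0.0, 1.0), w_range=(-1.0, 1.0)):
    best, best_score = None, -np.inf
    for v in np.linspace(*v_range, 11):
        for w in np.linspace(*w_range, 21):
            traj = simulate(state, v, w)
            clearance = min(np.linalg.norm(traj[:, :2] - o, axis=1).min() for o in obstacles)
            if clearance < 0.2:                 # trajectory collides: reject
                continue
            heading = -np.linalg.norm(traj[-1, :2] - goal)   # closer to goal is better
            score = 1.0 * heading + 0.5 * clearance + 0.2 * v
            if score > best_score:
                best, best_score = (v, w), score
    return best

state = (0.0, 0.0, 0.0)
obstacles = [np.array([1.0, 0.2]), np.array([2.0, -0.5])]
print(dwa(state, goal=np.array([3.0, 0.0]), obstacles=obstacles))
```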

Keywords: Adaptive control, dynamic window approach, environment aware, local obstacle avoidance, mobile robots.

5155 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

Authors: Young-Seok Choi

Abstract:

This work proposes a data-driven, multiscale quantitative measure to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Because real EEG recordings are nonlinear and non-stationary over different frequencies or scales, a more suitable approach than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlation with the neurological deficit evaluation at 72 hours after cardiac arrest, suggesting an effective metric for a prognostic tool.
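
The entropy part of the pipeline can be sketched as below: a Tsallis entropy S_q = (1 - sum_i p_i^q)/(q - 1) is computed over a histogram estimate of each intrinsic mode function. The EMD step is delegated to the PyEMD package, which is an assumed tool choice rather than the authors' implementation.

```python
# Sketch of a multiscale Tsallis complexity: Tsallis entropy computed per intrinsic
# mode function. The EMD step uses the PyEMD package (an assumption about tooling,
# not the authors' implementation).
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=64):
    """S_q = (1 - sum_i p_i^q) / (q - 1) over a histogram estimate of p."""
    p, _ = np.histogram(signal, bins=bins, density=False)
    p = p[p > 0] / p.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multiscale_tsallis(signal, q=2.0):
    from PyEMD import EMD                      # pip install EMD-signal (assumed tooling)
    imfs = EMD().emd(signal)                   # intrinsic mode functions, fine to coarse
    return [tsallis_entropy(imf, q=q) for imf in imfs]

# Toy usage on a synthetic two-tone signal plus noise:
t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t) + 0.1 * np.random.randn(t.size)
print(multiscale_tsallis(x))
```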

Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.

5154 A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis

Authors: Chih-Hao Chen, Hsing-Chung Lee, Qingdong Ling, Hsiao-Jung Chen, Sun-Chong Wang, Li-Ching Wu, H.C. Lee

Abstract:

Segmentation, filtering out of measurement errors and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N²) in either computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and requires much less memory, being O(N) in both computation time and memory requirement, and offers higher accuracy. The two key ingredients of SAD are the fundamental statistical assumption that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for its predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on the genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in about 1 second and for a 1.8 million-probe array in 9 seconds.
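
The Gaussian-product ingredient can be illustrated with a short sketch: the normalized product of two Gaussians is another Gaussian with a precision-weighted mean, and a greedy left-to-right merge of consecutive probes uses that rule. This shows the merge step only, not the full SAD program, and the threshold and data are made up.

```python
# Sketch of the pairwise-Gaussian-merging ingredient: the product of two Gaussian
# densities N(m1, v1) and N(m2, v2) is (up to normalization) another Gaussian whose
# mean is the precision-weighted average. This illustrates the merge step only.
import numpy as np

def merge_gaussians(m1, v1, m2, v2):
    """Return mean and variance of the (normalized) product of two Gaussians."""
    w1, w2 = 1.0 / v1, 1.0 / v2        # precisions
    v = 1.0 / (w1 + w2)
    m = v * (w1 * m1 + w2 * m2)
    return m, v

def greedy_segment(values, noise_var, z=3.0):
    """Merge consecutive probes while the next value is consistent with the segment."""
    segments = [(values[0], noise_var)]
    for x in values[1:]:
        m, v = segments[-1]
        if abs(x - m) < z * np.sqrt(v + noise_var):     # consistent: merge in
            segments[-1] = merge_gaussians(m, v, x, noise_var)
        else:                                           # breakpoint: start new segment
            segments.append((x, noise_var))
    return segments

probes = np.concatenate([np.random.normal(0.0, 0.1, 50), np.random.normal(0.8, 0.1, 50)])
print(len(greedy_segment(probes, noise_var=0.01)))      # ideally close to 2 segments
```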

Keywords: Cancer, pathogenesis, chromosomal aberration, copy number variation, segmentation analysis.

5153 The Effect of an Al Andalus Fused Curriculum Model on the Learning Outcomes of Elementary School Students

Authors: Sobhy Fathy A. Hashesh

Abstract:

The study was carried out in the elementary classes of Al-Andalus Private Schools, girls' section, using control and experimental groups formed by a random assignment strategy. It aimed at investigating the effect of the Al-Andalus Fused Curriculum (AFC) model of learning and the effect of the separate-subjects approach on the development of students' conceptual learning and skill acquisition. The population of the study was composed of the elementary school students of Al-Andalus Private Schools, girls' section (N=240), while the sample was composed of two randomly assigned groups (N=28), one experimental and one control. The study followed quantitative and qualitative approaches in collecting and analyzing data to investigate the study hypotheses. Results revealed significant statistical differences in students' conceptual learning and skill acquisition in favor of the experimental group. The study recommends applying this model to different educational variables and other age groups to generate more data and further results in favor of students' learning outcomes.

Keywords: AFC, Lego Education, mechatronics, STEAM, Al-Andalus Fused Curriculum.

5152 Cutting Tools in Finishing Operations for CNC Rapid Manufacturing Processes: Experimental Studies

Authors: M. N. Osman Zahid, K. Case, D. Watts

Abstract:

This paper reports an advanced approach to the application of CNC machining for rapid manufacturing processes (CNC-RM). The aim of this study is to improve the quality of machined parts by introducing different cutting tools during finishing operations. As the cutting is performed in different directions, the surfaces present on the part can be classified into several categories, and suitable cutting tools are therefore assigned to machine particular surfaces and improve their quality. Experimental studies have been carried out by fabricating several parts based on the suggested approach, and the results provide further support for implementing it in rapid machining processes.

Keywords: CNC machining, End mill tool, Finishing operation, Rapid manufacturing.

5151 On Phase Based Stereo Matching and Its Related Issues

Authors: András Rövid, Takeshi Hashimoto

Abstract:

The paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm is based on the combination of simpler methods, such as the normalized sum of squared differences (NSSD), and a more complex phase correlation based approach, considering noise and other factors as well. The speed of NSSD and the precision of phase correlation together yield an efficient approach to finding the best candidate point with sub-pixel accuracy in stereo image pairs. The task of the NSSD in this case is to locate the candidate pixel roughly; the location of the candidate is then refined by an enhanced phase correlation based method, which, in contrast to the NSSD, has to run only once for each selected pixel.
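
A compact sketch of the two-stage matching on synthetic data is given below: a coarse NSSD scan over integer shifts, followed by FFT-based phase correlation around the best candidate (sub-pixel interpolation around the peak is omitted). Window sizes and data are illustrative.

```python
# Sketch of the two-stage matching idea: a coarse NSSD search over integer shifts,
# then FFT-based phase correlation around the best candidate. Data are synthetic.
import numpy as np

def nssd(a, b):
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return np.sum((a - b) ** 2)

def phase_correlation(a, b):
    """Return the integer peak of the phase-correlation surface between two patches."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    r = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    peak = np.unravel_index(np.argmax(r), r.shape)
    return peak, r

# Coarse step: slide a template over a search strip with NSSD...
rng = np.random.default_rng(1)
left = rng.random((32, 96))
right = np.roll(left, 7, axis=1)                     # synthetic 7-pixel disparity
template = left[:, 32:64]
scores = [nssd(template, right[:, d:d + 32]) for d in range(0, 64)]
d_coarse = int(np.argmin(scores))
# ...then refine around the coarse candidate with phase correlation.
peak, _ = phase_correlation(template, right[:, d_coarse:d_coarse + 32])
print("coarse column:", d_coarse, "phase-correlation peak offset:", peak)
```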

Keywords: Stereo matching, Sub-pixel accuracy, phase correlation, SVD, NSSD.

5150 Perceptions of Educators on the Learners’ Youngest Age for the Introduction of ICTs in Schools: A Personality Theory Approach

Authors: K. E. Oyetade, S. D. Eyono Obono

Abstract:

Age ratings are very helpful in providing parents with relevant information for the purchase and use of digital technologies by children; the absence of defined age ratings for the use of ICTs by children in schools is therefore a major concern, and this problem motivates this study, whose aim is to examine the factors affecting the perceptions of educators on the learners' youngest age for the introduction of ICTs in schools. This aim is achieved through two types of research objectives: the identification and design of theories and models on age ratings, and the empirical testing of such theories and models in a survey of educators from the Camperdown district of the South African KwaZulu-Natal province. A questionnaire was used to collect the survey data, whose validity and reliability were checked in SPSS prior to descriptive and correlative quantitative analysis. The main hypothesis of this research, as claimed by existing research, is the association between the demographics of educators, their personality, and their perceptions on the learners' youngest age for the introduction of ICTs in schools, except that the present study looks at personality from three dimensions: self-actualized personalities, fully functioning personalities, and healthy personalities. This hypothesis was fully confirmed by the empirical study, except for the demographic factor, where only the educators' grade or class was found to be associated with their personality.

Keywords: Age ratings, Educators, E-learning, Personality Theories.

5149 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases

Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García

Abstract:

This paper presents an approach to reduce some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them, and formalizes the model as JSON documents. This formal model is stored in MongoDB, a document-oriented NoSQL database, chosen for its advantages in flexibility and efficiency. In addition, the paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
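
A hypothetical example of lodging one formalized requirement as a JSON document in MongoDB with pymongo might look like the sketch below; the field names are illustrative, not the JREM schema, and a local MongoDB instance is assumed.

```python
# Hypothetical example of storing a formalized requirement as a JSON document in
# MongoDB with pymongo; the field names are illustrative, not the JREM schema.
from pymongo import MongoClient

requirement = {
    "id": "REQ-001",
    "type": "functional",
    "actor": "registered user",
    "action": "reset password",
    "preconditions": ["account exists", "email verified"],
    "acceptance_criteria": ["reset link expires after 24 hours"],
    "traces_to": ["UC-12"],
}

client = MongoClient("mongodb://localhost:27017/")   # assumes a local MongoDB instance
collection = client["requirements_db"]["requirements"]
collection.insert_one(requirement)

# Document-oriented storage makes later queries over the model straightforward:
for doc in collection.find({"type": "functional"}, {"_id": 0, "id": 1, "action": 1}):
    print(doc)
```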

Keywords: Conceptual modeling, JSON, NoSQL databases, requirements engineering, software development.

5148 Analyzing The Effect of Variable Round Time for Clustering Approach in Wireless Sensor Networks

Authors: Vipin Pal, Girdhari Singh, R P Yadav

Abstract:

As wireless sensor networks are energy-constrained, the energy efficiency of sensor nodes is the main design issue. Clustering of nodes is an energy-efficient approach: it prolongs the lifetime of wireless sensor networks by avoiding long-distance communication. Clustering algorithms operate in rounds, and their performance depends upon the round time. A large round time consumes more energy of the cluster heads, while a small round time causes frequent re-clustering. Existing clustering algorithms therefore apply a trade-off to the round time and calculate it from the initial parameters of the network. However, it is not appropriate to use a round time based on initial parameters throughout the network lifetime, because wireless sensor networks are dynamic in nature (nodes can be added to the network or run out of energy). In this paper, a variable round time approach is proposed that calculates the round time depending upon the number of active nodes remaining in the field, making the clustering algorithm adaptive to network dynamics. For simulation, the approach is implemented with LEACH in NS-2, and the results show a 6% increase in network lifetime, a 7% increase in the 50% node death time, and a 5% improvement in the data units gathered at the base station.
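
The variable-round-time idea can be sketched as a function of the surviving population, as below; the linear scaling rule and the constants are illustrative assumptions, since the paper derives its own relationship from the number of active nodes.

```python
# Sketch of the variable-round-time idea: the round length is recomputed from the
# number of nodes still alive instead of being fixed from the initial network size.
# The linear scaling rule and constants below are illustrative assumptions only.
def round_time(alive_nodes, initial_nodes, base_round_s=20.0, min_round_s=5.0):
    """Scale the round length by the fraction of nodes still active."""
    fraction_alive = alive_nodes / float(initial_nodes)
    return max(min_round_s, base_round_s * fraction_alive)

# The interval is recomputed before each round from the current census of active
# nodes, so the re-clustering schedule tracks the changing network.
for alive in (100, 80, 50, 20):
    print(alive, "active nodes ->", round_time(alive, 100), "s per round")
```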

Keywords: Wireless Sensor Network, Clustering, Energy Efficiency, Round Time.

5147 A New Brazilian Friction-Resistant Low Alloy High Strength Steel – A Life Testing Approach

Authors: D. I. De Souza, G. P. Azevedo, R. Rocha

Abstract:

In this paper we develop a sequential life test approach applied to a modified low-alloy high-strength steel part used in highway overpasses in Brazil. We consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life is considered equal to zero. We use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low-alloy high-strength steel part has been changed, there is little information available about the possible values of the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions. To estimate the shape and scale parameters of these two sampling models, we use a maximum likelihood approach for censored failure data. We also develop a truncation mechanism for the Inverse Weibull and Normal models and provide rules to truncate a sequential life testing situation, making one of the two possible decisions at the moment of truncation: accept or reject the null hypothesis H0. An example develops the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
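
The sequential-decision skeleton behind such a test can be sketched with Wald-style bounds, as below; the log-likelihood ratio shown uses a simple exponential-lifetime example rather than the paper's Normal and Inverse Weibull models with censoring, and the truncation rule is a plain fixed cap, not the paper's mechanism.

```python
# Illustrative sequential-test skeleton (Wald-style bounds); the log-likelihoods for
# the Normal and Inverse Weibull hypotheses would be plugged into `loglik_ratio`,
# and the truncation rule here is a simple fixed cap.
import math

def sprt(observations, loglik_ratio, alpha=0.05, beta=0.10, max_items=30):
    """Accumulate log-likelihood ratios until an accept/reject bound is crossed."""
    upper = math.log((1 - beta) / alpha)      # cross above -> reject H0
    lower = math.log(beta / (1 - alpha))      # cross below -> accept H0
    s = 0.0
    for i, x in enumerate(observations, start=1):
        s += loglik_ratio(x)
        if s >= upper:
            return "reject H0", i
        if s <= lower:
            return "accept H0", i
        if i >= max_items:                    # truncation: forced decision
            return ("reject H0" if s > 0 else "accept H0"), i
    return "continue testing", len(observations)

# Toy usage: exponential lifetimes under H0 (mean theta0) versus H1 (mean theta1).
theta0, theta1 = 1000.0, 500.0
llr = lambda t: math.log(theta0 / theta1) + t * (1.0 / theta0 - 1.0 / theta1)
print(sprt([420.0, 310.0, 150.0, 800.0, 95.0, 200.0], llr))
```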

Keywords: Sequential life testing, normal and inverse Weibull models, maximum likelihood approach, truncation mechanism.

5146 Robust Integrated Design for a Mechatronic Feed Drive System of Machine Tools

Authors: Chin-Yin Chen, Chi-Cheng Cheng

Abstract:

This paper aims to develop a robust optimization methodology for the mechatronic modules of machine tools by considering all important characteristics from the structural and control domains in one single process. The relationship between these two domains is strongly coupled, so in order to reduce the disturbance caused by parameters in either one, the mechanical and controller design domains need to be integrated. Therefore, the concurrent integrated design method, Design for Control (DFC), is employed in this paper. It is applied not only to achieve minimal power consumption but also to enhance structural performance and system response at the same time. To investigate the method for integrated optimization, a mechatronic feed drive system of a machine tool is used as a design platform. Pro/Engineer and ANSYS are first used to build the 3D model and to analyze and design structural parameters such as elastic deformation, natural frequency and component size, based on their effects on and sensitivities to the structure. In addition, a robust controller based on Quantitative Feedback Theory (QFT) is applied to determine proper control parameters. The overall physical properties of the machine tool are thus obtained in the initial stage. Finally, the design-for-control procedure is carried out to modify the structural and control parameters to achieve overall system performance, and the corresponding productivity is expected to be greatly improved.

Keywords: Machine tools, integrated structure and control design, design for control, multilevel decomposition, quantitative feedback theory.

5145 Resource Leveling in Construction Projects using Re-Modified Minimum Moment Approach

Authors: Abhay Tawalare, Rajesh Lalwani

Abstract:

This paper proposes a re-modification of the minimum moment approach to resource leveling, which is itself a modification of the traditional method by Harris and is based on the critical path method. The new approach differs from the earlier methods in the criterion used to select the activity to be shifted when leveling the resource histogram. In the traditional method, the improvement factor is found first to select the activity for each possible day of shifting. In the modified method, the maximum value of the product of resource rate and free float is found first, and the improvement factor is then calculated for the activity to be shifted. In the proposed method, the activity to be shifted is selected first based on the largest value of resource rate. The process is repeated for all remaining activities and possible shifts to obtain the updated histogram. The proposed method significantly reduces the number of iterations and is easier for manual computation.
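
The selection rule can be illustrated with a small made-up example: among activities with positive free float, the one with the largest resource rate is picked and shifted within its float to whichever position most reduces the histogram moment (sum of squared daily resource totals). The activity data below are invented for the sketch.

```python
# Small illustration of the selection rule: pick the movable activity with the
# largest resource rate and shift it within its free float to reduce the histogram
# moment. The activity data are made up for the example.
import numpy as np

activities = [  # name, start day, duration, resource rate, free float (days)
    ("A", 0, 3, 6, 0), ("B", 0, 2, 4, 3), ("C", 2, 4, 5, 2), ("D", 4, 2, 8, 1),
]
horizon = 10

def histogram(acts):
    h = np.zeros(horizon)
    for _, start, dur, rate, _ in acts:
        h[start:start + dur] += rate
    return h

def moment(acts):
    return float(np.sum(histogram(acts) ** 2))

# Candidate with the largest resource rate among those that can be shifted:
movable = [a for a in activities if a[4] > 0]
name, start, dur, rate, ff = max(movable, key=lambda a: a[3])

best_shift, best_m = 0, moment(activities)
for shift in range(1, ff + 1):                       # try every position inside the float
    trial = [(n, s + shift if n == name else s, d, r, f)
             for n, s, d, r, f in activities]
    if moment(trial) < best_m:
        best_shift, best_m = shift, moment(trial)

print("shift activity", name, "by", best_shift, "days; moment", best_m)
```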

Keywords: Re-Modified, Resource Leveling, Resources Rate, Free Float, Resource Histogram

5144 Investigating Crime Hotspot Places and their Implication to Urban Environmental Design: A Geographic Visualization and Data Mining Approach

Authors: Donna R. Tabangin, Jacqueline C. Flores, Nelson F. Emperador

Abstract:

Information is power. Geographical information science is an emerging field that advances the development of knowledge to further help in understanding the relationship of "place" with other disciplines such as crime. The researchers used crime data for the years 2004 to 2007 from the Baguio City Police Office to determine the incidence and actual locations of crime hotspots. A combined qualitative and quantitative research methodology was employed through extensive fieldwork and observation, geographic visualization with Geographic Information Systems (GIS) and Global Positioning Systems (GPS), and data mining. The paper discusses emerging geographic visualization and data mining tools and methodologies that can be used to generate baseline data for environmental initiatives such as urban renewal and rejuvenation. The study demonstrates that crime hotspots can be computed and that they occur in select places in the Central Business District (CBD) of Baguio City. It was observed that some characteristics of the hotspot places, such as physical design and milieu, may play an important role in creating opportunities for crime, and a list of these environmental attributes was generated. This derived information may be used to guide the design or redesign of the urban environment of the City so as to reduce crime and at the same time improve it physically.
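
One common way to surface hotspots from point data, shown here purely as a hedged illustration on synthetic coordinates, is a kernel density estimate over incident locations with the densest grid cells flagged; the study's actual GIS and data-mining workflow is richer than this.

```python
# Hedged sketch of hotspot detection from point data: a kernel density estimate over
# incident coordinates, with the densest grid cells flagged. Coordinates are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
# Synthetic incident locations clustered around two "hotspots".
pts = np.hstack([rng.normal([[0.0], [0.0]], 0.05, (2, 150)),
                 rng.normal([[0.4], [0.3]], 0.08, (2, 100))])

kde = gaussian_kde(pts)
gx, gy = np.meshgrid(np.linspace(-0.3, 0.7, 50), np.linspace(-0.3, 0.6, 50))
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

threshold = np.quantile(density, 0.95)           # top 5% densest cells = hotspot cells
hot = np.argwhere(density >= threshold)
print("hotspot cells:", len(hot), "peak density:", float(density.max()))
```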

Keywords: Crime mapping, data mining, environmental design, geographic visualization, GIS.

5143 Quantitative Assessment of Different Formulations of Antimalarials in Sentinel Sites of India

Authors: Taruna Katyal Arora, Geeta Kumari, Hari Shankar, Neelima Mishra

Abstract:

Substandard and counterfeit antimalarials are a major problem in malaria-endemic areas. The availability of counterfeit or substandard medicines not only decreases efficacy in patients but is also one of the contributing factors to the development of antimalarial drug resistance. Owing to this, a pilot study was conducted to survey the quality of drugs collected from different malaria-endemic areas of India. Artesunate+Sulphadoxine-Pyrimethamine (AS+SP), Artemether-Lumefantrine (AL) and Chloroquine (CQ) tablets were randomly picked from public health facilities in selected states of India. The quality of the antimalarial drugs from these areas was assessed using the Global Pharma Health Fund Minilab test kit, which includes physical/visual inspection and a disintegration test. Thin-layer chromatography (TLC) was carried out for semi-quantitative assessment of the active pharmaceutical ingredients. A total of 45 brands, of which 21 were for CQ, 14 for AL and 10 for AS+SP, were tested from the states of Uttar Pradesh (U.P.), Mizoram, Meghalaya and Gujrat. One out of the 45 samples showed a variable disintegration and retention factor, which could have been due to substandard quality or to other factors, including storage. HPLC analysis, however, confirmed a standard active pharmaceutical ingredient, so humid temperature and moisture during storage may account for the observed result.

Keywords: Antimalarial medicines, counterfeit, substandard, thin layer chromatography.

5142 Robust Control of a High-Speed Manipulator in State Space

Authors: M. M. Fateh, A. Izadbakhsh

Abstract:

A robust control approach is proposed for a high-speed manipulator using a hybrid computed torque control approach in the state space. The high-speed manipulator is driven by permanent magnet DC motors to track a trajectory in the joint space in the presence of disturbances. The tracking problem is analyzed in the state space, where complete models are considered for the actuators. The proposed control approach can guarantee stability and satisfactory tracking performance. A two-link elbow manipulator driven by electrical actuators is simulated, and the results are shown to satisfy the conditions under the technical specifications.

Keywords: Computed torque, manipulator, robust control, state space.

5141 Agile Software Development Implementation in Developing a Diet Tracker Mobile Application

Authors: Dwi Puspita Sari, Gulnur Baltabayeva, Nadia Salman, Maxut Toleuov, Vijay Kanabar

Abstract:

The technology era drives people to use mobile phones to support their daily activities. Technology develops at a rapid pace, which pushes IT companies to adjust to technological changes in order to satisfy their customers. As a result, many companies in the USA have moved from a systematic software development approach to an agile software development approach for developing systems and applications, including many mobile phone applications built in a short time to fulfill users' needs. As the systematic approach is considered time-consuming, costly, and too risky, agile software development has become the more popular approach for developing software, including mobile applications. This paper reflects on a short-term project to develop a diet tracker mobile application using agile software development, focused on applying the Scrum framework in the development process.

Keywords: Agile software development, scrum, diet tracker, mobile application.

5140 A New Approach to the Approximate Solutions of Hamilton-Jacobi Equations

Authors: Joe Imae, Kenjiro Shinagawa, Tomoaki Kobayashi, Guisheng Zhai

Abstract:

We propose a new approach to obtaining approximate solutions of Hamilton-Jacobi (HJ) equations. The approximation process consists of two steps. The first step is to transform the HJ equations into virtual-time based HJ equations (VT-HJ) by introducing the new idea of 'virtual time'. The second step is to construct the approximate solutions of the HJ equations through a computationally iterative procedure based on the VT-HJ equations. It should be noted that the approximate feedback solutions evolve by themselves as the virtual time goes by. Finally, we demonstrate the effectiveness of our approximation approach by means of simulations with linear and nonlinear control problems.

Keywords: Nonlinear Control, Optimal Control, Hamilton-Jacobi Equation, Virtual-Time
