Search results for: Statistical Analysis.
8295 Early Requirement Engineering for Design of Learner Centric Dynamic LMS
Authors: Kausik Halder, Nabendu Chaki, Ranjan Dasgupta
Abstract:
We present a modeling framework that supports the engineering of early requirements specifications for the design of a learner-centric dynamic Learning Management System. The framework is based on the i* modeling tool and Means End Analysis and adopts primitive concepts for modeling early requirements (such as actor, goal, and strategic dependency). We show how pedagogical and computational requirements for designing a learner-centric Learning Management System can be adapted for automatic early requirement engineering specifications. Finally, we present a model of a Learner Quanta based adaptive courseware. Our early requirement analysis shows how Means End Analysis reveals gaps and inconsistencies in early requirements specifications that are by no means trivial to discover without the help of a formal analysis tool.
Keywords: Adaptive Courseware, Early Requirement Engineering, Means End Analysis, Organizational Modeling, Requirement Modeling.
8294 Component-based Segmentation of Words from Handwritten Arabic Text
Authors: Jawad H AlKhateeb, Jianmin Jiang, Jinchang Ren, Stan S Ipson
Abstract:
Efficient preprocessing is essential for the automatic recognition of handwritten documents. In this paper, techniques for segmenting words in handwritten Arabic text are presented. First, connected components (CCs) are extracted and the distances among different components are analyzed. The statistical distribution of these distances is then obtained to determine an optimal threshold for word segmentation. Meanwhile, an improved projection-based method is employed for baseline detection. The proposed method has been successfully tested on the IFN/ENIT database, consisting of 26459 Arabic words handwritten by 411 different writers, and the results were promising and encouraging, giving more accurate detection of the baseline and segmentation of words for further recognition.
Keywords: Arabic OCR, off-line recognition, baseline estimation, word segmentation.
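A minimal sketch of the distance-threshold idea described in this abstract, assuming a binarized page image and OpenCV connected components; the function name and the percentile rule for choosing the threshold are illustrative assumptions, not the authors' implementation, which derives the threshold from the statistical distribution of the distances.

```python
import cv2
import numpy as np

def segment_words(binary_img, thresh_percentile=75):
    """Group connected components into words using a gap threshold.

    binary_img: uint8 image, ink = 255, background = 0.
    The percentile rule is an illustrative stand-in for the paper's
    statistically derived distance threshold.
    """
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary_img, connectivity=8)
    if n <= 1:                       # label 0 is the background
        return []
    # Bounding boxes sorted right-to-left, since Arabic runs right to left
    boxes = sorted((stats[i] for i in range(1, n)),
                   key=lambda s: -(s[cv2.CC_STAT_LEFT] + s[cv2.CC_STAT_WIDTH]))
    # Horizontal gaps between consecutive components
    gaps = [max(prev[cv2.CC_STAT_LEFT] - (cur[cv2.CC_STAT_LEFT] + cur[cv2.CC_STAT_WIDTH]), 0)
            for prev, cur in zip(boxes, boxes[1:])]
    threshold = np.percentile(gaps, thresh_percentile) if gaps else 0
    # Start a new word whenever the inter-component gap exceeds the threshold
    words, current = [], [boxes[0]]
    for gap, box in zip(gaps, boxes[1:]):
        if gap > threshold:
            words.append(current)
            current = [box]
        else:
            current.append(box)
    words.append(current)
    return words
```

Each returned word is simply a list of component bounding boxes; baseline detection is handled separately in the paper by an improved projection-based method.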
8293 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material
Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva
Abstract:
Creating favourable conditions for students’ comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. This paper describes a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically the challenge of students’ comprehension. Essentially, this approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning, and the term; (2) taking into account the components of the student’s subjective experience (value-based emotions, contextual, procedural and communicative) during the educational process; (3) linking together different ways to present mathematical information; (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modelling. The article describes approaches to the practical use of these foundational concepts. Identifying how the proposed methods and techniques influence understanding of the material used in teaching mathematics was the primary goal. The study included an experiment in which 256 secondary school students took part: 142 in the study group and 114 in the control group. All students in these groups had similar levels of achievement in math and studied math under the same curriculum. In the course of the experiment, comprehension of two topics, “Derivative” and “Trigonometric functions”, was evaluated. Control group participants were taught using traditional methods. Students in the study group were taught using the holistic method: under the teacher’s guidance, they carried out assignments designed to establish linkages between a notion’s characteristics and to convert information from one mode of presentation to another, as well as assignments that required the ability to operate with all modes of presentation. Identification, accounting for and transformation of subjective experience were associated with methods of stimulating the emotional-value component of the studied mathematical content (discussions of lesson titles, assignments aimed at creating study dominants, performing theme-related physical exercise ...). The use of techniques that form inter-subject notions based on linkages between the perceptual, real and mathematical conceptual spaces proved to be of special interest to the students. Results of the experiment were analysed by presenting students in each of the groups with a final test on each of the studied topics. The test included assignments that required building real situational models. Statistical analysis was used to aggregate the test results. The Pearson chi-squared (χ2) criterion was used to assess the statistical significance of the results (pass/fail on the modelling test). A significant difference in results was revealed (p < 0.001), which allowed the authors to conclude that students in the study group showed better comprehension of mathematical information than those in the control group. The total number of completed assignments of each student was analysed as well, with average results calculated for each group. The statistical significance of the difference in results against the quantitative criterion (number of completed assignments) was determined using Student’s t-test, which showed that students in the study group completed significantly more assignments than those in the control group (p = 0.0001).
The authors thus conclude that the observed increase in the level of comprehension of the study material took place as a result of applying the proposed methods and techniques.
Keywords: Comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions.
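A brief sketch of the two significance tests named in this abstract: Pearson's chi-squared on pass/fail counts and Student's t-test on the number of completed assignments, using SciPy. The counts and scores below are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Pass/fail counts on the modelling test (placeholder values, not the study's data)
contingency = np.array([[90, 52],    # study group: pass, fail
                        [45, 69]])   # control group: pass, fail
chi2, p_chi2, dof, _ = stats.chi2_contingency(contingency)

# Number of completed assignments per student (placeholder samples)
study_group = np.random.default_rng(0).poisson(7, size=142)
control_group = np.random.default_rng(1).poisson(5, size=114)
t_stat, p_t = stats.ttest_ind(study_group, control_group)

print(f"chi-squared = {chi2:.2f}, p = {p_chi2:.4f} (dof = {dof})")
print(f"t = {t_stat:.2f}, p = {p_t:.4f}")
```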
8292 Multipurpose Three Dimensional Finite Element Procedure for Thermal Analysis in Pulsed Current Gas Tungsten Arc Welding of AZ 31B Magnesium Alloy Sheets
Authors: N.Karunakaran, V.Balasubramanian
Abstract:
This paper presents the results of a study aimed at establishing the temperature distribution during the welding of magnesium alloy sheets by the Pulsed Current Gas Tungsten Arc Welding (PCGTAW) and Constant Current Gas Tungsten Arc Welding (CCGTAW) processes. Pulsing of the GTAW welding current influences the dimensions and solidification rate of the fused zone; it also reduces the weld pool volume and hence produces a narrower bead. In this investigation, the base material considered was a thin 2 mm sheet of AZ 31 B magnesium alloy, which is finding use in aircraft, automobile and high-speed train components. A finite element analysis was carried out using ANSYS, and the results of the FEA were compared with the experimental results. It is evident from this study that finite element analysis using ANSYS can be effectively used to model the PCGTAW process for finding the temperature distribution.
Keywords: gas tungsten arc welding, pulsed current, finite element analysis, thermal analysis, magnesium alloy.
8291 A Mixed Method Investigation of the Impact of Practicum Experience on Mathematics Female Pre-Service Teachers’ Sense of Preparedness
Authors: Fatimah Alsaleh, Glenda Anthony
Abstract:
The practicum experience is a critical component of any initial teacher education (ITE) course. As well as providing a near-authentic setting for pre-service teachers (PSTs) to practice in, it also plays a key role in shaping their perceptions and sense of preparedness. Nevertheless, merely including a practicum period as a compulsory part of ITE may not in itself be enough to induce feelings of preparedness and efficacy; the quality of the classroom experience must also be considered. Drawing on findings of a larger study of secondary and intermediate level mathematics PSTs’ sense of preparedness to teach, this paper examines the influence of the practicum experience in particular. The study sample comprised female mathematics PSTs who had almost completed their teaching methods course in their fourth year of ITE across 16 teacher education programs in Saudi Arabia. The impact of the practicum experience on PSTs’ sense of preparedness was investigated via a mixed-methods approach combining a survey (N = 105) and in-depth interviews with survey volunteers (N = 16). Statistical analysis in SPSS was used to explore the quantitative data, and thematic analysis was applied to the qualitative interview data. The results revealed that the PSTs perceived the practicum experience to have played a dominant role in shaping their feelings of preparedness and efficacy. However, despite the generally positive influence of the practicum, the PSTs also reported numerous challenges that lessened their feelings of preparedness. These challenges were often related to the classroom environment and the school culture. For example, about half of the PSTs indicated that the practicum schools did not have the resources available or the support necessary to help them learn the work of teaching. In particular, the PSTs expressed concerns about translating the theoretical knowledge learned at the university into practice in authentic classrooms. These challenges left PSTs feeling less prepared and suggest that more support from both the university and the school is needed to help PSTs develop a stronger sense of preparedness. The area in which PSTs felt least prepared was classroom and behavior management, although the results also indicated that PSTs felt only a moderate level of general teaching efficacy and were less confident about how to support students as learners. Again, feelings of lower efficacy were related to the dissonance between the theory presented at university and real-world classroom practice. In order to close this gap between theory and practice, PSTs expressed the wish to have more time in the practicum and more accountability for support from school-based mentors. In highlighting the challenges of the practicum in shaping PSTs’ sense of preparedness and efficacy, the study argues that better communication between the ITE providers and the practicum schools is necessary in order to maximize the benefit of the practicum experience.
Keywords: Mathematics, practicum experience, pre-service teachers, sense of preparedness.
8290 Open Source Library Management System Software: A Review
Authors: Sangsuree Vasupongayya, Kittisak Keawneam, Kittipong Sengloilaun, Patt Emmawat
Abstract:
Library management systems are commonly used in all education-related institutes. Many commercial products are available. However, many institutions may not be able to afford the cost of using commercial products. Therefore, an alternative solution in such situations would be open source software. This paper focuses on reviewing the open source library management system packages currently available. The review focuses on the ability to perform four basic components: traditional services, interlibrary loan management, managing electronic materials, and basic common management functions such as security, alert systems and statistical reports. In addition, the environment, basic requirements and supporting aspects of each open source package are also mentioned.
Keywords: open source, library management, review.
8289 Scanning Device for Sampling the Spatial Distribution of the E-field
Authors: Juan Blas, Alfonso Bahillo, Santiago Mazuelas, David Bullido, Patricia Fernandez, Ruben M. Lorenzo, Evaristo J. Abril
Abstract:
This paper presents a low cost automatic system for sampling the electric field in a limited area. The scanning area is a flat surface parallel to the ground at a selected height. We discuss in detail the hardware, software and all the arrangements involved in the system operation. In order to show the system performance, we include a campaign of narrow band measurements with 6017 sample points in the surroundings of a cellular base station. A commercial isotropic antenna with three orthogonal axes was used as the sampling device. The results are analyzed in terms of their spatial average, standard deviation and statistical distribution.
Keywords: measurement device, propagation, spatial sampling.
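As a small illustration of the summary statistics mentioned above (spatial average, standard deviation, and empirical distribution of the sampled E-field), assuming the 6017 samples are available as a one-dimensional array in V/m; the values below are synthetic placeholders, not the measured campaign data.

```python
import numpy as np

# Placeholder for the measured samples (V/m); replace with the real campaign data
rng = np.random.default_rng(42)
e_field = rng.rayleigh(scale=0.12, size=6017)

spatial_average = e_field.mean()
std_deviation = e_field.std(ddof=1)

# Empirical distribution: histogram with 30 bins
counts, bin_edges = np.histogram(e_field, bins=30)

print(f"spatial average = {spatial_average:.4f} V/m")
print(f"standard deviation = {std_deviation:.4f} V/m")
```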
8288 Real Time Acquisition and Psychoacoustic Analysis of Brain Wave
Authors: Shweta Singh, Dipali Bansal, Rashima Mahajan
Abstract:
Psychoacoustics has become a potential area of research due to the growing interest of both laypersons and medical and mental health professionals. Non-invasive brain-computer interfaces such as electroencephalography (EEG) are widely used in this field. An attempt has been made in this paper to examine the response of EEG signals to acoustic stimuli and to further analyze the brain's electrical activity. Real-time EEG is acquired for 6 participants using a cost-effective and portable EMOTIV EEG neuro headset. EEG data analysis is then carried out using the EMOTIV test bench, EDF browser and EEGLAB (a MATLAB tool) software platforms. Spectral analysis of the acquired neural signals (AF3 channel) using these software platforms is clearly indicative of increased brain activity in various bands. The inferences drawn from such an analysis show significant correlation with the subjects' subjective reporting of the experiences. The results suggest that the methodology adopted can further be used to assist patients with sleeping and depressive disorders.
Keywords: 'OM' chant, Spectral analysis, EDF Browser, EEGLAB, EMOTIV, Real time Acquisition.
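A minimal sketch of band-wise spectral analysis of a single EEG channel, in the spirit of the AF3-channel analysis described above; it uses SciPy's Welch estimator on a synthetic signal, and the sampling rate and band edges are assumptions for illustration, not values taken from the paper.

```python
import numpy as np
from scipy.signal import welch

fs = 128                      # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)  # one minute of synthetic AF3-like data
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(signal, fs=fs, nperseg=4 * fs)

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    # Band power via the trapezoidal rule over the PSD
    power = np.trapz(psd[mask], freqs[mask])
    print(f"{name:6s} band power: {power:.4f}")
```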
8287 Evaluating the Understanding of the University Students (Basic Sciences and Engineering) about the Numerical Representation of the Average Rate of Change
Authors: Saeid Haghjoo, Ebrahim Reyhani, Fahimeh Kolahdouz
Abstract:
The present study aimed to evaluate the understanding of students in Tehran universities (Iran) of the numerical representation of the average rate of change, based on the Structure of Observed Learning Outcomes (SOLO). In this descriptive-survey research, the statistical population included undergraduate students (basic sciences and engineering) in the universities of Tehran. The sample comprised 604 students selected by random multi-stage clustering. The measurement tool was a task whose face and content validity was confirmed by mathematics and mathematics education professors. Using Cronbach's alpha criterion, the reliability coefficient of the task was found to be 0.95, which confirmed its reliability. The collected data were analyzed with descriptive and inferential statistics (chi-squared and independent t-tests) using SPSS 24 software. According to the SOLO model, at the prestructural, unistructural, and multistructural levels basic science students had a higher percentage of understanding than engineering students, although the outcome was reversed at the relational level. However, there was no significant difference in the average understanding of the two groups. The results indicated that students failed to develop a proper understanding of the numerical representation of the average rate of change, in addition to showing misconceptions when using physics formulas to solve the problem. In addition, multiple solutions, along with their dominant methods, were identified during the qualitative analysis. The current research proposes that teachers and professors focus on context problems with approximate calculations and numerical representation, use software, and connect common relations between mathematics and physics in the teaching process.
Keywords: Average rate of change, context problems, derivative, numerical representation, SOLO taxonomy.
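A short sketch of the Cronbach's alpha reliability coefficient reported above (0.95), computed from a respondents-by-items score matrix; the scores below are placeholders, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    sum_item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - sum_item_var / total_var)

# Placeholder matrix: 604 respondents x 5 task items scored 0-4
rng = np.random.default_rng(7)
ability = rng.normal(2, 1, size=(604, 1))
scores = np.clip(np.round(ability + rng.normal(0, 0.5, size=(604, 5))), 0, 4)

print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```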
8286 Combined Analysis of Sudoku Square Designs with Same Treatments
Authors: A. Danbaba
Abstract:
Several experiments are conducted in different environments, such as locations or periods (seasons), with identical treatments in each experiment, purposely to study the interaction between the treatments and environments or between the treatments and periods (seasons). The commonly used designs of experiments for this purpose are the randomized block design, Latin square design, balanced incomplete block design, Youden design, and one or more factor designs. The interest is to carry out a combined analysis of the data from these multi-environment experiments, instead of analyzing each experiment separately. This paper proposes a combined analysis of experiments conducted via Sudoku square designs of odd order with the same experimental treatments.
Keywords: Sudoku designs, combined analysis, multi-environment experiments, common treatments.
8285 Viability of Bradyrhizobium japanicum on Soybean Seeds Enhanced by Magnetite Nanoparticles during Desiccation
Authors: M. R. Ghalamboran, J. J. Ramsden
Abstract:
The aim of this study was to investigate whether magnetite nanoparticles affect the viability of Bradyrhizobium japanicum cells residing on the surface of soybean seeds during desiccation. Different concentrations of nanoparticles suspended in liquid medium, mixed with and adhering to Bradyrhizobium japanicum, were investigated at two temperatures, using both soybean seeds and glass beads as surrogates. The statistical design was a complete randomized block (CRB) in a 6×2×2×6 factorial arrangement with four replications. The most important variable was the viability of Bradyrhizobium on the surface of the seeds. The nanoparticles increased Bradyrhizobium viability, and inoculated seeds stored at low temperature had greater viability when nanoparticles had been added. At the optimum nanoparticle concentration, 50% bacterium viability on the seeds was retained after 5 days at 4ºC. Possible explanations for the observed effects are proposed.
Keywords: Bradyrhizobium japanicum, magnetite nanoparticles, soybean seed, viability.
8284 Average Current Estimation Technique for Reliability Analysis of Multiple Semiconductor Interconnects
Authors: Ki-Young Kim, Jae-Ho Lim, Deok-Min Kim, Seok-Yoon Kim
Abstract:
Average current analysis, which checks the impact of current flow, is very important to guarantee the reliability of semiconductor systems. As semiconductor process technologies improve, coupling capacitances often become larger than self-capacitances. In this paper, we propose an analytic technique for analyzing the average current on interconnects in multi-conductor structures. The proposed technique has been shown to yield acceptable errors compared to HSPICE results while providing computational efficiency.
Keywords: current moment, interconnect modeling, reliability analysis, worst-case switching.
8283 Brand Placement Strategies in Turkey: The Case of “Yalan Dünya”
Authors: Burçe Boyraz
Abstract:
This study examines appearances of brand placement as an alternative communication strategy in television series by focusing on Yalan Dünya, one of the most popular television series in Turkey. The study has a descriptive research design, and the quantitative content analysis method is used in order to analyze frequency and time data of brand placement appearances in the first 3 seasons of Yalan Dünya, comprising 16 episodes. The analysis of brand placement practices in Yalan Dünya is carried out in three categories: episode-based analysis, season-based analysis and comparative analysis. Finally, brand placement practices in Yalan Dünya are evaluated in terms of type, form, duration and legal arrangements. As a result of this study, it is seen that brand placement plays a determinant role in Yalan Dünya content. Also, current legal arrangements bring brand placement closer to other traditional communication strategies instead of distinctly differentiating it from them.
Keywords: Advertising, Alternative communication strategy, Brand placement, Yalan Dünya.
8282 Generalized Mean-field Theory of Phase Unwrapping via Multiple Interferograms
Authors: Yohei Saika
Abstract:
On the basis of Bayesian inference using the maximizer of the posterior marginal estimate, we carry out phase unwrapping using multiple interferograms via generalized mean-field theory. Numerical calculations for a typical wave-front in remote sensing using synthetic aperture radar interferometry, together with the phase diagram in hyper-parameter space, clarify that the present method succeeds in phase unwrapping perfectly under the constraint of the surface-consistency condition, if the interferograms are not corrupted by any noise. Also, we find that the prior is useful for extending the region in which phase unwrapping succeeds under the constraint of the surface-consistency condition. These results are quantitatively confirmed by Monte Carlo simulation.
Keywords: Bayesian inference, generalized mean-field theory, phase unwrapping, statistical mechanics.
8281 Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry
Authors: Campisi Domenico, Costa Roberta
Abstract:
Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing an effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of intellectual capital. Following this lead, we propose an efficiency and productivity analysis of intellectual capital as a determinant factor of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful strategies of intellectual capital management, and offer inefficient companies development paths by means of benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
Keywords: Data Envelopment Analysis, Innovation, Intangible assets, Intellectual Capital, Malmquist Productivity Index.
8280 Extraction of Significant Phrases from Text
Authors: Yuan J. Lui
Abstract:
Prospective readers can quickly determine whether a document is relevant to their information need if the significant phrases (or keyphrases) in the document are provided. Although keyphrases are useful, not many documents have keyphrases assigned to them, and manually assigning keyphrases to existing documents is costly. Therefore, there is a need for automatic keyphrase extraction. This paper introduces a new domain-independent keyphrase extraction algorithm. The algorithm approaches the problem of keyphrase extraction as a classification task, and uses a combination of statistical and computational linguistics techniques, a new set of attributes, and a new machine learning method to distinguish keyphrases from non-keyphrases. The experiments indicate that this algorithm performs better than other keyphrase extraction tools and that it significantly outperforms Microsoft Word 2000's AutoSummarize feature. The domain independence of this algorithm has also been confirmed in our experiments.
Keywords: classification, keyphrase extraction, machine learning, summarization
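A toy sketch of the "keyphrase extraction as classification" framing described above: candidate phrases are scored with simple statistical features and a generic classifier. The features, the logistic-regression choice, and the tiny training set are illustrative assumptions, not the paper's new attribute set or its machine learning method.

```python
from sklearn.linear_model import LogisticRegression

def phrase_features(phrase, doc):
    """Simple statistical features for a candidate phrase (illustrative only)."""
    freq = doc.lower().count(phrase.lower())
    first_pos = doc.lower().find(phrase.lower()) / max(len(doc), 1)
    return [freq, first_pos, len(phrase.split())]

doc = "Statistical analysis of keyphrase extraction. Keyphrase extraction helps readers."
candidates = ["keyphrase extraction", "statistical analysis", "readers"]
labels = [1, 1, 0]  # toy ground truth: is the candidate a keyphrase?

X = [phrase_features(p, doc) for p in candidates]
clf = LogisticRegression().fit(X, labels)

new_doc = "Readers benefit when keyphrase extraction is automatic."
new_candidates = ["keyphrase extraction", "readers"]
scores = clf.predict_proba([phrase_features(p, new_doc) for p in new_candidates])[:, 1]
for phrase, score in zip(new_candidates, scores):
    print(f"{phrase}: P(keyphrase) = {score:.2f}")
```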
8279 Using Data Mining Techniques for Estimating Minimum, Maximum and Average Daily Temperature Values
Authors: S. Kotsiantis, A. Kostoulas, S. Lykoudis, A. Argiriou, K. Menagias
Abstract:
Estimates of temperature values at a specific time of day, from daytime and daily profiles, are needed for a number of environmental, ecological, agricultural and technical applications, ranging from natural hazard assessments and crop growth forecasting to the design of solar energy systems. The scope of this research is to investigate the efficiency of data mining techniques in estimating minimum, maximum and mean temperature values. For this reason, a number of experiments have been conducted with well-known regression algorithms using temperature data from the city of Patras in Greece. The performance of these algorithms has been evaluated using standard statistical indicators, such as the correlation coefficient, root mean squared error, etc.
Keywords: regression algorithms, supervised machine learning.
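A compact sketch of the evaluation setup described above: a well-known regression algorithm fitted to daily-profile-style data and scored with the correlation coefficient and root mean squared error. The synthetic data and the random-forest choice are assumptions for illustration, not the paper's algorithms or the Patras dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for daily profiles: day of year and hour -> temperature
rng = np.random.default_rng(3)
day = rng.integers(1, 366, size=2000)
hour = rng.integers(0, 24, size=2000)
temp = (18 + 10 * np.sin(2 * np.pi * (day - 100) / 365)
        + 4 * np.sin(2 * np.pi * (hour - 9) / 24) + rng.normal(0, 1.5, 2000))

X = np.column_stack([day, hour])
X_train, X_test, y_train, y_test = train_test_split(X, temp, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)

rmse = mean_squared_error(y_test, pred) ** 0.5
corr = np.corrcoef(y_test, pred)[0, 1]
print(f"RMSE = {rmse:.2f} °C, correlation coefficient = {corr:.3f}")
```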
8278 Enhanced Clustering Analysis and Visualization Using Kohonen's Self-Organizing Feature Map Networks
Authors: Kasthurirangan Gopalakrishnan, Siddhartha Khaitan, Anshu Manik
Abstract:
Cluster analysis is the name given to a diverse collection of techniques that can be used to classify objects (e.g. individuals, quadrats, species, etc.). While Kohonen's Self-Organizing Feature Map (SOFM) or Self-Organizing Map (SOM) networks have been successfully applied as a classification tool to various problem domains, including speech recognition, image data compression, image or character recognition, robot control and medical diagnosis, their potential as a robust substitute for cluster analysis remains relatively unresearched. SOM networks combine competitive learning with dimensionality reduction by smoothing the clusters with respect to an a priori grid and provide a powerful tool for data visualization. In this paper, SOM is used to create a toroidal mapping of a two-dimensional lattice to perform cluster analysis on the results of a chemical analysis of wines produced in the same region in Italy but derived from three different cultivars, referred to as the "wine recognition data" located in the University of California-Irvine database. The results are encouraging, and it is believed that SOM would make an appealing and powerful decision-support tool for clustering tasks and for data visualization.
Keywords: Artificial neural networks, cluster analysis, Kohonen maps, wine recognition.
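A minimal from-scratch sketch of fitting a small SOM grid to the UCI wine recognition data (available through scikit-learn) and reading off each sample's best-matching unit. The grid size, the learning schedule, and the plain (non-toroidal) lattice are simplifying assumptions relative to the paper's toroidal mapping.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(load_wine().data)   # 178 samples x 13 features
rng = np.random.default_rng(0)

rows, cols, n_iter = 6, 6, 3000
weights = rng.normal(size=(rows, cols, X.shape[1]))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for t in range(n_iter):
    lr = 0.5 * (1 - t / n_iter)            # decaying learning rate
    sigma = 3.0 * (1 - t / n_iter) + 0.5   # decaying neighbourhood radius
    x = X[rng.integers(len(X))]
    # Best-matching unit (BMU): node whose weight vector is closest to x
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(dists.argmin(), dists.shape)
    # Gaussian neighbourhood around the BMU on the 2-D lattice
    grid_dist = np.linalg.norm(grid - np.array(bmu), axis=-1)
    h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)

# Map every sample to its BMU; samples from the same cultivar should land nearby
bmus = [np.unravel_index(np.linalg.norm(weights - x, axis=-1).argmin(), (rows, cols))
        for x in X]
print(bmus[:5])
```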
8277 Development of Logic Model for R&D Program Plan Analysis in Preliminary Feasibility Study
Authors: Hyun-Kyu Kang
Abstract:
The Korean Government has applied the preliminary feasibility study to new government R&D program plans as part of an evaluation system for R&D programs. The preliminary feasibility study for an R&D program is composed of 3 major criteria: technological, policy and economic analysis. The program logic model approach is used as part of the technological analysis in the preliminary feasibility study. We have developed and improved the R&D program logic model. The logic model is a very useful tool for evaluating R&D program plans. Using a logic model, we can generally identify important factors of the R&D program plan, analyze its logic flow and find the disconnections or jumps in the logic flow among components of the logic model.
Keywords: Preliminary feasibility study, R&D program logic model, technological analysis.
8276 Two Spherical Three Degrees of Freedom Parallel Robots 3-RCC and 3-RRS Static Analysis
Authors: Alireza Abbasi Moshaii, Mehdi Tale Masouleh, Esmail Zarezadeh, Kamran Farajzadeh
Abstract:
The main purpose of this study is the static analysis of two three-degree-of-freedom parallel mechanisms: 3-RCC and 3-RRS. The geometry of these mechanisms is expressed and static equilibrium equations are derived for the whole chains. For these mechanisms, due to the equal number of equations and unknowns, the solution procedure for the 3-RRS mechanism is the same as that for the 3-RCC mechanism. A mathematical software package is used to solve the equations. In order to verify the results obtained from solving the equations of the mechanisms, the CAD models of these robots have been simulated and their statics analysed in ADAMS software. Although the geometry of the mechanisms is symmetrical, the force and external torque acting on the end-effector have been considered asymmetric to prove the generality of the solution method. Finally, the results of both software packages, for both mechanisms, are extracted and compared as graphs. The good agreement between the results indicates the accuracy of the analysis.
Keywords: Robotics, static analysis, 3-RCC, 3-RRS.
8275 Evaluation on Recent Committed Crypt Analysis Hash Function
Authors: A. Arul Lawrence Selvakumar, C. Suresh Ganandhas
Abstract:
This paper describes a study of cryptographic hash functions, one of the most important classes of primitives used in recent cryptographic techniques. The main aim is the development of recent cryptanalysis of hash functions. We present different approaches to defining security properties more formally and present basic attacks on hash functions. We recall the Merkle-Damgard security properties of iterated hash functions. The main aim of this paper is the development of recent techniques applicable to the cryptanalysis of hash functions, mainly from the SHA family. Recently proposed attacks on MD5 and SHA motivate a new hash function design. It is designed not only to have higher security but also to be faster than SHA-256. The performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against all known cryptographic attacks on hash functions.
Keywords: Crypt Analysis, cryptographic.
8274 Introductory Design Optimisation of a Machine Tool using a Virtual Machine Concept
Authors: Johan Wall, Johan Fredin, Anders Jönsson, Göran Broman
Abstract:
Designing modern machine tools is a complex task. A simulation tool to aid the design work, a virtual machine, has therefore been developed in earlier work. The virtual machine considers the interaction between the mechanics of the machine (including structural flexibility) and the control system. This paper exemplifies the usefulness of the virtual machine as a tool for product development. An optimisation study is conducted aiming at improving the existing design of a machine tool with regard to weight and manufacturing accuracy at maintained manufacturing speed. The problem can be categorised as constrained multidisciplinary multiobjective multivariable optimisation. Parameters of the control and geometric quantities of the machine are used as design variables. This results in a mix of continuous and discrete variables, and an optimisation approach using a genetic algorithm is therefore deployed. The accuracy objective is evaluated according to international standards. The complete system model shows non-deterministic behaviour. A strategy to handle this, based on statistical analysis, is suggested. The weight of the main moving parts is reduced by more than 30 per cent and the manufacturing accuracy is improved by more than 60 per cent compared to the original design, with no reduction in manufacturing speed. It is also shown that interaction effects exist between the mechanics and the control, i.e. this improvement would most likely not have been possible with a conventional sequential design approach within the same time, cost and general resource frame. This indicates the potential of the virtual machine concept for contributing to improved efficiency of both complex products and the development process for such products. Companies incorporating such advanced simulation tools in their product development could thus improve their own competitiveness as well as contribute to improved resource efficiency of society at large.
Keywords: Machine tools, Mechatronics, Non-deterministic, Optimisation, Product development, Virtual machine.
8273 Regional Analysis of Streamflow Drought: A Case Study for Southwestern Iran
Authors: M. Byzedi, B. Saghafian
Abstract:
Droughts are complex natural hazards that, to a varying degree, affect some parts of the world every year. The range of drought impacts is related to drought occurring in different stages of the hydrological cycle, and usually different types of droughts, such as meteorological, agricultural, hydrological, and socioeconomic, are distinguished. Streamflow drought was analyzed by the truncation level method (at the 70% level) on daily discharges measured at 54 hydrometric stations in southwestern Iran. Frequency analysis was carried out for the annual maximum series (AMS) of drought deficit volume and duration. Factors including physiographic, climatic, geologic, and vegetation cover characteristics were studied as influential factors in the regional analysis. According to the results of factor analysis, the six most effective factors were identified as area, rainfall from December to February, the percentage of area with Normalized Difference Vegetation Index (NDVI) < 0.1, the percentage of convex area, drainage density and the minimum watershed elevation, which together explained 90.9% of the variance. The homogeneous regions were determined by cluster analysis and discriminant function analysis. Suitable multivariate regression models were evaluated for streamflow drought deficit volume with a 2-year return period. The significance level of the regression models was 0.01. The results showed that the watershed area is the most effective factor, with a high correlation with deficit volume. Also, drought duration was not a suitable drought index for regional analysis.
Keywords: Iran, streamflow drought, truncation level method, regional analysis.
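A short sketch of the truncation level method mentioned above: daily discharges are compared against a threshold exceeded 70% of the time (i.e. the 30th percentile of the flow record), and contiguous below-threshold runs give drought duration and deficit volume. The synthetic discharge series is a placeholder, not data from the 54 stations.

```python
import numpy as np

rng = np.random.default_rng(5)
discharge = rng.gamma(shape=2.0, scale=5.0, size=3650)   # placeholder daily flows (m^3/s)

# Truncation level at 70%: the flow exceeded on 70% of days (30th percentile)
q70 = np.percentile(discharge, 30)

droughts = []          # (duration in days, deficit volume in m^3)
duration, deficit = 0, 0.0
for q in discharge:
    if q < q70:
        duration += 1
        deficit += (q70 - q) * 86400   # daily deficit volume
    elif duration > 0:
        droughts.append((duration, deficit))
        duration, deficit = 0, 0.0
if duration > 0:
    droughts.append((duration, deficit))

max_duration = max(d for d, _ in droughts)
max_deficit = max(v for _, v in droughts)
print(f"events: {len(droughts)}, max duration: {max_duration} d, max deficit: {max_deficit:.0f} m^3")
```

The annual maxima of these per-event durations and deficit volumes would then form the AMS used in the frequency analysis.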
8272 Power System Contingency Analysis Using Multiagent Systems
Authors: Anant Oonsivilai, Kenedy A. Greyson
Abstract:
The demands set forth by modern power systems require fast energy management systems (EMS). Contingency analysis is among the functions in an EMS that are time-consuming. In order to handle this limitation, this paper introduces agent-based technology into contingency analysis. The main function of the agents is to speed up the performance. The negotiation process in decision making is explained, and the issue addressed is the minimization of operating costs. The IEEE 14-bus system and its line outages have been used in the research, and simulation results are presented.
Keywords: Agents, model, negotiation, optimal dispatch, power systems.
8271 An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement
Authors: Guomin Luo, Daming Zhang, Yong Kwee Koh, Kim Teck Ng, Helmi Kurniawan, Weng Hoe Leong
Abstract:
Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and cause no interruption to operation, are preferred in on-site PD detection. However, this approach often lacks accuracy due to interferences in the PD signals. In this paper, a novel PD extraction method that uses frequency analysis and entropy-based time-frequency (TF) analysis is introduced. The repetitive pulses from the converter are first removed via frequency analysis. Then, the relative entropy and relative peak frequency of each pulse (i.e. its time-indexed TF spectrum vector) are calculated, and all pulses with similar parameters are grouped. According to the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the pulses of PD and interferences are separated. Finally, the PD signal and the interferences are recovered via the inverse TF transform. The de-noising result on noisy PD data demonstrates that the combination of frequency and time-frequency techniques can discriminate PDs from interferences with various frequency distributions.
Keywords: Entropy, Fourier analysis, non-intrusive measurement, time-frequency analysis, partial discharge.
8270 Identifying Relationships between Technology-based Services and ICTs: A Patent Analysis Approach
Authors: Chulhyun Kim, Seungkyum Kim, Moon-soo Kim
Abstract:
A variety of new technology-based services have emerged with the development of Information and Communication Technologies (ICTs). Since technology-based services have technology-driven characteristics, identifying the relationships between technology-based services and ICTs would yield meaningful implications. Thus, this paper proposes an approach for identifying the relationships between technology-based services and ICTs by analyzing patent documents. First, business model (BM) patents are classified into relevant service categories. Second, patent citation analysis is conducted to investigate the technological linkage and impacts between technology-based services and ICTs at the macro level. Third, as a micro-level analysis, patent co-classification analysis is employed to identify the technological linkage and coverage. The proposed approach could guide and help managers and designers of technology-based services to discover opportunities for the development of new technology-based services in emerging service sectors.
Keywords: Technology-based Services, Information and Communication Technology (ICT), Business Model (BM) Patent, Patent Analysis, Technological Relationship.
8269 Time Series Forecasting Using Independent Component Analysis
Authors: Theodor D. Popescu
Abstract:
The paper presents a method for multivariate time series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to do the forecasting in the space of independent components (sources), and then to transform the results back to the original time series space. The forecasting can be done separately and with a different method for each component, depending on its time structure. The paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and higher-order statistics. The method has been applied in simulation to an artificial multivariate time series with five components, generated from three sources and a randomly generated mixing matrix.
Keywords: Independent Component Analysis, second order statistics, simulation, time series forecasting.
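A compact sketch of the forecast-in-source-space idea described above, using scikit-learn's FastICA and a simple per-component AR(1) step as a stand-in for whichever forecasting method suits each source; the synthetic mixtures and the AR(1) choice are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.arange(1000)
sources = np.column_stack([np.sin(t / 20), np.sign(np.sin(t / 7)), rng.laplace(size=t.size)])
mixing = rng.normal(size=(5, 3))
series = sources @ mixing.T          # five observed series generated from three sources

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(series)        # estimated sources

# Forecast each source one step ahead with a simple AR(1) fit (illustrative only)
next_S = np.empty(S.shape[1])
for k in range(S.shape[1]):
    x = S[:, k]
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    next_S[k] = phi * x[-1]

# Transform the source-space forecast back to the original series space
next_series = next_S @ ica.mixing_.T + ica.mean_
print(next_series)
```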
8268 Role of Association Rule Mining in Numerical Data Analysis
Authors: Sudhir Jagtap, Kodge B. G., Shinde G. N., Devshette P. M
Abstract:
Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century, the life sciences and even the arts have adopted elements of scientific computation. Numerical data analysis has become a key process in research and development in all fields [6]. In this paper we have made an attempt to analyze specified numerical patterns using association rule mining techniques with minimum confidence and minimum support mining criteria. The extracted rules and analyzed results are graphically demonstrated. Association rules are a simple but very useful form of data mining that describes the probabilistic co-occurrence of certain events within a database [7]. They were originally designed to analyze market-basket data, in which the likelihood of items being purchased together within the same transaction is analyzed.
Keywords: Numerical data analysis, Data Mining, Association Rule Mining.
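A small sketch of the minimum-support and minimum-confidence criteria named above, computed directly over a toy transaction list in the market-basket style the abstract mentions; the thresholds and transactions are placeholders, not the paper's numerical dataset.

```python
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
    {"bread", "butter", "milk"},
]
min_support, min_confidence = 0.4, 0.7
n = len(transactions)

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / n

items = sorted(set().union(*transactions))
# Check all 2-item rules A -> B against the mining criteria
for a, b in combinations(items, 2):
    for lhs, rhs in [({a}, {b}), ({b}, {a})]:
        supp = support(lhs | rhs)
        conf = supp / support(lhs)
        if supp >= min_support and conf >= min_confidence:
            print(f"{lhs} -> {rhs}: support={supp:.2f}, confidence={conf:.2f}")
```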
8267 Embedded Systems Energy Consumption Analysis Through Co-modelling and Simulation
Authors: José Antonio Esparza Isasa, Finn Overgaard Hansen, Peter Gorm Larsen
Abstract:
This paper presents a new methodology for studying power and energy consumption in mechatronic systems early in the development process. This new approach makes use of two modelling languages to represent and simulate embedded control software and electromechanical subsystems in the discrete-event and continuous-time domains, respectively, within a single co-model. This co-model enables an accurate representation of power and energy consumption and facilitates the analysis and development of both software and electromechanical subsystems in parallel. This makes engineers aware of the energy-wise implications of different design alternatives and enables early trade-off analysis from the beginning of the analysis and design activities.
Keywords: Energy consumption, embedded systems, model-driven engineering, power awareness.
8266 Wavelet-Based ECG Signal Analysis and Classification
Authors: Madina Hamiane, May Hashim Ali
Abstract:
This paper presents the processing and analysis of ECG signals. The study is based on the wavelet transform and uses exclusively the MATLAB environment. The study includes removing baseline wander and further de-noising through the wavelet transform, and metrics such as the signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and mean squared error (MSE) are used to assess the efficiency of the de-noising techniques. Feature extraction is subsequently performed, whereby signal features such as heart rate and rise and fall levels are extracted, and the QRS complex is detected, which helps in classifying the ECG signal. Classification is the last step in the analysis of the ECG signals, and it is shown that these are successfully classified as normal rhythm or abnormal rhythm. The final result proves the adequacy of using the wavelet transform for the analysis of ECG signals.
Keywords: ECG Signal, QRS detection, thresholding, wavelet decomposition, feature extraction.
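Although the study itself uses MATLAB exclusively, a brief Python sketch of the same wavelet de-noising and SNR/MSE evaluation steps is given below, using PyWavelets on a toy signal; the wavelet, decomposition level, and threshold rule are illustrative assumptions, not the paper's settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0, 2, 720)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 12 * t)  # toy ECG-like signal
noisy = clean + 0.2 * rng.standard_normal(t.size)

# Wavelet decomposition, soft-threshold the detail coefficients, reconstruct
coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest details
threshold = sigma * np.sqrt(2 * np.log(noisy.size))     # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: noisy.size]

mse = np.mean((clean - denoised) ** 2)
snr = 10 * np.log10(np.sum(clean ** 2) / np.sum((clean - denoised) ** 2))
print(f"MSE = {mse:.5f}, SNR = {snr:.1f} dB")
```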