Search results for: rational points on elliptic curves.
299 Dimension Reduction of Microarray Data Based on Local Principal Component
Authors: Ali Anaissi, Paul J. Kennedy, Madhu Goyal
Abstract:
Analysis and visualization of microarray data is very helpful for biologists and clinicians in the field of diagnosis and treatment of patients. It allows clinicians to better understand the structure of microarray data and facilitates understanding of gene expression in cells. However, a microarray dataset is a complex data set with thousands of features and a very small number of observations. This very high dimensional data set often contains noise, non-useful information and only a small number of features relevant to disease or genotype. This paper proposes a non-linear dimensionality reduction algorithm, Local Principal Component (LPC), which aims to map high dimensional data to a lower dimensional space. The reduced data represents the most important variables underlying the original data. Experimental results and comparisons are presented to show the quality of the proposed algorithm. Moreover, experiments also show how this algorithm reduces high dimensional data whilst preserving the neighbourhoods of the points in the low dimensional space as in the high dimensional space.
Keywords: Linear Dimension Reduction, Non-Linear Dimension Reduction, Principal Component Analysis, Biologists.
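The LPC algorithm itself is not reproduced in this listing; as a rough point of reference only, the sketch below reduces a wide microarray-style matrix with plain PCA in scikit-learn. The data shape, the component count and the use of linear PCA (rather than the proposed local, non-linear variant) are all assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical microarray-style data: few samples, thousands of gene features.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5000))          # 60 observations x 5000 features (assumed shape)

# Standardize features, then project to a low-dimensional space.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=10)               # component count chosen arbitrarily
X_low = pca.fit_transform(X_std)

print(X_low.shape)                       # (60, 10)
print(pca.explained_variance_ratio_.sum())
```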
298 Image Contrast Enhancement based Sub-histogram Equalization Technique without Over-equalization Noise
Authors: Hyunsup Yoon, Youngjoon Han, Hernsoo Hahn
Abstract:
In order to enhance the contrast in regions where the pixels have similar intensities, this paper presents a new histogram equalization scheme. Conventional global equalization schemes over-equalize these regions, producing pixels that are too bright or too dark, while local equalization schemes produce unexpected discontinuities at the boundaries of the blocks. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram within limited extents of equalization, considering its mean and variance. The final image is determined as the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on individual sub-histograms, the over-equalization effect is eliminated. The resulting image also does not lose feature information in low-density histogram regions, since these remaining areas are equalized separately. The paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested with more than 100 images of various contrasts, and the results are compared to conventional approaches to show its superiority.
Keywords: Contrast Enhancement, Histogram Equalization, Histogram Region Equalization, Equalization Noise
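To make the idea of range-limited sub-histogram equalization concrete, here is a deliberately simplified sketch: the histogram is split at the mean brightness and each part is equalized only within its own intensity range. The mean-based split and the absence of variance-based limiting or weighted recombination are simplifications, not the paper's exact scheme.

```python
import numpy as np

def sub_histogram_equalize(img, n_bins=256):
    """Simplified sub-histogram equalization: split the histogram at the mean
    brightness and equalize each part only within its own intensity range."""
    img = img.astype(np.uint8)
    mean = int(img.mean())
    out = np.empty_like(img)
    for lo, hi, mask in [(0, mean, img <= mean), (mean + 1, n_bins - 1, img > mean)]:
        vals = img[mask]
        if vals.size == 0:
            continue
        hist, _ = np.histogram(vals, bins=np.arange(lo, hi + 2))
        cdf = hist.cumsum() / vals.size
        lut = (lo + cdf * (hi - lo)).astype(np.uint8)   # map only onto [lo, hi]
        out[mask] = lut[vals.astype(int) - lo]
    return out

rng = np.random.default_rng(1)
img = rng.normal(100, 10, size=(64, 64)).clip(0, 255)   # low-contrast test image
enhanced = sub_histogram_equalize(img)
print(int(img.std()), int(enhanced.std()))              # contrast should increase
```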
297 Implementation of Student-Centered Learning Approach in Building Surveying Course
Authors: Amal A. Abdel-Sattar
Abstract:
The curriculum of the architecture department at Prince Sultan University includes a 'Building Surveying' course, which is usually part of the civil engineering courses. As a fundamental requirement, the course demands a strong background in mathematics and physics, which are not usually preferred subjects for architecture students, and many of them do not give these subjects the necessary attention during their preparation year before commencing their architectural study. This paper introduces the concept and methodology of the student-centered learning approach in the building surveying course for architects. One of the major outcomes is the improvement in the students' involvement in the course and how this covers and strengthens their analytical weak points and improves their mathematical skills. The study was conducted over three semesters with a total of 99 students. The effectiveness of the student-centered learning approach is studied using a student survey at the end of each semester and teacher observations. The survey showed great acceptance of these methods by the students. The teachers also observed a great improvement in the students' mathematical abilities and how much keener they became to attend classes, which was clearly reflected in the low absence record.
Keywords: Architecture, building surveying, student-centered learning, teaching, and learning.
296 The Use of Mnemonic and Mathematical Mnemonic Method in Improving Historical Understanding
Authors: Lee Bih Ni, Nurul Asyikin Binti Hassan
Abstract:
This paper discusses the use of mnemonic and mathematical mnemonic methods in enhancing the understanding of history. Mnemonics can help students at all levels, including high school, and in various disciplines including language, mathematics and history. At the secondary level, students are exposed to various courses that require them to remember many facts, which can be mastered through the application of mnemonic techniques. The researchers use a narrative literature review to illustrate the current state of the art in the focused field of research and to build a scientific base of knowledge. They gather all the key points of the discussion and present them here with reference to the specific field on which the paper is based. The findings suggest that the use of mnemonic techniques can improve an individual's memory with little added effort. In implementing mnemonic techniques, it is important to integrate mathematics and history in the course, as the two are interconnected: mathematics has shaped our history and vice versa. This study shows that memory skills can actually be improved; the human mind can remember more than expected.
Keywords: Cognitive strategy, mnemonic technique, secondary school level study, mathematical mnemonic.
295 Automated Thickness Measurement of Retinal Blood Vessels for Implementation of Clinical Decision Support Systems in Diagnostic Diabetic Retinopathy
Authors: S. Jerald Jeba Kumar, M. Madheswaran
Abstract:
The structure of the retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. An analysis of the vascular structure of the retina, for the implementation of a clinical diabetic retinopathy decision-making system, is presented in this paper. The retinal vascular structure consists of thin blood vessels, so the accuracy of the measurement is highly dependent on vessel segmentation. In this paper, blood vessel thickness is automatically detected using preprocessing techniques and a vessel segmentation algorithm. First the captured image is binarized to extract the blood vessel structure clearly, then it is skeletonised to obtain the overall structure with all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
Keywords: Diabetic retinopathy, Binarization, Segmentation, Clinical Decision Support Systems.
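The binarize-skeletonize-measure pipeline described in the abstract can be sketched generically with scikit-image and SciPy, assuming those libraries are available. Otsu thresholding and reading vessel width as twice the distance transform along the skeleton are stand-in choices, not the authors' segmentation or node-based estimation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def vessel_thickness_map(gray):
    """Rough vessel-thickness estimate: binarize, skeletonize, and read the
    local width from the distance transform along the skeleton."""
    binary = gray < threshold_otsu(gray)          # vessels assumed darker than background
    skeleton = skeletonize(binary)
    dist = distance_transform_edt(binary)         # distance to nearest background pixel
    thickness = 2.0 * dist * skeleton             # approximate local vessel diameter
    return binary, skeleton, thickness

rng = np.random.default_rng(0)
gray = rng.random((64, 64))                       # stand-in for a fundus image patch
binary, skeleton, thickness = vessel_thickness_map(gray)
print(int(skeleton.sum()), round(float(thickness.max()), 2))
```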
294 Statistical Analysis of Stresses in Rigid Pavement
Authors: Aleš Florian, Lenka Ševelová, Rudolf Hela
Abstract:
A complex statistical analysis of the stresses in the concrete slab of a real type of rigid pavement is performed. The computational model of the pavement is designed as a spatial (3D) model, is based on a nonlinear variant of the finite element method that respects the structural nonlinearity, enables different arrangements of joints to be modeled, and the entire model can be loaded by thermal load. The interaction of adjacent slabs in joints and the contact of the slab with the subsequent layer are modeled with the help of special contact elements. Four concrete slabs separated by transverse and longitudinal joints, together with the additional subgrade layers and soil to a depth of about 3 m, are modeled. The thickness of the individual layers, the physical and mechanical properties of the materials, the characteristics of the joints, and the temperature of the upper and lower surface of the slabs are treated as random variables. The modern simulation technique Updated Latin Hypercube Sampling with 20 simulations is used for the statistical analysis. As results, estimates of the basic statistics of the principal stresses σ1 and σ3 at 53 points on the upper and lower surface of the slabs are obtained.
Keywords: concrete, FEM, pavement, simulation.
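A Latin Hypercube design for the random inputs of such a study can be generated as sketched below with SciPy's quasi-Monte Carlo module. The variable list, ranges and names are invented for illustration; only the count of 20 simulations mirrors the abstract.

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube sample of uncertain pavement inputs (illustrative ranges only).
names  = ["slab_thickness_m", "E_concrete_GPa", "subgrade_modulus_MPa", "top_temp_C", "bottom_temp_C"]
lower  = [0.20,               30.0,             50.0,                   25.0,         10.0]
upper  = [0.30,               40.0,             150.0,                  45.0,         20.0]

sampler = qmc.LatinHypercube(d=len(names), seed=1)
unit = sampler.random(n=20)                      # 20 simulations, as in the abstract
samples = qmc.scale(unit, lower, upper)          # map [0,1)^d onto the physical ranges

for row in samples[:3]:
    print(dict(zip(names, np.round(row, 3))))    # each row = one FEM run's inputs
```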
293 Estimation of Human Absorbed Dose Using Compartmental Model
Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri
Abstract:
Dosimetry is an indispensable and precious factor in patient treatment planning to minimize the absorbed dose in vital tissues. In this study, a compartmental model was used to estimate the human absorbed dose of 177Lu-DOTATOC from biodistribution data in wild-type rats. For this purpose, 177Lu-DOTATOC was prepared under optimized conditions and its biodistribution was studied in male Syrian rats up to 168 h. The compartmental model was applied for the mathematical description of the drug behaviour in tissue at different times. Dosimetric estimation of the complex was performed using the radiation absorbed dose assessment resource (RADAR). The biodistribution data showed high accumulation in the adrenal gland and pancreas as the major expression sites for the somatostatin receptor (SSTR). While the kidneys, as the major route of excretion, receive 0.037 mSv/MBq, the pancreas and adrenal gland receive 0.039 and 0.028 mSv/MBq, respectively. With this method, the number of accumulated-activity data points was increased and further information on tissue uptake was collected, leading to improved precision in the dosimetric calculations.
Keywords: Compartmental modeling, human absorbed dose, 177Lu-DOTATOC, Syrian rats.
292 Preliminary Design of Frozen Soil Simulation System Based on Finite Element Simulation
Authors: Wenyu Song, Bingxi Li, Zhongbin Fu, Baocheng Jiang
Abstract:
The Full-Scale Accelerated Loading System, one part of the "Eleventh Five-Year National Grand Technology Infrastructure Program", is a facility to evaluate the performance and service life of different kinds of pavements subjected to traffic loading under a fully controlled environment. When simulating the environments of frigid and permafrost zones, the accurate control of air temperature, road temperature and roadbed temperature are the key points and also the main difficulties of the design. In this paper, numerical simulations are used to determine the design parameters of the frozen soil simulation system. First, a brief introduction to the Full-Scale Accelerated Loading System is given. Then, the temperature control method of the frozen soil simulation system is proposed. Finally, by using finite element simulations, the optimal design of the frozen soil simulation system is obtained. This proposed design, obtained by finite element simulations, provides a significant reference for the final design of the environment simulation system.
Keywords: China, finite element simulation, frozen soil simulation system, preliminary design.
291 PRO-Teaching – Sharing Ideas to Develop Capabilities
Authors: Steve J. Drew, Christopher J. Klopper
Abstract:
In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allow objective reporting on development differentials in constructive alignment, peer, and student evaluations. Further innovation is realized in the application of this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.
Keywords: Development loop, Multiple data sources, Objective reporting, Peer review of teaching.
290 Performance Comparison of Particle Swarm Optimization with Traditional Clustering Algorithms used in Self-Organizing Map
Authors: Anurag Sharma, Christian W. Omlin
Abstract:
The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in data sets through data visualization that is otherwise hard to detect from the raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters of code vectors found by SOM, but they generally do not take into account the distribution of code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of an adaptive heuristic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOM. The application of our method to several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably to boundary detection with the traditional algorithms, namely k-means and a hierarchical approach, which are normally used to interpret the output of SOM.
Keywords: cluster boundaries, clustering, code vectors, data mining, particle swarm optimization, self-organizing maps, U-matrix.
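For readers unfamiliar with PSO, a minimal global-best variant is sketched below on a toy objective. It is not the authors' adaptive heuristic algorithm, and it does not operate on a SOM U-matrix; it only shows the velocity and position update machinery that any such method builds on.

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)                                  # position update
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        if vals.min() < gbest_val:
            gbest_val, gbest = vals.min(), x[vals.argmin()].copy()
    return gbest, gbest_val

# Toy objective: sphere function in 2D.
best_x, best_f = pso_minimize(lambda p: float(np.sum(p ** 2)), bounds=[(-5, 5), (-5, 5)])
print(np.round(best_x, 4), round(best_f, 6))
```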
289 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers
Authors: Quanru Pan
Abstract:
How a business should set its service speed to make the largest profit is a problem worth studying, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system on arrival; the mean number of customers who leave the system because of impatience; the loss probability for customers who do not join the queue due to the limited capacity of the system; and several other indicators. The paper also shows that the following claim is not correct: the more customers the business serves, the more profit it will make. Finally, the paper points out the appropriate service speed the business should keep in order to make the largest profit.
Keywords: variable input rates, impatient customers, variable service rates, profit maximization.
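As a baseline for the model discussed here, the stationary distribution of the classical M/M/1/N queue with constant rates is a simple truncated geometric, p_n proportional to (λ/μ)^n for n = 0..N. The sketch below computes it; the paper's state-dependent rates and impatient customers generalize this constant-rate case.

```python
import numpy as np

def mm1n_stationary(lam, mu, N):
    """Stationary distribution of the classical M/M/1/N queue with constant
    arrival rate lam and service rate mu: p_n proportional to (lam/mu)**n."""
    rho = lam / mu
    weights = rho ** np.arange(N + 1)
    return weights / weights.sum()

p = mm1n_stationary(lam=0.8, mu=1.0, N=10)
blocking = p[-1]                                   # arriving customer finds the system full
mean_in_system = (np.arange(len(p)) * p).sum()
print(round(blocking, 4), round(mean_in_system, 3))
```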
288 A Bi-Objective Stochastic Mathematical Model for Agricultural Supply Chain Network
Authors: Mohammad Mahdi Paydar, Armin Cheraghalipour, Mostafa Hajiaghaei-Keshteli
Abstract:
Nowadays, in advanced countries, agriculture, as one of the most significant sectors of the economy, plays an important role in political and economic independence. Due to farmers' lack of information about product demand and the lack of proper planning for harvest time, a considerable amount of product spoils every year. In this paper, we attempt to improve these unfavorable conditions by designing an effective supply chain network that minimizes the total costs of agricultural products along with minimizing the shortage at demand points. To validate the proposed model, a stochastic optimization approach using the branch-and-bound solver of the LINGO software is utilized. Furthermore, to collect the parameter data, a case study in Mazandaran province, in the north of Iran, has been used. Finally, using the ɛ-constraint approach, a Pareto front is obtained and one of its Pareto solutions is selected as the best solution. The related results of this solution are then explained, and conclusions and suggestions for future research are presented.
Keywords: Perishable products, stochastic optimization, agricultural supply chain, ɛ-constraint.
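The ɛ-constraint idea (optimize one objective while bounding the other, then sweep the bound to trace a Pareto front) can be shown on a toy linear program with scipy.optimize.linprog. The coefficients below are invented and have nothing to do with the paper's agricultural model; only the mechanics of the method are illustrated.

```python
import numpy as np
from scipy.optimize import linprog

c_cost     = np.array([2.0, 3.0])        # objective 1: total cost (minimized)
c_shortage = np.array([4.0, 1.0])        # objective 2: shortage proxy (bounded by eps)
A_ub = np.array([[-1.0, -1.0]])          # x1 + x2 >= 5  ->  -x1 - x2 <= -5
b_ub = np.array([-5.0])

pareto = []
for eps in np.linspace(6.0, 20.0, 8):    # sweep the bound on objective 2
    A = np.vstack([A_ub, c_shortage])    # add constraint c_shortage @ x <= eps
    b = np.append(b_ub, eps)
    res = linprog(c_cost, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
    if res.success:
        pareto.append((res.x, c_cost @ res.x, c_shortage @ res.x))

for x, f1, f2 in pareto:
    print(np.round(x, 3), round(f1, 3), round(f2, 3))
```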
287 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments
Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing
Abstract:
Simulation modeling can be used to solve real-world problems and provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required to be able to capture the surface characteristics. This paper presents the experimental design and algorithm used to model the process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples helps capture the surface curvature characteristics. A suitable number of LHS sample points should be considered in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
Keywords: Central composite design, CO2 liquefaction, Latin Hypercube Sampling, simulation-based optimization.
286 The Effects of Cross-Border Use of Drones in Nigerian National Security
Authors: H. P. Kerry
Abstract:
Drone technology has become a significant topic in national security discourse. On the one hand, this technology can constitute a danger to national security; on the other, it is used in developed and developing countries for border security and, in some cases, for the protection of security agents and migrants. In the case of Nigeria, drones are used by the military to monitor and tighten security around the borders. However, terrorist groups have devised means to use the technology to their advantage, so the widespread proliferation of this technology brings a myriad of risks. This research on the effects of cross-border use of drones on Nigerian national security looks at the negative and positive consequences of using drone technology. The study employs interviews and relevant documents to obtain data, applies Just War theory to justify why countries use force, and further buttresses these points with the realist view of the use of force. In conclusion, the paper recommends that the Nigerian government, through the National Assembly, should pass a bill establishing a law to guide the use of armed and unarmed drones in Nigeria, enforced by the Nigeria Civil Aviation Authority and the office of the National Security Adviser.
Keywords: Armed drones, cross-border, drones, national security.
285 Mode III Interlaminar Fracture in Woven Glass/Epoxy Composite Laminates
Authors: Farhad Asgari Mehrabadi, Mohammad Reza Khoshravan
Abstract:
In the present study, the fracture behavior of woven fabric-reinforced glass/epoxy composite laminates under mode III crack growth was experimentally investigated and numerically modeled. Two methods were used for the calculation of the strain energy release rate: the experimental compliance calibration (CC) method and the Virtual Crack Closure Technique (VCCT). To this end, the ECT (Edge Crack Torsion) test was used to evaluate fracture toughness under mode III loading (out-of-plane shear) at different crack lengths. Load-displacement curves and the associated energy release rates were obtained for the various cases of interest. To calculate the fracture toughness JIII, two criteria were considered, the onset of non-linearity and the maximum point of the load-displacement curve, and it was observed that JIII increases as the crack length increases. Both the experimental compliance method and the virtual crack closure technique proved applicable for the interpretation of the fracture mechanics data of woven glass/epoxy laminates in mode III.
Keywords: Mode III, Fracture, Composite, Crack growth, Finite Element.
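The compliance calibration step can be sketched generically: fit compliance C(a) against crack length a and evaluate G = P^2/(2B) dC/da at the critical load. The numbers below are invented for illustration and are not the paper's ECT measurements; the generic CC relation, not the ECT-specific data reduction, is what is shown.

```python
import numpy as np

a = np.array([20.0, 25.0, 30.0, 35.0, 40.0])       # crack lengths, mm (assumed)
C = np.array([0.010, 0.012, 0.015, 0.019, 0.024])  # compliance, mm/N (assumed)
B = 38.0                                           # specimen width, mm (assumed)
P_crit = 450.0                                     # critical load, N (assumed)

coeffs = np.polyfit(a, C, 2)                       # quadratic fit of C(a)
dCda = np.polyval(np.polyder(coeffs), a)           # derivative at each crack length
G = P_crit**2 * dCda / (2.0 * B)                   # energy release rate, N/mm = kJ/m^2

for ai, gi in zip(a, G):
    print(f"a = {ai:4.1f} mm  G_III ~ {gi:.3f} kJ/m^2")
```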
284 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error
Authors: Oscar Javier Herrera, Manuel Ángel Camacho
Abstract:
This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, for cases in which the historical behavior of the data is random. It also presents error determination based on the numerical technique of propagation of errors. The methodology began with the characterization and diagnosis of demand planning as part of production management; new ways to predict demand through probability techniques and to calculate the associated error were then investigated using numerical methods, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector located in the city of Bogota, Colombia. In conclusion, with this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock that had not rotated for a long time, code its inventory, and plan reorder points for the replenishment of stock.
Keywords: Demand Forecasting, Empirical Distribution, Propagation of Error.
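The two ingredients named in the abstract, a forecast read from the empirical distribution and an error obtained by propagation of errors (for independent variables, σ_q² = (∂f/∂x σ_x)² + (∂f/∂y σ_y)²), can be sketched as below. The demand history, lead time and reorder-point formula are invented for illustration, not the company's data or the authors' exact procedure.

```python
import numpy as np

# 1) Forecast from the empirical distribution of historical demand (assumed data).
history = np.array([120, 135, 128, 140, 150, 132, 145, 138, 129, 142])
forecast = np.quantile(history, 0.5)                 # median of the empirical distribution
service_level_stock = np.quantile(history, 0.95)     # stock covering 95% of observed demand

# 2) Propagation of error for a derived quantity, e.g. reorder point R = d * L.
d, sigma_d = history.mean(), history.std(ddof=1)     # daily demand and its spread
L, sigma_L = 5.0, 0.5                                # lead time and its uncertainty (assumed)
R = d * L
sigma_R = np.sqrt((L * sigma_d) ** 2 + (d * sigma_L) ** 2)
print(round(forecast, 1), round(service_level_stock, 1), round(R, 1), round(sigma_R, 1))
```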
283 A Study about the Distribution of the Spanning Ratios of Yao Graphs
Authors: Maryam Hsaini, Mostafa Nouri-Baygi
Abstract:
A critical problem in wireless sensor networks is the limited battery and memory of the nodes. Therefore, each node in the network can maintain only a subset of its neighbors to communicate with. This increases the battery usage in the network because each packet has to take more hops to reach its destination. In order to tackle these problems, spanner graphs are defined. Since each node has a small degree in a spanner graph and the distance in the graph is not much greater than the actual geographical distance, spanner graphs are suitable candidates for the topology of a wireless sensor network. In this paper, we study Yao graphs and their behavior for randomly selected sets of points. We generate several random point sets and compare the properties of their Yao graphs with those of the complete graph. Based on our data sets, we obtain several charts demonstrating how Yao graphs behave for randomly chosen point sets. As the results show, the stretch factor of a Yao graph follows a normal distribution. Furthermore, the stretch factor is on average far less than the worst-case stretch factor proved for Yao graphs in previous results. Finally, we use the Yao graph for a realistic point set and study its stretch factor in a real-world setting.
Keywords: Wireless sensor network, spanner graph, Yao Graph.
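A small experiment in the spirit of this abstract is sketched below, assuming networkx is available: build a Yao graph (connect each point to its nearest neighbour in each of k equal cones) and compute the stretch factor as the worst ratio of graph distance to Euclidean distance. The cone count k = 8 and the point-set size are arbitrary choices, not the paper's experimental settings.

```python
import itertools
import numpy as np
import networkx as nx

def yao_graph(points, k=8):
    """Yao graph: for each point, split the plane into k equal cones and
    connect the point to its nearest neighbour inside each cone."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        diff = pts - pts[i]
        dist = np.hypot(diff[:, 0], diff[:, 1])
        ang = np.arctan2(diff[:, 1], diff[:, 0]) % (2 * np.pi)
        cone = np.minimum((ang // (2 * np.pi / k)).astype(int), k - 1)
        for c in range(k):
            cand = [j for j in range(n) if j != i and cone[j] == c]
            if cand:
                j = min(cand, key=lambda m: dist[m])
                G.add_edge(i, j, weight=dist[j])
    return G

def stretch_factor(points, G):
    """Maximum over all pairs of (graph distance / Euclidean distance)."""
    pts = np.asarray(points, dtype=float)
    sp = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
    worst = 1.0
    for u, v in itertools.combinations(range(len(pts)), 2):
        if v not in sp[u]:
            return float("inf")          # disconnected (not expected here)
        worst = max(worst, sp[u][v] / np.hypot(*(pts[u] - pts[v])))
    return worst

rng = np.random.default_rng(2)
pts = rng.random((60, 2))
print("stretch factor:", round(stretch_factor(pts, yao_graph(pts)), 3))
```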
282 Identification, Prediction and Detection of the Process Fault in a Cement Rotary Kiln by Locally Linear Neuro-Fuzzy Technique
Authors: Masoud Sadeghian, Alireza Fatehi
Abstract:
In this paper, we use a nonlinear system identification method to predict and detect process faults in a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To identify the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained by the LOLIMOT algorithm, which is an incremental tree-structure algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln. One of the models is for the normal condition of the kiln, with a 15-minute prediction horizon. The other two models are for the two faulty situations in the kiln, with a 7-minute prediction horizon. Finally, we detect these faults in validation data. The data collected from the White Saveh Cement Company are used in this study.
Keywords: Cement Rotary Kiln, Fault Detection, Delay Estimation Method, Locally Linear Neuro Fuzzy Model, LOLIMOT.
281 A Differential Calculus Based Image Steganography with Crossover
Authors: Srilekha Mukherjee, Subha Ash, Goutam Sanyal
Abstract:
Information security plays a major role in raising the standard of secure communications via global media. In this paper, we suggest a technique of encryption followed by insertion before transmission, implementing two different concepts to carry out these tasks. A two-point crossover technique from the genetic algorithm is used to facilitate the encryption process. For each uniquely identified row of pixels, different mathematical methodologies are applied for several condition checks in order to determine the parent pixels on which the crossover operation is performed. This is done by selecting two crossover points within the pixels, thereby producing the newly encrypted child pixels and hence the encrypted cover image. In the next stage, the first- and second-order derivative operators are evaluated to increase security and robustness. The last stage reapplies the crossover procedure to form the final stego-image. The complexity of this system as a whole is huge, thereby dissuading third-party interference, and the embedding capacity is very high, so a larger amount of secret image information can be hidden. The imperceptible appearance of the obtained stego-image clearly proves the proficiency of this approach.
Keywords: Steganography, Crossover, Differential Calculus, Peak Signal to Noise Ratio, Cross-correlation Coefficient.
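The two-point crossover step on pixel rows can be sketched in isolation as below. Here the crossover points are chosen at random; the paper derives them from its own condition checks and derivative operators, which are not reproduced.

```python
import numpy as np

def two_point_crossover(row_a, row_b, p1, p2):
    """Two-point crossover on two equal-length pixel rows: swap the slice
    between the two crossover points, producing two child rows."""
    child_a, child_b = row_a.copy(), row_b.copy()
    child_a[p1:p2], child_b[p1:p2] = row_b[p1:p2].copy(), row_a[p1:p2].copy()
    return child_a, child_b

rng = np.random.default_rng(0)
parent_a = rng.integers(0, 256, size=16, dtype=np.uint8)   # toy pixel rows
parent_b = rng.integers(0, 256, size=16, dtype=np.uint8)
p1, p2 = sorted(rng.integers(1, 16, size=2))               # random points (assumed)
c_a, c_b = two_point_crossover(parent_a, parent_b, p1, p2)
print(parent_a, parent_b, c_a, c_b, sep="\n")
```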
280 Mixture Design Experiment on Flow Behaviour of O/W Emulsions as Affected by Polysaccharide Interactions
Authors: Nor Hayati Ibrahim, Yaakob B. Che Man, Chin Ping Tan, Nor Aini Idris
Abstract:
The interaction effects of xanthan gum (XG), carboxymethyl cellulose (CMC), and locust bean gum (LBG) on the flow properties of oil-in-water emulsions were investigated using a mixture design experiment. Blends of XG, CMC and LBG were prepared according to an augmented simplex-centroid mixture design (10 points) and used at 0.5% (wt/wt) in the emulsion formulations. An appropriate mathematical model was fitted to express each response as a function of the proportions of the blend components, so that the response to any combination of the components can be predicted empirically. The synergistic interaction effect of the ternary XG:CMC:LBG blends at approximately 33-67% XG levels was shown to be much stronger than that of the binary XG:LBG blend at the 50% XG level (p < 0.05). Nevertheless, an antagonistic interaction effect became significant when the CMC level in the blends exceeded 33% (p < 0.05). The yield stress and apparent viscosity (at 10 s-1) responses were successfully fitted with a special quartic model, while the flow behaviour index and consistency coefficient were fitted with a full quartic model (adjusted R2 ≥ 0.90). This study found that a mixture design approach can serve as a valuable tool in elucidating and predicting interaction effects beyond the conventional two-component blends.
Keywords: O/W emulsions, flow behavior, polysaccharide interaction, mixture design.
279 Comparing Data Analysis, Communication and Information Technologies Expertise Levels in Undergraduate Psychology Students
Authors: Ana Cázares
Abstract:
This study has two aims: first, to compare the expertise levels in data analysis, communication and information technologies of undergraduate psychology students; second, to verify the factor structure of E-ETICA (Escala de Experticia en Tecnologias de la Informacion, la Comunicacion y el Análisis, or Data Analysis, Communication and Information Expertise Scale), which had previously shown excellent internal consistency (α = 0.92) as well as a simple factor structure in which three factors (Complex, Basic Information and Communications Technologies, and E-Searching and Download Abilities) explain 63% of the variance. In the present study, 260 students (119 juniors and 141 seniors) were asked to respond to the E-ETICA (a 16-item, five-point Likert scale from 1: no mastery to 5: total mastery). The results show that junior and senior students report very similar expertise levels; however, the E-ETICA presented a different factor structure for juniors, in which four factors also explained 63% of the variance: Information E-Searching, Download and Processing; Data Analysis; Organization; and Communication Technologies.
Keywords: Data analysis, Information, Communications Technologies, Expertise Levels.
278 The Importance of Development in Laboratory Diagnosis at the Intersection
Authors: Agus Sahri, Cahya Putra Dinata, Faishal Andhi Rokhman
Abstract:
An intersection is a critical area on a highway: it is a place of conflict points and congestion due to the meeting of two or more roads. Conflicts that occur at an intersection include diverging, merging, weaving, and crossing. To deal with these conflicts, a crossing control system is needed; at an intersection there are two control systems, namely signalized and non-signalized intersections. The control system at an intersection can affect the intersection's performance, and in Indonesia there are still many intersections with poor performance. The analysis of the parameters used to measure the performance of an intersection in Indonesia is guided by the 1997 Indonesian Road Capacity Manual. For this reason, this study aims to develop laboratory diagnostics at intersections to analyze the parameters that can affect an intersection's performance. The research method used is research and development. The laboratory diagnosis includes anamnesis, differential diagnosis, inspection, diagnosis, prognosis, specimens, analysis and sample data analysis. It is expected that this research will encourage the development and application of laboratory diagnostics at intersections in Indonesia so that intersections can function optimally.
Keywords: Intersection, laboratory diagnostic, control systems, Indonesia.
277 Analysing the Elementary Science and Technology Coursebook and Student Workbook in Terms of Constructivism
Authors: Nil Duban
Abstract:
The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to "Science and Technology", and the content, course books and student workbooks were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course book and student workbook for primary school 5th grade are appropriate for constructivism by evaluating them in terms of its fundamental principles. Out of the qualitative research methods, the documentation technique (document analysis) is applied; for selecting samples, criterion sampling, a purposeful sampling technique, is used. When the Science and Technology course book and workbook for the 5th grade are examined, it is seen that the two books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points, these books were designed with the principles of constructivism in mind. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.
Keywords: Constructivism, coursebooks, science and technology education.
276 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems
Authors: Dr. L. Arockiam, A. Aloysius
Abstract:
In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA) and so on. Several techniques, methods and metrics based on different factors have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure named Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the classes derived from it. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same value, whereas in general the cognitive load of understanding different types of attributes cannot be the same. Here, we propose a new metric, Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to the attributes based on the effort needed to understand their data types. The proposed metric has been shown to be a better measure of the complexity of a class with attributes through case studies and experiments.
Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type.
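The shape of an AWCC-style computation can be sketched as below. The per-data-type cognitive weights and the method weights are hypothetical placeholders chosen for illustration; they are not the values proposed in the paper.

```python
# Assumed (not the paper's) cognitive weights: harder-to-grasp types weigh more.
TYPE_WEIGHTS = {
    "int": 1, "float": 1, "boolean": 1,
    "string": 2, "array": 3, "list": 3,
    "map": 4, "object": 5,
}

def awcc(attribute_types, method_weights):
    """AWCC-style total = sum of attribute weights (by data type)
                        + sum of method cognitive weights."""
    attr_part = sum(TYPE_WEIGHTS.get(t, 5) for t in attribute_types)  # unknown types treated as complex
    return attr_part + sum(method_weights)

# Example class: three attributes and two methods with assumed method weights.
print(awcc(["int", "string", "map"], method_weights=[4, 7]))   # 1 + 2 + 4 + 4 + 7 = 18
```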
275 On-line Lao Handwritten Recognition with Proportional Invariant Feature
Authors: Khampheth Bounnady, Boontee Kruatrachue, Somkiat Wangsiripitak
Abstract:
This paper proposes a high-level feature for online Lao handwriting recognition. The feature must be high-level enough that it does not change when characters are written by different persons at different speeds and proportions (shorter or longer stroke, head, tail, loop, curve). In this high-level feature, a character is divided into a sequence of curve segments, where a new segment starts where the curve reverses its rotation (between counter-clockwise and clockwise). In each segment, the following features are gathered: the cumulative change in the direction of the curve (negative for clockwise), the cumulative curve length, and the cumulative lengths of left-to-right, right-to-left, top-to-bottom and bottom-to-top movement (the cumulative changes along the X and Y axes of the segment). This feature is simple yet robust enough for high-accuracy recognition, and it can be gathered by parsing the original time-sampled sequence of X, Y pen locations without re-sampling. We also experimented with other segmentation points, such as the maximum curvature points widely used by other researchers. The experimental results show a recognition rate of 94.62%, compared with 75.07% when using maximum curvature points. This is due to the large variation of turning points in handwriting.
Keywords: Handwritten feature, chain code, Lao handwritten recognition.
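The segment-and-accumulate idea can be sketched as below: split the pen trace where the turn direction (sign of the cross product of consecutive displacements) flips, then accumulate turn angle, length and axis-wise movement per segment. Screen-style coordinates and the exact break rule are assumptions; the authors' precise feature definitions are not reproduced.

```python
import numpy as np

def segment_features(points):
    """Split an online stroke at rotation-direction reversals and gather
    per-segment cumulative features (a sketch of the idea, not the paper's
    exact implementation)."""
    pts = np.asarray(points, dtype=float)
    d = np.diff(pts, axis=0)                                # displacement vectors
    cross = d[:-1, 0]*d[1:, 1] - d[:-1, 1]*d[1:, 0]         # turn direction at interior points
    sign = np.sign(cross)
    breaks = [0]
    for i in range(1, len(sign)):
        if sign[i] != 0 and sign[i-1] != 0 and sign[i] != sign[i-1]:
            breaks.append(i + 1)                            # reversal: start a new segment
    breaks.append(len(d))
    feats = []
    for s, e in zip(breaks[:-1], breaks[1:]):
        seg = d[s:e]
        if len(seg) == 0:
            continue
        ang = np.arctan2(seg[:, 1], seg[:, 0])
        turn = (np.diff(ang) + np.pi) % (2*np.pi) - np.pi   # wrap turns to (-pi, pi]
        feats.append({
            "cum_turn": turn.sum(),                         # + counter-clockwise, - clockwise
            "cum_length": np.hypot(seg[:, 0], seg[:, 1]).sum(),
            "cum_right": seg[seg[:, 0] > 0, 0].sum(),       # left-to-right movement
            "cum_left": -seg[seg[:, 0] < 0, 0].sum(),       # right-to-left movement
            "cum_down": seg[seg[:, 1] > 0, 1].sum(),        # top-to-bottom (screen coords assumed)
            "cum_up": -seg[seg[:, 1] < 0, 1].sum(),         # bottom-to-top
        })
    return feats

stroke = [(0, 0), (1, 0), (2, 1), (2, 2), (1, 3), (0, 3), (-1, 2)]  # toy pen trace
for f in segment_features(stroke):
    print({k: round(float(v), 3) for k, v in f.items()})
```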
274 A Structural Support Vector Machine Approach for Biometric Recognition
Authors: Vishal Awasthi, Atul Kumar Agnihotri
Abstract:
The face is a strong, non-intrusive biometric for distinguishing original faces from dummy faces produced by different artificial means. Face recognition is extremely important in the contexts of computer vision, psychology, surveillance, pattern recognition, neural networks and content-based video processing. The availability of a widespread face database is crucial for testing the performance of face recognition algorithms. The openly available face databases include face images with a wide range of poses, illumination, gestures and face occlusions, but there is no dummy face database accessible in the public domain. This paper presents a face detection algorithm based on image segmentation, in terms of distance from a fixed point, and template matching methods. The proposed work uses the most appropriate number of nodal points, resulting in the most appropriate outcomes in terms of face recognition and detection. The time taken to identify and extract distinctive facial features is improved to the range of 90 to 110 seconds, with an increase in efficiency of 3%.
Keywords: Face recognition, Principal Component Analysis, PCA, Linear Discriminant Analysis, LDA, Improved Support Vector Machine, iSVM, elastic bunch mapping technique.
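A common baseline against which methods like this are compared is an eigenfaces-style pipeline: PCA features fed to an SVM classifier. The sketch below uses scikit-learn and the public Olivetti faces set (downloaded on first use); it is only a generic baseline, not the paper's iSVM or nodal-point method.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

faces = fetch_olivetti_faces()                     # 400 images, 40 subjects
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0)

# PCA for dimensionality reduction, then an RBF-kernel SVM for identification.
model = make_pipeline(PCA(n_components=80, whiten=True), SVC(kernel="rbf", C=10.0))
model.fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 3))
```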
273 Gabriel-constrained Parametric Surface Triangulation
Authors: Oscar E. Ruiz, Carlos Cadavid, Juan G. Lalinde, Ricardo Serrano, Guillermo Peris-Fajarnes
Abstract:
The Boundary Representation of a 3D manifold contains FACES (connected subsets of a parametric surface S: R2 → R3). In many science and engineering applications it is cumbersome and algebraically difficult to deal with the polynomial set and constraints (LOOPs) representing the FACE. For this reason, a Piecewise Linear (PL) approximation of the FACE is needed, which is usually represented in terms of triangles (i.e. 2-simplices). Solving the problem of FACE triangulation requires producing quality triangles which are: (i) independent of the arguments of S, (ii) sensitive to the local curvatures, (iii) compliant with the boundaries of the FACE and (iv) topologically compatible with the triangles of the neighboring FACEs. In the existing literature there are no guarantees for point (iii). This article contributes to the topic of triangulations conforming to the boundaries of the FACE by applying the concept of the parameter-independent Gabriel complex, which improves the correctness of the triangulation regarding aspects (iii) and (iv). In addition, the article applies the geometric concept of the ball tangent to a surface at a point to address points (i) and (ii). Additional research is needed on algorithms that (i) take advantage of the concepts presented in the proposed heuristic algorithm and (ii) can be proved correct.
Keywords: surface triangulation, conforming triangulation, surface sampling, Gabriel complex.
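For orientation, the classical Gabriel condition for a point set is easy to state and test: an edge (p, q) belongs to the Gabriel graph if no other sample point lies strictly inside the ball having segment pq as its diameter. The sketch below checks this condition; it is the standard point-set version, not the paper's parameter-independent surface variant.

```python
import numpy as np

def is_gabriel_edge(points, i, j):
    """True if no other sample point lies strictly inside the smallest ball
    having segment p_i p_j as its diameter (classical Gabriel test)."""
    pts = np.asarray(points, dtype=float)
    center = (pts[i] + pts[j]) / 2.0
    radius_sq = np.sum((pts[i] - pts[j]) ** 2) / 4.0
    others = np.delete(np.arange(len(pts)), [i, j])
    d_sq = np.sum((pts[others] - center) ** 2, axis=1)
    return bool(np.all(d_sq >= radius_sq))

pts = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 0.1], [3.0, 3.0]])
print(is_gabriel_edge(pts, 0, 1))   # False: (1, 0.1) lies inside the diametral ball
print(is_gabriel_edge(pts, 0, 2))   # True
```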
272 Online Brands: A Comparative Study of World Top Ranked Universities with Science and Technology Programs
Authors: Zullina H. Shaari, Amzairi Amar, Abdul Mutalib Embong, Hezlina Hashim
Abstract:
University websites are considered one of the primary brand touch points for multiple stakeholders, yet most of them do not have designs that create favorable impressions. Some of the elements that web designers should carefully consider are appearance, content, functionality, usability and search engine optimization; priority, however, should be placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. The study examines how the top 200 world-ranked science and technology-based universities present their brands online and whether their websites capture these content dimensions. A content analysis of the websites revealed that the top-ranked universities captured these dimensions to varying degrees. In addition, the UK-based university placed a higher priority on website simplicity and negative space than the Malaysian-based university.
Keywords: Science and technology programs, top-ranked universities, online brands, university websites.
271 Chinese Language Teaching as a Second Language: Immersion Teaching
Authors: Lee Bih Ni, Kiu Su Na
Abstract:
This paper discusses the teaching of Chinese as a second language, focusing on immersion teaching. The researchers used a narrative literature review to describe the current state of both art and science in the focused areas of inquiry. Immersion teaching comes with a standard that teachers must reliably meet. Chinese language-immersion instruction consists of language and content lessons, including functional usage of the language, academic language, authentic language, and correct Chinese sociocultural language. The narrative literature review was used to build a scientific knowledge base; the researchers collected all the important points of discussion and present them here with reference to the specific field on which the paper is based. The findings show that Chinese immersion teaching is not like a standard foreign language classroom; the immersion setting provides more opportunities to teach students colloquial language than academic language. Immersion techniques also introduce a language's cultural and social contexts in a meaningful and memorable way, and it is particularly important that immersion teachers connect classwork with real-life experiences. Immersion also includes more elements of discovery and inquiry-based learning than other kinds of instructional practice, with students always and consistently interpreting conclusions and context clues.
Keywords: A second language, Chinese language teaching, immersion teaching, instructional strategies.
270 The Effectiveness of University’s Strategic Plan for Sustainability through Collaborative Platform’s Deliberation Matrix
Authors: Ashiquer Rahman
Abstract:
The paper focuses on the significance of a university's sustainability strategic plan and emphasizes the usefulness of a collaborative-platform-based deliberation matrix, which will equip the university's leadership to handle upcoming tactics and challenges in sustaining the strategic plan. The study addresses the significance of a set of reference points that will precede operational activities for multi-stakeholder, multi-criteria evaluation of the optimal standards of a Sustainable University, as well as potential actions for the strategic blueprint of a Sustainable University. It examines the effectiveness of the university's sustainability strategic plan through a collaborative platform and deliberation matrix, and outlines the conceptual framing of a sustainable university that implements its strategic plan over the collaborative platform and deliberation matrix. Optimistically, these will be a milestone in higher education and a pathway for preparing the university's upcoming implementation of its sustainability strategy. Indeed, the collaborative platform and the deliberation matrix both serve to enhance institutional cooperation in a competitive world.
Keywords: Sustainable strategies, institutional cooperation, multi-stakeholder multi-criteria assessment, collaborative platform, innovative method and tools.