Search results for: linear programming problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10494

7824 Modelling Asymmetric Magnetic Recording Heads with an Underlayer Using Superposition

Authors: Ammar Edress Mohamed, Mustafa Aziz, David Wright

Abstract:

This paper analyses and calculates the head fields of asymmetrical 2D magnetic recording heads in the presence of a soft underlayer, using the appropriate Green's function to derive the surface potential/field from the surface potential of the asymmetrical head without an underlayer. The calculated fields follow the finite-element results closely near the gap corners, while the gap region shows linear behaviour for d/g < 0.5.

Keywords: magnetic recording, finite elements, asymmetrical magnetic heads, superposition, Laplace's equation

Procedia PDF Downloads 385
7823 Age Estimation from Upper Anterior Teeth by Pulp/Tooth Ratio Using Peri-Apical X-Rays among Egyptians

Authors: Fatma Mohamed Magdy Badr El Dine, Amr Mohamed Abd Allah

Abstract:

Introduction: Age estimation of individuals is one of the crucial steps in forensic practice. Different traditional methods rely on the length of the diaphysis of the long bones of the limbs, epiphyseal-diaphyseal union, fusion of the primary ossification centers, as well as dental eruption. However, there is a growing need for the development of precise and reliable methods to estimate age, especially in cases where dismembered corpses, burnt bodies, or putrefied or fragmented parts are recovered. Teeth are the hardest and most indestructible structures in the human body. In recent years, assessment of the pulp/tooth area ratio, as an indirect quantification of secondary dentine deposition, has received considerable attention. However, little work has been done in Egypt on the applicability of the pulp/tooth ratio for age estimation. Aim of the Work: The present work was designed to assess Cameriere's method for age estimation from the pulp/tooth ratio of maxillary canines, central and lateral incisors in a sample from the Egyptian population, and to formulate regression equations to be used as population-based standards for age determination. Material and Methods: The present study was conducted on 270 peri-apical X-rays of maxillary canines, central and lateral incisors (collected from 131 males and 139 females aged between 19 and 52 years). The pulp and tooth areas were measured using the Adobe Photoshop software program, and the pulp/tooth area ratio was computed. Linear regression equations were determined separately for canines, central and lateral incisors. Results: A significant correlation was recorded between the pulp/tooth area ratio and chronological age. The linear regression analysis revealed coefficients of determination of R² = 0.824 for canines, 0.588 for central incisors, and 0.737 for lateral incisors. Three regression equations were derived. Conclusion: The pulp/tooth ratio is a useful technique for estimating age among Egyptians. Additionally, the regression equation derived from canines gave better results than those from the incisors.
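The abstract's approach, a simple linear regression of chronological age on the pulp/tooth area ratio summarized by R², can be sketched as follows; the ratios, ages, and coefficients below are hypothetical illustrations, not the study's Egyptian data:

```python
import random

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data (NOT the study's measurements): secondary dentine
# deposition shrinks the pulp, so the ratio falls as age rises.
random.seed(0)
ratios = [0.10 + 0.004 * i for i in range(30)]
ages = [60.0 - 150.0 * r + random.gauss(0, 1.5) for r in ratios]

a, b, r2 = fit_linear(ratios, ages)
print(f"age = {a:.1f} + ({b:.1f}) * ratio,  R^2 = {r2:.3f}")
```

A population-specific equation of this shape is what the derived standards would provide; the negative slope reflects pulp shrinkage with age.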

Keywords: age determination, canines, central incisors, Egypt, lateral incisors, pulp/tooth ratio

Procedia PDF Downloads 179
7822 Problem Based Learning and Teaching by Example in Dimensioning of Mechanisms: Feedback

Authors: Nicolas Peyret, Sylvain Courtois, Gaël Chevallier

Abstract:

This article outlines the development of Project Based Learning (PBL) in the final year of a Bachelor's Degree. This form of pedagogy aims to involve the students more fully from the beginning of the module: the theoretical contributions are introduced during the project, in the course of solving a technological problem. The module in question is the mechanical dimensioning module at Supméca, a French engineering school that issues a Master's Degree. While the teaching methods used in primary and secondary education are frequently renewed in France at the instigation of teachers and inspectors, higher education remains relatively traditional in its practices. Recently, some colleagues have felt the need to put application back at the heart of their theoretical teaching. This need is induced by the difficulty of covering all the knowledge deductively before its application. It is therefore tempting to make the students 'learn by doing', even if doing so does not cover some parts of the theoretical knowledge. The other argument that supports this type of learning is the students' lack of motivation for lecture-based courses. Role-play allows scenarios favoring interaction between students and teachers. However, this pedagogical form, known as 'pedagogy by project', is difficult to apply in the first years of university studies because of the low level of autonomy and individual responsibility of the students. The question of what the student actually learns from the initial program, as well as the evaluation of the competences acquired by the students in this type of pedagogy, also remains an open problem. Thus, we propose to add to the project-based format a regressive element of teacher intervention based on pedagogy by example. This pedagogical scenario is based on cognitive load theory and Bruner's constructivist theory. 
It has been built by relying on the six points of the scaffolding process defined by Bruner, with a concrete objective: to allow the students to go beyond the basic skills of dimensioning and to acquire the more global skills of engineering. The implementation of project-based teaching coupled with pedagogy by example makes it possible to compensate for the lack of experience and autonomy of first-year students, while involving them strongly in the first few minutes of the module. In this project, students are confronted with real dimensioning problems and become able to understand the links and influences between parameter variations and dimensioning, an objective that we did not reach in classical teaching. It is this form of pedagogy that accelerates the mastery of basic skills and so frees more time for the engineering skills, namely the convergence of the individual dimensioning steps to obtain a validated mechanism. A self-evaluation of the project skills acquired by the students will also be presented.

Keywords: Bruner's constructivist theory, mechanisms dimensioning, pedagogy by example, problem based learning

Procedia PDF Downloads 188
7821 Security as Human Value: Issue of Human Rights in Indian Sub-Continental Operations

Authors: Pratyush Vatsala, Sanjay Ahuja

Abstract:

National security and human rights are related terms, as there is nothing like absolute security or an absolute human right. If we are committed to security, human rights are a problem but also part of the solution; if we deliberate on human rights, security is a problem but also part of the solution. Ultimately, we have to maintain a balance between the two related terms. As more and more armed forces are deployed by the government within the nation to maintain peace and security, using force against its own citizens, the search for a judicious balance between intent and action needs to be emphasized. Notwithstanding that a nation state needs complete political independence, the search for security is a driving force behind unquestioned sovereignty. If security is a human value, it overlaps the values of freedom, order, and solidarity. The question that needs to be explored is to what extent human rights can be compromised in the name of security in places like Kashmir or Mizoram. The present study aims to explore the issue of maintaining a balance between the use of power and good governance, treating security as a human value. This paper has been prepared with the aim of strengthening the understanding of the complex and multifaceted relationship between human rights and security forces operating for conflict management. It identifies some of the critical human rights issues raised in the context of security forces operations in India, highlighting the relevant human rights principles and standards under which security as a human value should be respected at all times, and in particular in the context of such operations.

Keywords: Kashmir, Mizoram, security, value, human right

Procedia PDF Downloads 270
7820 Speaking Difficulties Encountered by EFL Learners in Secondary School in Morocco

Authors: Bellali Assia, Bellali Fatima

Abstract:

Speaking is one of the most difficult English skills for non-English learners. This study investigated the English-speaking difficulties encountered by secondary school students in a private school in Casablanca, Morocco. The subjects were 63 students (male and female) at the second-year class level. The study also aims to investigate the degree of the main speaking difficulties and the factors affecting students' ability to speak English. This research used a descriptive qualitative and quantitative approach, with a questionnaire and an interview to collect the data. Among linguistically related difficulties, there were four categories, namely vocabulary, grammar, conversation, and pronunciation. The results revealed that 40.32% of students agreed that they do not have sufficient grammar knowledge, 45.16% agreed that they do not have enough vocabulary, 45.90% agreed that they have difficulty in conversation, and 39.34% agreed that they have poor pronunciation. The results also indicated that 63.33% of students agreed that they have problems with self-confidence. The factors causing difficulty in speaking English in this study were lack of general knowledge, lack of speaking practice, fear of mistakes, weak grammar practice, low participation, shyness, nervousness, fear of criticism, and unfamiliar word pronunciation. Furthermore, recommendations and suggestions are presented to help teachers and students address these difficulties.

Keywords: English speaking, difficulties, factors, non-English students

Procedia PDF Downloads 11
7819 Determining the Materiality of an Undisclosed Fact: An Onerous Duty on the Assured

Authors: Adekemi Adebowale

Abstract:

The duty of disclosure in Nigerian insurance law is in need of reform. The materiality of an undisclosed fact (notwithstanding that it was an honest and innocent non-disclosure) currently entitles insurers to avoid insurance policies, leaving an insured with an uncovered loss. While the test of materiality requires an insured to voluntarily disclose facts that will influence an insurer's decision, without proper guidelines from the insurer, the insurer is only expected to prove that the undisclosed fact influenced its judgment in fixing the premium or in determining whether to accept the risk. This places an onerous duty on the assured to volunteer to the insurer every material fact, even though the insured has only a slight idea of the mind of a hypothetical prudent insurer. This paper explores a modern approach to the problem of an insured's pre-contractual obligation to determine material facts in Nigerian insurance law. The aim is to build upon the change in the structure of insurance contract obligations in other common law jurisdictions such as the United Kingdom. The doctrinal and comparative methodology captures the burden imposed on the insured under the existing Nigerian insurance law. It finds that the continued application of the law leaves the insured in the weakest position, and he stands to lose in a contract supposedly created for his benefit. It is apparent that if this problem remains unresolved, the overall consequence will be a significant decline in insurance contracting, which may affect the Nigerian economy. The paper aims to evaluate the risks of the continued application of the traditional law, which does not keep pace with modern insurance practice. It will ultimately produce a legally compliant reform, along with a significant deviation from the archaic structure that exists in Nigerian insurance law. 
This paper forms part of an on-going PhD research project on "The insured's pre-contractual duty of utmost good faith". The outcome of the research to date is that the insured bears the burden of the obligation to act in utmost good faith where it concerns disclosure of material facts.

Keywords: disclosure, materiality, Nigeria, United Kingdom, utmost good faith

Procedia PDF Downloads 114
7818 Rigorous Photogrammetric Push-Broom Sensor Modeling for Lunar and Planetary Image Processing

Authors: Ahmed Elaksher, Islam Omar

Abstract:

Accurate geometric relation algorithms are imperative in Earth and planetary satellite and aerial image processing, particularly for the high-resolution images used for topographic mapping. Most of these satellites carry push-broom sensors: optical scanners equipped with linear arrays of CCDs. These sensors have been deployed on most EOSs. In addition, the LROC is equipped with two push-broom NACs that provide 0.5 meter-scale panchromatic images over a 5 km swath of the Moon. The HiRISE camera carried by the MRO and the HRSC carried by MEX are examples of push-broom sensors that produce images of the surface of Mars. Sensor models developed in photogrammetry relate image space coordinates in two or more images to the 3D coordinates of ground features. Rigorous sensor models use the actual interior and exterior orientation parameters of the camera, unlike approximate models. In this research, we develop a generic push-broom sensor model to process imagery acquired by linear array cameras and investigate its performance, advantages, and disadvantages in generating topographic models for the Earth, Mars, and the Moon. We also compare and contrast the utilization, effectiveness, and applicability of available photogrammetric techniques and software with the developed model. We start by defining an image reference coordinate system to unify the image coordinates from all three arrays. The transformation from an image coordinate system to the reference coordinate system involves a translation and three rotations. For any image point within a linear array, its image reference coordinates, the coordinates of the exposure center of the array in the ground coordinate system at the imaging epoch (t), and the corresponding ground point coordinates are related through the collinearity condition, which states that these three points must lie on the same line. 
The rotation angles of each CCD array at epoch t are defined and included in the transformation model. The exterior orientation parameters of an image line, i.e., the coordinates of the exposure station and the rotation angles, are computed by polynomial interpolation in time (t). The parameter t is the time elapsed from a reference epoch at a given orbit position. Depending on the types of observations, coordinates and parameters may be treated as knowns or unknowns in different situations. The unknown coefficients are determined in a bundle adjustment. The orientation process starts by extracting the sensor position, orientation, and raw images from the PDS. The parameters of each image line are then estimated and imported into the push-broom sensor model. We also define tie points between image pairs to aid the bundle adjustment, determine the refined camera parameters, and generate highly accurate topographic maps. The model was tested on different satellite images such as IKONOS, QuickBird, WorldView-2, and HiRISE. It was found that the accuracy of our model is comparable to that of commercial and open-source software, the computational efficiency of the developed model is high, the model can be used in different environments with various sensors, and the implementation process requires much less cost and effort.
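As a rough illustration of the polynomial interpolation of exterior orientation parameters in time described above, the sketch below fits one exposure-centre coordinate as a quadratic in t and evaluates it for an arbitrary image line. The epochs and positions are invented, and a real implementation would estimate all six parameters (three positions, three rotations) per line inside the bundle adjustment:

```python
def polyfit(ts, ys, degree):
    """Least-squares polynomial fit via normal equations (stdlib only)."""
    m = degree + 1
    # Normal equations A^T A c = A^T y for the Vandermonde matrix A.
    ata = [[sum(t ** (i + j) for t in ts) for j in range(m)] for i in range(m)]
    aty = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for j in range(col, m):
                ata[r][j] -= f * ata[col][j]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = aty[r] - sum(ata[r][j] * coeffs[j] for j in range(r + 1, m))
        coeffs[r] = s / ata[r][r]
    return coeffs  # [c0, c1, ...] for c0 + c1*t + c2*t^2 + ...

# Hypothetical exposure-centre X coordinates sampled at five epochs t:
ts = [0.0, 1.0, 2.0, 3.0, 4.0]
xs = [100.0, 103.1, 106.4, 109.9, 113.6]   # smooth along-track motion

c = polyfit(ts, xs, 2)
def x_at(t):
    return c[0] + c[1] * t + c[2] * t * t

print(x_at(2.5))   # interpolated exposure-centre X for a line at t = 2.5
```

A low-order polynomial suffices here because the orbit is smooth over a scene; the coefficients are exactly the "unknown coefficients" that the bundle adjustment would refine.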

Keywords: photogrammetry, push-broom sensors, IKONOS, HiRISE, collinearity condition

Procedia PDF Downloads 60
7817 Failure Mechanism of Slip-Critical Connections on Curved Surface

Authors: Bae Doobyong, Yoo Jaejun, Park Ilgyu, Choi Seowon, Oh Chang Kook

Abstract:

This paper presents the results of analytical investigations of the variation of the slip coefficient in slip-critical bolted connections of curved plates. The slip coefficient may depend on the contact stress distribution at the interface and the flexibility of the spliced plate. Non-linear FEM analyses have been performed to simulate the behavior of bolted connections of curved plates with various radii of curvature and thicknesses.

Keywords: slip coefficient, curved plates, slip-critical bolted connection, radius of curvature

Procedia PDF Downloads 506
7816 A Study of Anthropometric Correlation between Upper and Lower Limb Dimensions in Sudanese Population

Authors: Altayeb Abdalla Ahmed

Abstract:

Skeletal phenotype is a product of a balanced interaction between genetics and environmental factors throughout different life stages. Therefore, interlimb proportions vary between populations. Although interlimb proportion indices have been used in anthropology to assess the influence of various environmental factors on the limbs, an extensive literature review revealed a paucity of published research assessing correlations between limb parts and the possibility of reconstructing one from another. Hence, this study aims to assess the relationships between upper and lower limb parts and to develop regression formulae to reconstruct the parts from one another. The left upper arm length, ulnar length, wrist breadth, hand length, hand breadth, tibial length, bimalleolar breadth, foot length, and foot breadth of 376 right-handed subjects, comprising 187 males and 189 females (aged 25-35 years), were measured. Initially, the data were analyzed using basic univariate analysis and independent t-tests; then sex-specific simple and multiple linear regression models were used to estimate upper limb parts from lower limb parts and vice versa. The results of this study indicated significant sexual dimorphism for all variables. The results also indicated a significant correlation between the upper and lower limb parts (p < 0.01). Linear and multiple (stepwise) regression equations were developed to reconstruct the limb parts in the presence of a single dimension or multiple dimensions from the other limb. Multiple stepwise regression equations generated better reconstructions than simple equations. These results are significant in forensics, as they can aid in the identification of multiple isolated limb parts, particularly during mass disasters and criminal dismemberment. Although DNA analysis is the most reliable tool for identification, its usage has multiple limitations in developing countries, e.g., cost, facility availability, and trained personnel. 
Furthermore, the findings have important implications for plastic and orthopedic reconstructive surgeries. This is the only reported study assessing the correlation and prediction capabilities between many of the upper and lower limb dimensions. The present study demonstrates a significant correlation between the interlimb parts in both sexes, which indicates the possibility of reconstruction using regression equations.
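A minimal sketch of the independent t-test step used above to establish sexual dimorphism might look like the following; the hand-length samples are hypothetical, not the study's Sudanese measurements:

```python
import math

def students_t(sample_a, sample_b):
    """Independent two-sample Student's t with pooled variance; returns (t, df)."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical hand lengths (cm); values are illustrative only.
males = [19.1, 19.5, 18.8, 19.9, 19.3, 18.9, 19.6, 19.2]
females = [17.8, 18.2, 17.5, 18.0, 18.4, 17.9, 17.6, 18.1]

t, df = students_t(males, females)
print(f"t = {t:.2f}, df = {df}")  # |t| well above ~2.1 suggests dimorphism
```

A significant t for every variable is what justifies fitting the regression equations sex-specifically rather than pooling the sexes.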

Keywords: anthropometry, correlation, limb, Sudanese

Procedia PDF Downloads 289
7815 Autonomous Landing of UAV on Moving Platform: A Mathematical Approach

Authors: Mortez Alijani, Anas Osman

Abstract:

Recently, the popularity of unmanned aerial vehicles (UAVs) has skyrocketed amid unprecedented events and the global pandemic, as they play a key role in both the security and health sectors through surveillance, taking test samples, transporting crucial goods, and spreading awareness among civilians. However, the process of designing and producing such aerial robots is constrained by internal and external factors that pose serious challenges. Landing is one of the key operations during flight; in particular, the autonomous landing of a UAV on a moving platform is a scientifically complex engineering problem. Typically, a successful automatic landing of a UAV on a moving platform requires accurate localization of the landing site, fast trajectory planning, and robust control planning. To achieve these goals, information about the landing process, such as the intersection point, the positions of the platform and the UAV, and the inclination angle, is necessary. In this study, a mathematical approach to this problem in the X-Y plane, based on the inclination angle and the position of the UAV during the landing process, is presented. The experimental results depict the accurate position of the UAV, the intersection between the UAV and the moving platform, and the inclination angle during the landing process, allowing prediction of the intersection point.
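A toy planar (X-Y) version of the intersection-point computation described above might look like this; the geometry (straight-line descent at a fixed inclination angle, platform moving along the X axis at constant speed) and all numbers are simplifying assumptions, not the authors' model:

```python
import math

def landing_intersection(x_uav, alt, incl_deg, v_uav, x_plat, v_plat):
    """
    Planar sketch: the UAV descends along a straight line inclined
    `incl_deg` degrees below the horizontal at horizontal speed v_uav;
    the platform moves along the X axis at speed v_plat.
    Returns (ground intersection X, time of arrival, platform miss
    distance at that time; 0 means UAV and platform meet).
    """
    theta = math.radians(incl_deg)
    dx = alt / math.tan(theta)        # horizontal distance covered descending
    x_land = x_uav + dx               # where the descent line meets y = 0
    t_land = dx / v_uav               # UAV time-to-ground
    plat_at_t = x_plat + v_plat * t_land
    return x_land, t_land, plat_at_t - x_land

# Hypothetical numbers: UAV at x=0, 10 m altitude, 45° descent at 2 m/s
# horizontal; platform starts at x=4 m moving at 1.2 m/s.
x_land, t_land, miss = landing_intersection(0.0, 10.0, 45.0, 2.0, 4.0, 1.2)
print(x_land, t_land, miss)
```

With these values the platform reaches the touchdown point exactly when the UAV does (miss distance zero); in practice the controller would adjust the inclination angle or speed to drive this miss distance to zero.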

Keywords: autonomous landing, inclination angle, unmanned aerial vehicles, moving platform, X-Y axis, intersection point

Procedia PDF Downloads 161
7814 Mathematical and Numerical Analysis of a Nonlinear Cross Diffusion System

Authors: Hassan Al Salman

Abstract:

We consider a nonlinear parabolic cross-diffusion model arising in applied mathematics. A fully practical piecewise linear finite element approximation of the model is studied. By using entropy-type inequalities and compactness arguments, the existence of a global weak solution is proved. Assuming further regularity of the solution of the model, some uniqueness results and error estimates are established. Finally, some numerical experiments are performed.

Keywords: cross diffusion model, entropy-type inequality, finite element approximation, numerical analysis

Procedia PDF Downloads 378
7813 Numerical Study of Fatigue Crack Growth at a Web Stiffener of Ship Structural Details

Authors: Wentao He, Jingxi Liu, De Xie

Abstract:

It is necessary to manage fatigue crack growth (FCG) once cracks are detected during in-service inspections. In this paper, a simulation program (FCG-System) is developed utilizing the commercial software ABAQUS with its object-oriented programming interface to simulate the fatigue crack path and to compute the corresponding fatigue life. In order to apply FCG-System to large-scale marine structures, the substructure modeling technique is integrated in the system, taking into account structural details and load shedding during crack growth. Based on the nodal forces and nodal displacements obtained from finite element analysis, a formula for shell elements to compute stress intensity factors is proposed in view of the virtual crack closure technique. The cracks initiating from the intersection of the flange and the end of the web stiffener are investigated for fatigue crack paths and growth lives under water pressure loading and axial force loading, separately. It is found that the FCG-System developed by the authors can be an efficient tool for performing fatigue crack growth analysis of marine structures.
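The virtual crack closure technique mentioned above computes the energy release rate from nodal forces and displacements; a minimal one-step mode-I sketch, with purely illustrative values and the plane-stress conversion to a stress intensity factor, might read:

```python
import math

def vcct_mode_I(f_tip, du_open, da, thickness, E):
    """
    One-step VCCT sketch (mode I, plane stress; values hypothetical):
      energy release rate   G_I = F * du / (2 * da * b)
      stress intensity      K_I = sqrt(E * G_I)
    f_tip     : nodal force at the crack tip normal to the crack plane [N]
    du_open   : relative opening displacement one element behind the tip [m]
    da        : crack-tip element length (assumed closure length) [m]
    thickness : plate/shell thickness b [m]
    E         : Young's modulus [Pa]
    """
    g1 = f_tip * du_open / (2.0 * da * thickness)
    k1 = math.sqrt(E * g1)
    return g1, k1

g1, k1 = vcct_mode_I(f_tip=1.2e3, du_open=4.0e-5, da=2.0e-3,
                     thickness=1.0e-2, E=2.1e11)
print(f"G_I = {g1:.1f} J/m^2, K_I = {k1 / 1e6:.2f} MPa*sqrt(m)")
```

In the paper's setting the nodal quantities would come from the shell-element FE solution at each growth increment, and K would feed a Paris-type law to accumulate the fatigue life.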

Keywords: crack path, fatigue crack, fatigue life, FCG-system, virtual crack closure technique

Procedia PDF Downloads 566
7812 Number of Parameters of Anantharam's Model with Single-Input Single-Output Case

Authors: Kazuyoshi Mori

Abstract:

In this paper, we consider the parametrization of Anantharam's model within the framework of the factorization approach. In the parametrization, we investigate the number of parameters required by Anantharam's model, restricting attention to single-input single-output systems. The investigation yields three findings: (1) there exist plants that require only one parameter, (2) there exist plants that require two parameters, and (3) the number of required parameters is at most three.

Keywords: linear systems, parametrization, coprime factorization, number of parameters

Procedia PDF Downloads 209
7811 Heinz-Type Inequalities in Hilbert Spaces

Authors: Jin Liang, Guanghua Shi

Abstract:

In this paper, we are concerned with further refinements of the Heinz operator inequalities in Hilbert spaces. Our purpose is to derive several new Heinz-type operator inequalities. First, with the help of the Taylor series of some hyperbolic functions, we obtain some refinements of the ordering relations among the Heinz means defined by Bhatia with different parameters, which are more suitable for obtaining the corresponding operator inequalities. Second, we present some generalizations of the Heinz operator inequalities. Finally, we give a matrix version of the Heinz inequality for the Hilbert-Schmidt norm.
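For reference, the scalar Heinz means discussed above interpolate between the geometric and arithmetic means; for a, b > 0 and ν ∈ [0, 1]:

```latex
H_\nu(a,b) = \frac{a^{\nu} b^{1-\nu} + a^{1-\nu} b^{\nu}}{2},
\qquad
\sqrt{ab} \;\le\; H_\nu(a,b) \;\le\; \frac{a+b}{2}.
```

The mean is symmetric about ν = 1/2, where the lower bound √(ab) is attained, while ν = 0 or 1 recovers the arithmetic mean; the ordering of H_ν in ν on [0, 1/2] is exactly what the paper's operator refinements sharpen.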

Keywords: Hilbert space, means inequality, norm inequality, positive linear operator

Procedia PDF Downloads 260
7810 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain

Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee

Abstract:

In recent years, consumers and governments have increasingly been pushing companies to design their activities so as to reduce negative environmental impacts, for example by producing renewable products or adopting safe disposal policies. It is therefore important to focus more accurately on the optimization of the various aspects of the total supply chain. Modeling a supply chain can be a challenging process due to the large number of factors that need to be considered in the model. The use of multi-objective optimization can help overcome these problems, since more information is used when designing the model. Uncertainty is inevitable in the real world. Considering uncertainty in the parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process, since the process can then take into account many more constraints and requirements. In this paper, we demonstrate a stochastic scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, its power in handling data uncertainty is shown.
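The trade-off that a scenario-based robust model formalizes can be shown with a toy example: under discrete demand/return scenarios, the robust (worst-case-minimizing) choice may differ from the expected-cost choice. The designs, costs, and probabilities below are invented for illustration and stand in for the full multi-objective LP:

```python
# Hypothetical scenario costs for three candidate network designs.
# Each design has a cost under three scenarios with probabilities p.
costs = {
    "design_A": [100.0, 140.0, 210.0],
    "design_B": [120.0, 135.0, 175.0],
    "design_C": [90.0, 170.0, 230.0],
}
p = [0.5, 0.3, 0.2]

# Stochastic criterion: minimize expected cost over scenarios.
expected = {d: sum(pi * ci for pi, ci in zip(p, c)) for d, c in costs.items()}
# Robust criterion: minimize the worst-case scenario cost.
worst = {d: max(c) for d, c in costs.items()}

best_expected = min(expected, key=expected.get)
best_robust = min(worst, key=worst.get)
print(best_expected, best_robust)
```

Here the cheapest design on average is not the one with the best worst case, which is precisely why the paper blends scenarios into a robust formulation instead of optimizing the expected value alone.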

Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization

Procedia PDF Downloads 409
7809 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland

Authors: Raptis Sotirios

Abstract:

Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure, and they also suffer from unplanned spending negatively impacted by the global financial crisis. Data-driven methods can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. This paper discusses service packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The significance tests used are the Chi-squared test and Student's t-test, applied to data over a 39-year span for which HSc services data exist for services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as a null hypothesis, that the target service's demands are statistically dependent on other demands. This linkage can be confirmed or not by the data. Complementarily, ML methods are used to linearly predict the target demands from the statistically found associations and to extend the linear dependence of the target's demand to independent demands, thus forming groups of services. Statistical tests confirm the ML couplings, making the prediction statistically meaningful as well, and prove that a target service can be matched reliably to other services, while ML shows that these indicated relationships can also be linear. Zero padding was used for missing years' records and better illustrated such relationships, both for limited years and over the entire span, offering long-term data visualizations, while limited-years groups explained how well patient numbers can be related over short periods or can change over time, as opposed to behaviors across more years. 
The prediction performance of the associations is measured using Receiver Operating Characteristic (ROC) AUC and ACC metrics, as well as the Chi-squared and Student statistical tests. Co-plots and comparison tables for RF, CART, and LGR, together with p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, and the behavior under different learning ratios. The impact of k-NN, cross-correlation, and C-Means first groupings is also studied over limited years and over the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases, LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding, data irregularities, or outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be packed well as plans of limited years, across various service sectors and learning configurations, as confirmed using statistical hypotheses.
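The Chi-squared independence step described above can be sketched for a 2x2 table of binned demand counts; the counts are hypothetical, and a full analysis would pair this test with the ML classifiers named in the abstract:

```python
def chi_squared_2x2(table):
    """
    Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]], testing independence of two demand categories.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: high/low demand years for a target service versus
# another service, binned over yearly records (values illustrative only).
table = [[30, 10], [12, 28]]
stat = chi_squared_2x2(table)
# 1 degree of freedom: critical value 3.841 at the 0.05 level.
print(stat, stat > 3.841)
```

A statistic above the critical value rejects independence of the two demand series, which is the kind of evidence the paper uses before letting ML couple the services into a pack.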

Keywords: class, cohorts, data frames, grouping, prediction, probability, services

Procedia PDF Downloads 222
7808 DPED Trainee Teachers' Views and Practice on Mathematics Lesson Study in Bangladesh

Authors: Mihir Halder

Abstract:

The main aim and objective of the eighteen-month Diploma in Primary Education (DPED) teacher education training course for in-service primary teachers in Bangladesh is for teachers to acquire professional knowledge and become proficient in professional practice. The training therefore introduces a variety of theoretical and practical approaches as well as some professional development activities, lesson study being one of them. But in the field of mathematics teaching, even after implementing the lesson study method, the desired practical teaching skills of the teachers have not developed, and elementary students also remain quite weak in mathematics. Although there have been various studies aimed at solving the problem, the teachers' views on mathematical ideas have not been taken into consideration. The researcher conducted this research to find out the cause of the problem. In this study, two teams of nine DPED trainee teachers and two instructors conducted two lesson studies in two schools located in the city and town of Khulna Province, Bangladesh. In each lesson study, the researcher observed group lesson planning by the trainee teachers, followed by one trainee teacher teaching the planned lesson in an actual mathematics classroom, and finally a post-teaching reflective discussion. Two DPED instructors acted as mentors in the lesson study. DPED trainee teachers and instructors were asked about mathematical concepts and classroom practices through questionnaires, and the mathematics classroom teaching was videotaped. For this study, the DPED mathematics course, curriculum, and assessment activities were analyzed. In addition, the mathematics lesson plans prepared by the trainee teachers for the lesson study and their pre-teaching and post-teaching reflective discussions were analyzed using analysis categories and rubrics. 
As a result, it was found that the trainee teachers' views of mathematics are not mature, and therefore, their mathematics teaching practice is not appropriate. Therefore, in order to improve teachers' mathematics teaching, the researcher recommended including some action-oriented aspects in each phase of mathematics lesson study in DPED—for example, emphasizing mathematics concepts of the trainee teachers, preparing appropriate teaching materials, presenting lessons using the problem-solving method, using revised rubrics for assessing mathematics lesson study, etc.

Keywords: mathematics lesson study, knowledge of mathematics, knowledge of teaching mathematics, teachers' views

Procedia PDF Downloads 66
7807 Deciding Graph Non-Hamiltonicity via a Closure Algorithm

Authors: E. R. Swart, S. J. Gismondi, N. R. Swart, C. E. Bell

Abstract:

We present a heuristic algorithm that decides graph non-Hamiltonicity. All graphs are directed, with each undirected edge regarded as a pair of counter-directed arcs. Each of the n! Hamilton cycles in a complete graph on n+1 vertices is mapped to an n-permutation matrix P, where p(u,i)=1 if and only if the ith arc in a cycle enters vertex u, starting and ending at vertex n+1. We first create an exclusion set E by noting all arcs (u, v) not in G, sufficient to code precisely all cycles excluded from G, i.e., cycles not in G use at least one arc not in G. Members are pairs of components of P, {p(u,i), p(v,i+1)}, i = 1, ..., n-1. A doubly stochastic-like relaxed LP formulation of the Hamilton cycle decision problem is constructed. Each {p(u,i), p(v,i+1)} in E is coded as the variable assignment q(u,i,v,i+1)=0, i.e., it shrinks the feasible region. We then implement the Weak Closure Algorithm (WCA), which tests necessary conditions of a matching, together with Boolean closure, to decide 0/1 variable assignments. Each {p(u,i), p(v,j)} not in E is tested for membership in E and, if possible, added to E (q(u,i,v,j)=0) to iteratively maximize |E|. If the WCA constructs E to be maximal, the set of all {p(u,i), p(v,j)}, then G is decided non-Hamiltonian; only non-Hamiltonian G share this maximal property. Ten non-Hamiltonian graphs (10 through 104 vertices) and 2000 randomized 31-vertex non-Hamiltonian graphs were tested and correctly decided non-Hamiltonian. For Hamiltonian G, the complement of E covers a matching, perhaps useful in searching for cycles. We also present an example where the WCA fails.
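As a toy illustration of the exclusion-set construction described above, the following Python sketch builds E from the arcs missing from G. The function name, the data representation, and the skipping of arcs touching vertex n+1 (which the paper presumably codes separately via the cycle's fixed start/end) are all illustrative assumptions, not the authors' implementation:

```python
def exclusion_set(n, arcs):
    """Build a toy exclusion set E for a directed graph G on vertices 1..n+1.

    Sketch of the construction: each Hamilton cycle of the complete graph on
    n+1 vertices (fixed to start and end at vertex n+1) corresponds to an
    n-permutation matrix P with p(u, i) = 1 iff the i-th arc of the cycle
    enters vertex u.  Any arc (u, v) absent from G excludes every cycle that
    traverses u -> v, coded here as the pair {p(u, i), p(v, i+1)}.
    """
    vertices = range(1, n + 2)
    # All counter-directed arcs of the complete graph that are NOT in G.
    missing = {(u, v) for u in vertices for v in vertices
               if u != v and (u, v) not in arcs}
    E = set()
    for (u, v) in missing:
        if u == n + 1 or v == n + 1:
            continue  # assumption: arcs touching the fixed end vertex handled separately
        for i in range(1, n):  # i = 1, ..., n-1
            E.add(((u, i), (v, i + 1)))
    return E
```

For a complete graph on 4 vertices minus the single arc (1, 2), this yields the two pairs {p(1,1), p(2,2)} and {p(1,2), p(2,3)}.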

Keywords: Hamilton cycle decision problem, computational complexity theory, graph theory, theoretical computer science

Procedia PDF Downloads 368
7806 Challenge Based Learning Approach for a Craft Mezcal Kiln Energetic Redesign

Authors: Jonathan A. Sánchez Muñoz, Gustavo Flores Eraña, Juan M. Silva

Abstract:

The Mexican mezcal industry has attracted attention during the last decade because the beverage, popular for its craft character, is in demand in North American and European markets. Despite this wide demand, production processes are still carried out with rudimentary equipment, and there is a lack of evidence on how to improve kiln energy efficiency. Tec21 is a challenge-based learning curricular model implemented by Tecnológico de Monterrey since 2019, in which each formation unit requires an industrial partner. “Problem processes solution” is a formation unit designed for mechatronics engineers, where students apply acquired knowledge in thermofluids and applied electronics. For five weeks, students are immersed in an industrial problem in order to reach the level of competencies defined by the formation unit designers. This work evaluates the competencies acquired by the students through a qualitative research methodology. Several evaluation instruments (report, essay, and poster) were selected to evaluate ethical argumentation, principles of sustainability, implemented actions, process modelling, and redesign feasibility.

Keywords: applied electronics, challenge based learning, competencies, mezcal industry, thermofluids

Procedia PDF Downloads 116
7805 Centering Critical Sociology for Social Justice and Inclusive Education

Authors: Al Karim Datoo

Abstract:

The presentation argues that there is an urgent case for centering and integrating critical sociology to enrich the potency of educational thought and practice in counteracting inequalities and social injustices. The COVID phenomenon has starkly exposed burgeoning socio-economic inequalities and widening marginalities, which have been historically and politically constructed through deep-seated social and power imbalances and injustices in the world. What potent role could education possibly play to combat these issues? As a point of departure, this paper highlights the increasingly reductionist and exclusionary ‘mind-set’ of education produced by trends such as the commodification of knowledge, standardisation, homogenisation, and reification, which are products of the positivist ideology of knowledge co-opted to serve capitalist interests. To redress this de-contextualization and de-humanization of education, there is an urgent need to center the role of interpretive and critical epistemologies and pedagogies of the social sciences. In this regard, the notions of problem-posing versus problem-solving, generative themes, and instrumental versus emancipatory reasoning will be discussed. The presentation will conclude by illustrating the pedagogic utility of these critically oriented notions in counteracting the social reproduction of exclusion and inequality in and through education.

Keywords: critical pedagogy, social justice, inclusion, education

Procedia PDF Downloads 109
7804 Analytical Investigation of Modeling and Simulation of Different Combinations of Sinusoidal Supplied Autotransformer under Linear Loading Conditions

Authors: M. Salih Taci, N. Tayebi, I. Bozkır

Abstract:

This paper investigates the operation of a sinusoidally supplied autotransformer for the different states of magnetic polarity of the primary and secondary terminals, covering four analytical step-up and step-down conditions. A new analytical model with equations for dot-marked and polarity-based step-up and step-down autotransformers is presented. These models are validated by simulating the current and voltage waveforms for each state; the PSpice environment was used for the simulations.
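The polarity dependence can be illustrated with the ideal (lossless, sinusoidal steady-state) autotransformer voltage relation. The sketch below is a textbook idealization under those assumptions, not the paper's full analytical model:

```python
def autotransformer_vout(v_in, n_common, n_series, additive=True):
    """Ideal autotransformer output voltage (illustrative sketch).

    With additive (matched-dot) polarity the series-winding EMF adds to the
    common-winding voltage; with subtractive (reversed) polarity it opposes
    it.  Losses and leakage are neglected.
    """
    sign = 1 if additive else -1
    return v_in * (n_common + sign * n_series) / n_common

# A 220 V supply with 100 common turns and 10 series turns:
v_up = autotransformer_vout(220.0, 100, 10, additive=True)    # 242.0 V
v_down = autotransformer_vout(220.0, 100, 10, additive=False)  # 198.0 V
```

Reversing the polarity of the series winding thus switches the same physical device between step-up and step-down operation, which is the distinction the four analytical cases formalize.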

Keywords: autotransformer modeling, autotransformer simulation, step-up autotransformer, step-down autotransformer, polarity

Procedia PDF Downloads 307
7803 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)

Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira

Abstract:

Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome the lack of nutrients in the diet or to increase the nutritional value of food. Fortified food must meet the demand of the population, taking into account their habits and the risks these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated as food vehicles since they are widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Methods for analyzing and quantifying these kinds of components are destructive and require lengthy sample preparation and analysis. Therefore, the industry has searched for faster and less invasive methods, such as near-infrared spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements and yields a high amount of data. NIR spectroscopy therefore requires calibration with mathematical and statistical tools (chemometrics), such as principal component analysis (PCA) and linear discriminant analysis (LDA), to extract analytical information from the corresponding spectra. PCA is well suited to NIR, since it can handle many spectra at a time and can be used for unsupervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectral data to a smaller number of latent variables for further interpretation. LDA, on the other hand, is a supervised method that searches for the canonical variables (CV) with the maximum separation among different categories; in LDA, the first CV is the direction of the maximum ratio between inter- and intra-class variances. The present work used a portable near-infrared (NIR) spectrometer for the identification and classification of pure and fiber-fortified semolina samples.
The fiber was added to semolina in two different concentrations, and after spectral acquisition, the data were used in PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy associated with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification rates of the samples using LDA were between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analyses such as PCA and LDA, it was possible to verify that NIR combined with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
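A minimal sketch of the PCA-then-LDA workflow described above, using synthetic stand-in spectra rather than the measured NIR data (the library choice, the three-class structure, and all parameter values are illustrative assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for NIR spectra: 60 samples x 100 wavelengths, with
# three classes (e.g., pure, low-fiber, high-fiber) differing in one band.
n_per_class, n_wavelengths = 20, 100
base = np.sin(np.linspace(0, 4 * np.pi, n_wavelengths))
band = np.exp(-((np.arange(n_wavelengths) - 50) ** 2) / 50.0)
X, y = [], []
for label, shift in enumerate([0.0, 0.5, 1.0]):
    spectra = base + shift * band
    X.append(spectra + 0.1 * rng.standard_normal((n_per_class, n_wavelengths)))
    y += [label] * n_per_class
X, y = np.vstack(X), np.array(y)

# Unsupervised reduction to latent variables, then supervised classification.
scores = PCA(n_components=5).fit_transform(X)
cv_acc = cross_val_score(LinearDiscriminantAnalysis(), scores, y, cv=5).mean()
```

Reducing the spectra to a few principal-component scores before LDA mirrors the calibration/cross-validation scheme the abstract reports, while keeping LDA well conditioned despite the high wavelength count.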

Keywords: chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina

Procedia PDF Downloads 206
7802 Embedded Hardware and Software Design of Omnidirectional Autonomous Robotic Platform Suitable for Advanced Driver Assistance Systems Testing with Focus on Modularity and Safety

Authors: Ondrej Lufinka, Jan Kaderabek, Juraj Prstek, Jiri Skala, Kamil Kosturik

Abstract:

This paper deals with the problem of using autonomous robotic platforms (ARP) for testing advanced driver assistance systems (ADAS) in the automotive industry. Different testing possibilities are already in development, and lately autonomous robotic platforms have begun to be used more and more widely. The autonomous robotic platform discussed in this paper explores the hardware and software design possibilities related to the field of embedded systems. The paper first introduces the problem in general; it then describes the proposed prototype concept and its principles from the embedded hardware and software point of view. It discusses the key features that can be used for the innovation of these platforms (e.g., modularity, omnidirectional movement, common and non-traditional sensors used for localization, synchronization of multiple platforms and cars together, or safety mechanisms). In the end, the possible future development of the project is discussed as well.

Keywords: advanced driver assistance systems, ADAS, autonomous robotic platform, embedded systems, hardware, localization, modularity, multiple robots synchronization, omnidirectional movement, safety mechanisms, software

Procedia PDF Downloads 138
7801 Microarray Data Visualization and Preprocessing Using R and Bioconductor

Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava

Abstract:

Microarrays provide a rich source of data on the molecular workings of cells: each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays in the hope of finding the molecular mechanisms of disease. Bioinformatics analysis is an important part of processing the information embedded in large-scale expression profiling studies and lays the foundation for biological interpretation. A basic, yet challenging, task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open-source and open-development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization, and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and the outcome of each plot.

Keywords: microarray analysis, R language, Affymetrix visualization, Bioconductor

Procedia PDF Downloads 472
7800 Quantified Metabolomics for the Determination of Phenotypes and Biomarkers across Species in Health and Disease

Authors: Miroslava Cuperlovic-Culf, Lipu Wang, Ketty Boyle, Nadine Makley, Ian Burton, Anissa Belkaid, Mohamed Touaibia, Marc E. Surrette

Abstract:

Metabolic changes are one of the major factors in the development of a variety of diseases in various species. The metabolism of agricultural plants is altered following infection with pathogens, sometimes contributing to resistance; at the same time, pathogens use metabolites for infection and progression. In humans, altered metabolism is a hallmark of cancer development, for example. Quantified metabolomics data, combined with other omics or clinical data and analyzed using various unsupervised and supervised methods, can lead to better diagnosis and prognosis. It can also provide information about resistance, as well as contribute knowledge of compounds significant for disease progression or prevention. In this work, different methods for metabolomics quantification and analysis from nuclear magnetic resonance (NMR) measurements, used to investigate disease development in wheat and human cells, are presented. One-dimensional 1H NMR spectra are used extensively for metabolic profiling due to their high reliability, wide range of applicability, speed, trivial sample preparation, and low cost. This presentation describes a new method for metabolite quantification from NMR data that combines alignment of the spectra of standards to the sample spectra, followed by multivariate linear regression optimization of the spectra of assigned metabolites to the samples' spectra. Several different alignment methods were tested, and the multivariate linear regression result was compared with other quantification methods. Quantified metabolomics data can be analyzed in a variety of ways, and we present different clustering methods used for phenotype determination, network analysis providing knowledge about the relationships between metabolites through the metabolic network, as well as biomarker selection providing novel markers.
These analysis methods have been utilized to investigate Fusarium head blight resistance in wheat cultivars and to analyze the effect of estrogen receptor and carbonic anhydrase activation and inhibition on breast cancer cell metabolism. Metabolic changes in the spikelets of the wheat cultivars FL62R1, Stettler, MuchMore, and Sumai3 following Fusarium graminearum infection were explored. Extensive 1D 1H and 2D NMR measurements provided information for detailed metabolite assignment and quantification, leading to possible metabolic markers discriminating the resistance level in wheat subtypes. The quantification data are compared to results obtained using other published methods. Infection-induced metabolic changes in different wheat varieties are discussed in the context of the metabolic network and resistance. Quantitative metabolomics has also been used to investigate the effect of targeted enzyme inhibition in cancer. In this work, the effects of 17β-estradiol and ferulic acid on the metabolism of ER+ breast cancer cells have been compared to their effects on ER- control cells. The effect of carbonic anhydrase inhibitors on the observed metabolic changes resulting from ER activation has also been determined. Metabolic profiles were studied using 1D and 2D metabolomic NMR experiments, combined with the identification and quantification of metabolites, and the annotation of the results is provided in the context of biochemical pathways.
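The regression step can be sketched as a non-negative least-squares fit of standard spectra to a sample spectrum. The synthetic Gaussian peaks, the use of SciPy's `nnls`, and all numeric values below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np
from scipy.optimize import nnls

# Stand-in for the chemical-shift (ppm) axis.
points = np.linspace(0, 10, 500)

def peak(center, width=0.1):
    """Gaussian line shape standing in for an NMR resonance."""
    return np.exp(-((points - center) ** 2) / (2 * width ** 2))

# Columns: (already aligned) reference spectra of three pure standards.
standards = np.column_stack([
    peak(2.0),
    peak(5.0) + 0.5 * peak(5.5),  # a metabolite with two resonances
    peak(8.0),
])
true_conc = np.array([1.5, 0.7, 2.2])
sample = standards @ true_conc + 0.01 * np.random.default_rng(1).standard_normal(500)

# Non-negative multivariate linear regression of standards onto the sample.
est_conc, residual = nnls(standards, sample)
```

Constraining the fitted coefficients to be non-negative reflects the fact that metabolite concentrations cannot be negative, which plain least squares would not guarantee at low signal-to-noise.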

Keywords: metabolic biomarkers, metabolic network, metabolomics, multivariate linear regression, NMR quantification, quantified metabolomics, spectral alignment

Procedia PDF Downloads 333
7799 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification

Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh

Abstract:

Learning from very big datasets is a significant problem for most present-day data mining and machine learning algorithms. MicroRNA (miRNA) data form one of the important big genomic, non-coding datasets representing the genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing miRNA datasets has been a challenging problem for researchers: the number of features relative to the number of samples is high, and the data suffer from class imbalance. A feature selection method is used to select the features best able to distinguish between classes and to eliminate obscuring features. Afterward, a convolutional neural network (CNN) classifier is utilized for the classification of cancer types, employing a genetic algorithm to select optimized hyper-parameters of the CNN. To make the CNN classification process faster, a graphics processing unit (GPU) is recommended for computing the mathematical operations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
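A minimal sketch of the filter-then-classify stage on synthetic data: a univariate filter selects discriminative features before the classifier is fit. A plain logistic regression stands in for the paper's GA-tuned CNN, and the dataset shape, feature counts, and all parameters are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Toy stand-in for the miRNA problem: many more features than are useful,
# far fewer samples than features would suggest.
X, y = make_classification(n_samples=300, n_features=1000, n_informative=20,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# Keep the 50 features with the highest ANOVA F-score, then classify.
pipe = make_pipeline(SelectKBest(f_classif, k=50),
                     LogisticRegression(max_iter=1000))
acc = cross_val_score(pipe, X, y, cv=5).mean()
```

Placing the selector inside the pipeline ensures the feature scores are recomputed on each training fold, avoiding selection bias in the cross-validated accuracy.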

Keywords: cancer classification, feature selection, deep learning, genetic algorithm

Procedia PDF Downloads 107
7798 Fatigue Analysis of Spread Mooring Line

Authors: Chanhoe Kang, Changhyun Lee, Seock-Hee Jun, Yeong-Tae Oh

Abstract:

An offshore floating structure maintains a fixed position under various environmental conditions by means of its mooring system. Environmental conditions, vessel motions, and mooring loads are applied to the mooring lines as dynamic tension. Because the global responses of a mooring system in deep water comprise wave-frequency and low-frequency responses, they should be calculated from time-domain analysis due to their non-linear dynamic characteristics. Taking into account all mooring loads, environmental conditions, and added mass and damping terms at each time step requires considerable computation time and capacity. Thus, provided that reliable fatigue damage can be derived through a reasonable analysis method, it is necessary to reduce the number of analysis cases through sensitivity studies and appropriate assumptions. In this paper, fatigue effects are studied for a spread mooring system connected to an oil FPSO positioned in deep water offshore West Africa. The target FPSO, with 2 Mbbls of storage, has 16 spread mooring lines (4 bundles x 4 lines). Various sensitivity studies are performed for environmental loads, type of response, vessel offsets, mooring position, loading conditions, and riser behavior; each parameter is investigated for its effect on fatigue damage through fatigue analysis. Based on the sensitivity studies, the following results are presented. Wave loads are more dominant in terms of fatigue than the other environmental conditions. The wave-frequency response causes higher fatigue damage than the low-frequency response. A larger vessel offset increases the mean tension and thus results in increased fatigue damage. The external line of each bundle shows the highest fatigue damage, governed by the vessel pitch motion due to swell wave conditions. Among the three loading conditions, the ballast condition has the highest fatigue damage due to higher tension.
Riser damping arising from riser behavior tends to reduce the fatigue damage. The various analysis results obtained from these sensitivity studies can be used as a reference for simplified fatigue analysis of spread mooring lines.
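Fatigue damage from tension cycles of the kind compared in these sensitivity studies is commonly accumulated with the Palmgren-Miner rule. The sketch below uses an illustrative one-slope S-N curve with placeholder coefficients, not values from this study:

```python
def miner_damage(stress_ranges, cycle_counts, a=1.0e12, m=3.0):
    """Cumulative fatigue damage via the Palmgren-Miner rule.

    Uses a one-slope S-N curve N = a * S**(-m), where N is the number of
    cycles to failure at stress range S.  The coefficients `a` and `m` are
    illustrative placeholders; failure is predicted when damage reaches 1.
    """
    damage = 0.0
    for s, n in zip(stress_ranges, cycle_counts):
        n_allowed = a * s ** (-m)  # cycles to failure at this stress range
        damage += n / n_allowed    # fraction of life consumed by this bin
    return damage

# Two stress-range bins, e.g. from rainflow counting of a tension history:
d = miner_damage([100.0, 50.0], [5.0e5, 2.0e6])
```

Summing damage fractions bin by bin is what makes the mean-tension and response-type effects reported above show up directly in the total damage.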

Keywords: mooring system, fatigue analysis, time domain, non-linear dynamic characteristics

Procedia PDF Downloads 330
7797 Development of Basic Patternmaking Using Parametric Modelling and AutoLISP

Authors: Haziyah Hussin, Syazwan Abdul Samad, Rosnani Jusoh

Abstract:

This study is aimed at the automation of basic patternmaking for traditional clothes for the purpose of mass production, using AutoCAD with the AutoLISP feature through the Hazi Attire software. A standard dress form (industrial form) in small (S), medium (M), and large (L) sizes is measured using a full-body scanning machine. The patterns for the clothes are then designed parametrically based on the measured dress form. The Hazi Attire program is used within the framework of AutoCAD to generate the basic patterns of the front bodice, back bodice, front skirt, back skirt, and sleeve block (sloper). Pattern generation is based on the parameters input by the user; in this study, the parameters were determined from the measured size of the dress form. The finalized pattern parameters show that the pattern fits perfectly on the dress form. Since the pattern is generated almost instantly, this proves that, using AutoLISP programming, the manufacturing lead time for the mass production of traditional clothes can be decreased.
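Parametric pattern generation of this kind maps body measurements to pattern points through drafting formulas. The Python toy below sketches the idea; the point names, the formulas, and the ease allowance are invented for illustration and are not the Hazi Attire drafting rules:

```python
def basic_bodice_block(bust, waist, back_length, ease=4.0):
    """Toy parametric sketch of a quarter front-bodice block.

    Measurements and the returned (x, y) coordinates are in cm, with the
    origin at the centre-front neck point and y increasing upward.  All
    formulas are illustrative placeholders.
    """
    half_bust = (bust + ease) / 4.0    # quarter pattern with wearing ease
    half_waist = (waist + ease) / 4.0
    return {
        "neck": (0.0, 0.0),
        "bust_line": (half_bust, -back_length / 2.0),
        "waist_side": (half_waist, -back_length),
        "waist_centre": (0.0, -back_length),
    }

# Size M stand-in: bust 88 cm, waist 68 cm, back length 40 cm.
points = basic_bodice_block(88.0, 68.0, 40.0)
```

Because each point is a pure function of the measurements, regenerating the block for a new size (or a scanned dress form) is immediate, which is the source of the lead-time reduction the study reports.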

Keywords: apparel, AutoLISP, Malay traditional clothes, pattern generation

Procedia PDF Downloads 247
7795 Density Functional Theory (DFT) Study of the Structural Properties and Phase Transitions of ThC and ThN: LDA vs. GGA

Authors: Hamza Rekab Djabri, Salah Daoud

Abstract:

The present paper deals with the computation of the structural and electronic properties of the ThC and ThN compounds using density functional theory within the generalized gradient approximation (GGA) and the local density approximation (LDA). We employ the full-potential linear muffin-tin orbital (FP-LMTO) method as implemented in the LmtART code. The structural parameters were examined in eight different structures: NaCl (B1), CsCl (B2), zinc blende (B3), NiAs (B8), PbO (B10), wurtzite (B4), HCP (A3), and β-Sn (A5). The equilibrium lattice parameter, the bulk modulus, and its pressure derivative are presented for all calculated phases. The calculated ground-state properties are in good agreement with available experimental and theoretical results.
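The equilibrium volume and bulk modulus quoted in studies like this are typically extracted by fitting total-energy-versus-volume points from the DFT code. The sketch below fits a simple parabola to synthetic harmonic data (an illustrative stand-in for LmtART output; production work would normally use a Birch-Murnaghan equation of state), using B = V0 * d²E/dV² at the minimum:

```python
import numpy as np

# Synthetic E(V) points: harmonic well with known V0 and B (arbitrary units).
V0_true, B_true, E0 = 40.0, 2.0, -10.0
volumes = np.linspace(36.0, 44.0, 9)
energies = E0 + 0.5 * (B_true / V0_true) * (volumes - V0_true) ** 2

# Quadratic fit E(V) ~ c2*V^2 + c1*V + c0.
c2, c1, c0 = np.polyfit(volumes, energies, 2)
V0_fit = -c1 / (2.0 * c2)      # volume at the energy minimum
B_fit = V0_fit * 2.0 * c2      # bulk modulus: B = V0 * d2E/dV2
```

The pressure derivative of the bulk modulus reported alongside these quantities requires a higher-order equation-of-state fit, which a parabola cannot supply.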

Keywords: DFT, GGA, LDA, structural properties, ThC, ThN

Procedia PDF Downloads 92
7795 Using Closed Frequent Itemsets for Hierarchical Document Clustering

Authors: Cheng-Jhe Lee, Chiun-Chieh Hsu

Abstract:

Due to the rapid development of the Internet and the increased availability of digital documents, the excess of information on the Internet has led to the information overload problem. In order to solve this problem for effective information retrieval, document clustering in text mining has become a popular research topic. Clustering is the unsupervised classification of data items into groups without the need for training data. Many conventional document clustering methods perform inefficiently on large document collections because they were originally designed for relational databases; they are therefore impractical for real-world document clustering, which requires special handling of high dimensionality and high volume. We propose the FIHC (Frequent Itemset-based Hierarchical Clustering) method, a hierarchical clustering method developed for document clustering, whose intuition is that the documents of each cluster share some common words. FIHC uses such words to cluster documents and builds a hierarchical topic tree. In this paper, we combine the FIHC algorithm with an ontology to address the semantic problem and mine the meaning behind the words in documents. Furthermore, we use closed frequent itemsets instead of all frequent itemsets, which increases efficiency and scalability. The experimental results show that our method is more accurate than well-known document clustering algorithms.
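The distinction between frequent and closed frequent itemsets can be sketched with a brute-force check, suitable only for toy data (FIHC itself relies on efficient frequent-itemset mining; this sketch is illustrative, not the paper's algorithm):

```python
from itertools import combinations

def closed_frequent_itemsets(transactions, min_support):
    """Return {itemset: support} for all CLOSED frequent itemsets.

    A frequent itemset is closed if no proper superset has the same
    support.  Brute force: enumerate every candidate itemset, so only
    usable for small vocabularies.
    """
    items = sorted({i for t in transactions for i in t})
    support = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            s = sum(1 for t in transactions if set(combo) <= set(t))
            if s >= min_support:
                support[frozenset(combo)] = s
    # Drop any frequent itemset whose proper superset has identical support.
    return {iset: s for iset, s in support.items()
            if not any(iset < other and s == support[other] for other in support)}

# Toy "documents" as word sets:
docs = [{"apple", "pie"}, {"apple", "pie", "tart"}, {"apple"}]
closed = closed_frequent_itemsets(docs, min_support=2)
```

Here {pie} is frequent but not closed, because its superset {apple, pie} occurs in exactly the same documents; keeping only closed itemsets removes such redundancy, which is the efficiency and scalability gain the abstract refers to.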

Keywords: FIHC, document clustering, ontology, closed frequent itemsets

Procedia PDF Downloads 389