Search results for: TCP (Tool Center Point)
11009 A Review of Applying Serious Games on Learning
Authors: Carlos Oliveira, Ulrick Pimentel
Abstract:
Digital games have conquered a growing space in the lives of children, adolescents and adults. In this perspective, the use of this resource has proven to be an important strategy for facilitating the learning process. This research is a literature review on the use of serious games in teaching, which describes the characteristics of these games, the benefits and possible harms that this resource can produce, as well as possible methods for evaluating the effectiveness of this resource in teaching. The results point out that serious games have significant potential as a tool for instruction. However, their effectiveness in terms of learning outcomes is still poorly studied, mainly due to the complexity involved in evaluating intangible measures.
Keywords: serious games, learning, application, literature review
Procedia PDF Downloads 307
11008 Study on the Stages of Knowledge Flow in Central Libraries of Tehran Universities by the Pattern of American Productivity & Quality Center
Authors: Amir Reza Asnafi, Ehsan Tajabadi, Mohsen Hajizeinolabedini
Abstract:
The purpose of this study is to identify the stages of knowledge flow in the central libraries of Tehran universities according to the pattern of the American Productivity & Quality Center (APQC). The present study is an applied, descriptive survey in terms of its purpose and methodology, and the APQC framework was used for data collection. The study population consists of the managers and department supervisors of the central libraries of the public universities of Tehran belonging to the Ministry of Science, Research and Technology: the central libraries of Al-Zahra University, Amir Kabir, Tarbiat Modarres, Tehran, Khajeh Nasir Toosi University of Technology, Shahed, Sharif, Shahid Beheshti, Allameh Tabataba'i University, and the Iran University of Science and Technology. Due to the limited size of the population, no sampling was performed and a census was conducted instead. The study showed that, across the seven dimensions of knowledge flow in the APQC model, these libraries are far from the desirable level, and many activities in the field of knowledge flow are needed to reach the ideal point; suggestions for reaching the desired level are therefore made in this study. A one-sample t-test showed that the libraries are at a medium level in the dimensions of knowledge creation, identification, and use, and at a poor level in knowledge acquisition, review, sharing, and access. A MANOVA (multivariate analysis of variance) showed no significant difference in the dimensions of knowledge flow between these libraries: the status of knowledge flow is at the same level across them, except for the knowledge creation dimension, which differs slightly.
Keywords: knowledge flow, knowledge management, APQC, Tehran's academic university libraries
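The one-sample t-test used above compares a group mean against a fixed reference value. A minimal sketch of the statistic, with hypothetical 5-point Likert scores (not the study's data) tested against the scale midpoint:

```python
import math

def one_sample_t(data, mu0):
    """Compute the one-sample t statistic for H0: mean == mu0."""
    n = len(data)
    mean = sum(data) / n
    # Sample variance with Bessel's correction (n - 1 denominator)
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    se = math.sqrt(var / n)          # standard error of the mean
    return (mean - mu0) / se

# Hypothetical 5-point Likert scores for one knowledge-flow dimension
scores = [2, 3, 2, 4, 3, 2, 3, 2]
t = one_sample_t(scores, mu0=3)      # test against the scale midpoint
print(round(t, 3))
```

A negative t with a small p-value would indicate the dimension sits significantly below the midpoint, i.e. a "poor level" in the study's terms.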
Procedia PDF Downloads 157
11007 Fixed Point Iteration of a Damped and Unforced Duffing's Equation
Authors: Paschal A. Ochang, Emmanuel C. Oji
Abstract:
The Duffing equation is a second-order system that is very important because such systems are fundamental to the behaviour of higher-order systems and have applications in almost all fields of science and engineering. In biology, it is useful in modelling plant stem dependence and natural frequency and in Brain Crash Analysis (BCA); in engineering, it is useful in the study of damping in indoor construction and traffic lights; and to the meteorologist it is useful in predicting weather conditions. However, most problems that occur in real life are nonlinear in nature and may have no analytical solutions except approximations or simulations, so finding an exact explicit solution may in general be complicated and sometimes impossible. Therefore, we aim to find out whether it is possible to obtain an analytical fixed point of the nonlinear ordinary differential equation using a fixed-point analytical method. We start by exposing the scope of the Duffing equation and other related work on it. With a major focus on fixed points and fixed-point iterative schemes, we tried different iterative schemes on the Duffing equation. We were able to identify that one can only obtain fixed points for a damped Duffing equation and not for the undamped Duffing equation, because the cubic nonlinearity term is the determining factor. We finally arrived at results identifying the stability of an equation that is damped, forced, and second order in nature. Generally, in this research, we approximate the solution of the Duffing equation by converting it to a system of first-order ordinary differential equations and using a fixed-point iterative approach.
This approach shows that, for different versions of the damped Duffing equation, fixed points can be found, so the order of computation and the running time of applied software in all fields using the Duffing equation will be reduced.
Keywords: damping, Duffing's equation, fixed point analysis, second order differential, stability analysis
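The conversion-plus-iteration idea can be illustrated on the damped, unforced Duffing equation x'' + δx' + αx + βx³ = 0. A minimal sketch (not the paper's scheme; the relaxation parameter lam and the double-well coefficients are illustrative choices):

```python
def duffing_equilibrium(alpha, beta, delta, x0, lam=0.5, tol=1e-10, max_iter=200):
    """Find an equilibrium of x'' + delta*x' + alpha*x + beta*x**3 = 0.

    Written as a first-order system (x' = v, v' = -delta*v - alpha*x - beta*x**3),
    equilibria satisfy v = 0 and alpha*x + beta*x**3 = 0.  We locate a root by
    the relaxed fixed-point iteration x <- x - lam*(alpha*x + beta*x**3).
    """
    x = x0
    for _ in range(max_iter):
        x_new = x - lam * (alpha * x + beta * x ** 3)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Double-well Duffing (alpha = -1, beta = 1): equilibria at x = 0 and x = +/-1.
root = duffing_equilibrium(alpha=-1.0, beta=1.0, delta=0.3, x0=1.5)
print(root)  # converges to the equilibrium near x = 1
```

Note that the damping coefficient δ does not move the equilibria themselves; it determines whether trajectories of the system settle onto them, which is why fixed points are meaningful only in the damped case.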
Procedia PDF Downloads 289
11006 Throughput of Point Coordination Function (PCF)
Authors: Faisel Eltuhami Alzaalik, Omar Imhemed Alramli, Ahmed Mohamed Elaieb
Abstract:
IEEE 802.11 defines two MAC modes: the distributed coordination function (DCF) and the point coordination function (PCF). The first sub-layer of the MAC is the DCF, in which a contention algorithm is used to provide access for all traffic. The PCF is the second sub-layer, used to provide contention-free service; it sits above the DCF and uses DCF features to guarantee access for its users. This paper reviews some of the papers and research published on this technology and briefly discusses the DCF. The PCF was simulated using the network simulator NS2, and the throughput of a transmitter system using this function was obtained.
Keywords: DCF, PCF, throughput, NS2
Procedia PDF Downloads 573
11005 The Use of Online Courses as a Tool for Teaching in Education for Youth and Adults
Authors: Elineuda Do Socorro Santos Picanço Sousa, Ana Kerlly Souza da Costa
Abstract:
This paper presents an analysis of the information society as a plural, inclusive and participatory society, in which it is necessary to give all citizens, especially young people, the right skills so that they can understand and use information through contemporary technologies, as well as carry out critical analysis, using and producing information and all sorts of messages and/or informational language codes. This conviction inspired this article, whose aim is to present current trends in the use of technology in distance education applied as an alternative and/or supplement to classroom teaching for youth and adults, in concepts and actions, seeking to contribute to its development in the state of Amapá and, specifically, at the Professional Teaching Center of Amapá Professor Josinete Oliveira Barroso (CEPAJOB).
Keywords: youth and adults education, EaD (distance education), professional education, online courses, CEPAJOB
Procedia PDF Downloads 640
11004 Study of Behavior Tribological Cutting Tools Based on Coating
Authors: A. Achour, L. Chekour, A. Mekroud
Abstract:
Tribology, the science of lubrication, friction and wear, plays an important role as a "crossroads" science driven by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest: it covers all the sciences that deal with contact between two loaded solids in relative motion, and is thus one of the many intersections of more clearly established disciplines such as solid and fluid mechanics, rheology, heat transfer, materials science and chemistry. Its experimental approach is based on physics and on the processing of signals and images. The optimization of cutting-tool operating conditions must contribute significantly to the development and productivity of advanced automated machining techniques, because their implementation requires sufficient knowledge of how the process behaves, in particular the evolution of tool wear. In addition, technological advances have spread the use of very hard, refractory materials of difficult machinability, requiring highly resistant tool materials. In this study, we present the wear behavior of a machining tool during a roughing operation as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the tool's cutting edges using current techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.
Keywords: friction, wear, tool, cutting
Procedia PDF Downloads 329
11003 Hunting Ban, Unfortunate Decisions for the Bear Population in Romania
Authors: Alexandru Gridan, Georgeta Ionescu, Ovidiu Ionescu, Ramon Jurj, George Sirbu, Mihai Fedorca
Abstract:
The brown bear population in Romania numbers approximately 7300-7600 individuals, which is estimated to be 3000 individuals over the ecological carrying capacity. The Habitats Directive imposed certain protection rules on European Union (EU) Member States with brown bear populations; these nevertheless allow countries like Sweden, Croatia, Slovakia and Estonia to use hunting as a management tool, harvesting up to 10% of the surplus bear population annually. From Romania's accession to the EU until 2016, active conservation management contributed to maintaining the largest and most genetically diverse brown bear population in Europe. Importantly, there was good coexistence between people and bears and a low level of human-bear conflict. After social pressure and campaigning by some non-governmental organisations citing issues with monitoring, the environment minister decided in September 2016 to stop the use of hunting as a management tool for bears. Against this background, this paper provides a set of recommendations to resolve the current conflict in Romania. These include the need for collaborative decision-making to reduce conflicts between stakeholders and mechanisms to reduce current human-bear conflicts, which have increased by 50 percent in the past year.
Keywords: bear, bear population, bear management, wildlife conflict
Procedia PDF Downloads 179
11002 Generalized Central Paths for Convex Programming
Authors: Li-Zhi Liao
Abstract:
The central path has played a key role in interior point methods. However, the central path may fail to converge even in some convex programming problems with linear constraints. In this paper, generalized central paths are introduced for convex programming. One advantage of the generalized central paths is that they always converge to some optimal solution of the convex programming problem for any initial interior point. Some additional theoretical properties of the generalized central paths are also reported.
Keywords: central path, convex programming, generalized central path, interior point method
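For readers unfamiliar with the central path: it is the curve of minimizers of the log-barrier subproblem as the barrier weight μ shrinks to zero. A minimal one-dimensional sketch (an illustrative toy problem, not the paper's generalized construction): minimize x² subject to x ≥ 1, whose barrier function is x² - μ·log(x - 1) and whose central path has the closed form x(μ) = (1 + √(1 + 2μ))/2 → 1.

```python
import math

def barrier_newton(mu, x=2.0, tol=1e-12, max_iter=100):
    """Minimize f(x) = x**2 - mu*log(x - 1), the log-barrier function for
    min x**2 subject to x >= 1.  The minimizers x(mu) trace the central path."""
    for _ in range(max_iter):
        grad = 2 * x - mu / (x - 1)
        hess = 2 + mu / (x - 1) ** 2
        x_new = x - grad / hess
        if x_new <= 1:                # stay strictly inside the feasible region
            x_new = (x + 1) / 2
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Follow the central path as mu -> 0; x(mu) approaches the constrained optimum x* = 1.
for mu in [1.0, 0.1, 0.01, 0.001]:
    print(mu, barrier_newton(mu))
```

In this well-behaved example the path converges; the paper's point is that convergence can fail in more general convex problems, motivating the generalized paths.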
Procedia PDF Downloads 323
11001 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges: increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of such facilities use a large amount of energy to dissipate the heat generated inside the servers, so developing new cooling techniques or perfecting existing ones would be a great advance for this industry. A matrix of temperature sensors distributed in the structure of each server would provide the data needed to obtain its temperature profile instantly. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high and expensive. Therefore, less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics (CFD) simulations are replaced by simpler and faster finite element method (FEM) simulations, which solve the Burgers' equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the resulting Burgers' equation are the key to obtaining results of greater or lesser accuracy, regardless of the characteristic truncation error.
Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
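The backward/central discretization choices mentioned above can be sketched on the viscous Burgers equation u_t + u·u_x = ν·u_xx with a simple finite-difference scheme (an illustrative sketch, not the paper's FEM implementation; grid sizes and ν are arbitrary choices):

```python
import math

def burgers_step(u, dx, dt, nu):
    """One explicit time step of viscous Burgers: u_t + u*u_x = nu*u_xx.
    Backward (upwind, valid for u >= 0) differencing for the convective term,
    central differencing for the diffusive term."""
    n = len(u)
    new = u[:]
    for i in range(1, n - 1):
        conv = u[i] * (u[i] - u[i - 1]) / dx                    # backward difference
        diff = nu * (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2  # central difference
        new[i] = u[i] + dt * (diff - conv)
    return new  # endpoints held fixed (Dirichlet boundaries)

# Decaying hump on [0, 1]: velocities stay positive, so backward differencing is upwind
n, dx, dt, nu = 51, 0.02, 0.0001, 0.01
u = [math.sin(math.pi * i * dx) for i in range(n)]
for _ in range(1000):
    u = burgers_step(u, dx, dt, nu)
print(max(u))  # the peak decays below its initial value of 1.0
```

The choice of backward versus central differencing for the first derivative is exactly the accuracy/stability trade-off the abstract refers to: upwind is more robust but only first-order accurate.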
Procedia PDF Downloads 165
11000 Criterion-Referenced Test Reliability through Threshold Loss Agreement: Fuzzy Logic Analysis Approach
Authors: Mohammad Ali Alavidoost, Hossein Bozorgian
Abstract:
Criterion-referenced tests (CRTs) are designed to measure student performance against a fixed set of predetermined criteria or learning standards, so their reliability cannot be based on internal consistency. Threshold loss agreement is one way to calculate the reliability of CRTs; however, the classification of masters and non-masters in such agreement is determined by the threshold point. The problem is that if the threshold point changes even minutely, the classification of masters and non-masters may change drastically, altering the reliability results. Therefore, in this study a fuzzy logic approach is employed as a remedial procedure for data analysis to obviate the threshold point problem. Forty-one Iranian students between 20 and 30 years old were selected. A quantitative approach was used to address the research questions; a quasi-experimental design was utilized, since the selection of the participants was not randomized. With the fuzzy logic approach, the threshold point is more stable during the analysis, resulting in rather constant reliability results and more precise assessment.
Keywords: criterion-referenced tests, threshold loss agreement, threshold point, fuzzy logic approach
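The core idea of replacing a crisp cut score with a graded membership can be sketched with a simple linear membership function (a minimal illustration of the fuzzy approach in general; the cut score, band width, and scores are hypothetical, not the study's values):

```python
def mastery_membership(score, cut, width):
    """Fuzzy degree of 'master' membership in [0, 1].

    Crisp classification flips from 0 to 1 exactly at the cut score; here
    scores within +/- width of the cut get a graded membership instead, so a
    tiny shift of the threshold no longer flips whole groups of examinees.
    """
    lo, hi = cut - width, cut + width
    if score <= lo:
        return 0.0
    if score >= hi:
        return 1.0
    return (score - lo) / (hi - lo)   # linear ramp across the fuzzy band

# Hypothetical cut score of 60 with a fuzzy band of +/- 5 points
for s in [50, 57, 60, 63, 70]:
    print(s, mastery_membership(s, cut=60, width=5))
```

An agreement index computed from these graded memberships changes smoothly as the cut score moves, which is what stabilizes the reliability estimate.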
Procedia PDF Downloads 366
10999 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction
Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto
Abstract:
Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality abstraction of data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool described here is used by data abstractors to audit data and assess the accuracy and inter-rater reliability of abstraction performed for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get With The Guidelines. Methodology: The audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registry. It is used internally across 20 hospital systems, providing abstraction and audit services for them. Results: In 50 PCI registry cases in 2021, the audit tool showed inter-rater reliability and accuracy greater than 95% for the PCI registry. Conclusion: The tool is used internally for surgical societies and across hospital systems. It enables an abstractor to be assessed by an external abstractor and includes all data dictionary fields for each registry.
Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data
Procedia PDF Downloads 103
10998 Optimization of Surface Roughness by Taguchi’s Method for Turning Process
Authors: Ashish Ankus Yerunkar, Ravi Terkar
Abstract:
This study aimed at evaluating the best process environment that could simultaneously satisfy the requirements of both quality and productivity, with special emphasis on reducing cutting-tool flank wear, because reduction in flank wear increases tool life. The predicted optimal setting ensured minimization of surface roughness. This paper focuses on the analysis of the optimum cutting conditions for obtaining the lowest surface roughness when turning SCM 440 alloy steel by the Taguchi method. The experiment was designed using the Taguchi method: 18 experiments were designed and conducted, and the results were analyzed using ANOVA. The Taguchi method showed that the depth of cut plays a significant role in producing lower surface roughness, followed by feed, while cutting speed has a lesser role. Machine-tool vibrations and tool chatter are other factors that may contribute to poor surface roughness; such factors were ignored in the analysis. The inferences from this method will be useful to other researchers for similar studies and may be vital for further research on tool vibrations, cutting forces, etc.
Keywords: surface roughness (Ra), machining, dry turning, Taguchi method, turning process, ANOVA method, Mahr perthometer
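Taguchi analysis ranks parameter levels by their signal-to-noise (S/N) ratio; for a response like surface roughness Ra, which should be as small as possible, the standard "smaller is better" form applies. A minimal sketch (the Ra values are hypothetical, not the study's measurements):

```python
import math

def sn_smaller_is_better(measurements):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response
    such as surface roughness Ra: SN = -10*log10(mean(y**2))."""
    mean_sq = sum(y * y for y in measurements) / len(measurements)
    return -10 * math.log10(mean_sq)

# Hypothetical Ra values (micrometres) for two parameter settings
run_a = [1.8, 2.0, 1.9]
run_b = [1.2, 1.3, 1.25]
print(sn_smaller_is_better(run_a))
print(sn_smaller_is_better(run_b))  # lower roughness gives the higher S/N
```

The level combination with the highest mean S/N across the orthogonal-array runs is taken as the optimum, and ANOVA then apportions how much each factor (depth of cut, feed, speed) contributes.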
Procedia PDF Downloads 366
10997 Model of Learning Center on OTOP Production Process Based on Sufficiency Economic Philosophy
Authors: Chutikarn Sriviboon, Witthaya Mekhum
Abstract:
The purposes of this research were to analyze and evaluate the success factors of the OTOP production process in order to develop a learning center on the OTOP production process based on the Sufficiency Economy Philosophy for sustainable quality of life. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. All were interviewed on three main parts. Part 1 concerned the production process, including (1) production, (2) product development, (3) community strength, (4) marketing possibility, and (5) product quality. Part 2 evaluated the relevant success factors, including (1) analysis of the success factors, (2) evaluation of the strategy based on the Sufficiency Economy Philosophy, and (3) the model of a learning center on the OTOP production process based on the Sufficiency Economy Philosophy for sustainable quality of life. The results showed that production did not affect the environment, with potential for continuing standard-quality production using domestic raw materials. Regarding product and community strength over the past year, there was no appropriate packaging showing product identity according to global market standards; the producers needed training on packaging, especially for food and drink products. Regarding product quality and specification, the products were certified by the local OTOP standard; a responsible organization should help the uncertified producers pass the standard. However, there was a problem of food contamination hazardous to consumers, so producers should cooperate with the government sector or educational institutes involved with food processing to reach the FDA standard. The small-group discussions showed that the community expected higher education and a better standard of living; reported problems included informal debt and drugs in the community.
Eight steps were identified in developing the model of a learning center on the OTOP production process based on the Sufficiency Economy Philosophy for sustainable quality of life.
Keywords: production process, OTOP, sufficiency economic philosophy, learning center
Procedia PDF Downloads 375
10996 A Design of Elliptic Curve Cryptography Processor based on SM2 over GF(p)
Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang
Abstract:
Data encryption is the foundation of today's communication, and improving the speed of encryption and decryption is a problem scholars continually work on. In this paper, we propose an elliptic curve cryptographic processor architecture based on the SM2 prime field. For the hardware implementation, we optimized the algorithms at different stages of the structure. For finite-field modular operations, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipelined implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used for conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-stage parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-operations to point addition and point doubling to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms; each 256-bit scalar multiplication operation took 0.275 ms, a throughput 32 times that of a CPU (dual-core ARM Cortex-A9).
Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication
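The point addition, point doubling, and double-and-add scalar multiplication that the processor parallelizes can be sketched in software on a tiny textbook curve (a toy curve over GF(17), not the 256-bit SM2 curve, and in affine rather than Jacobian coordinates):

```python
P = 17                       # toy prime; SM2 uses a 256-bit recommended prime
A, B = 2, 2                  # curve y^2 = x^3 + A*x + B over GF(P)
O = None                     # point at infinity (group identity)

def point_add(p1, p2):
    """Affine point addition; SM2 hardware would use Jacobian projective
    coordinates to avoid the per-operation modular inverse."""
    if p1 is O:
        return p2
    if p2 is O:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                            # p2 = -p1
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, P - 2, P) % P   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, P - 2, P) % P          # chord slope
    x3 = (s * s - x1 - x2) % P
    y3 = (s * (x1 - x3) - y1) % P
    return (x3, y3)

def scalar_mult(k, pt):
    """Left-to-right double-and-add scalar multiplication k*pt."""
    result = O
    for bit in bin(k)[2:]:
        result = point_add(result, result)      # double
        if bit == '1':
            result = point_add(result, pt)      # add
    return result

G = (5, 1)                   # generator; this toy group has order 19
print(scalar_mult(19, G))    # 19*G = point at infinity
```

In each double-and-add iteration the doubling and the conditional addition are independent enough to overlap, which is the opportunity the paper's three-stage parallel structure and pre-operations exploit.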
Procedia PDF Downloads 95
10995 Study on Errors in Estimating the 3D Gaze Point for Different Pupil Sizes Using Eye Vergences
Authors: M. Pomianek, M. Piszczek, M. Maciejewski
Abstract:
Binocular eye tracking technology is increasingly being used in industry, entertainment and marketing analysis. In virtual reality, eye tracking systems are already the basis for user interaction with the environment. In such systems, high accuracy in determining the user's fixation point is very important due to the specificity of virtual reality head-mounted displays (HMDs). Often, however, unknown errors occur in the eye tracking technology used, as well as errors resulting from the positioning of the devices relative to the user's eyes. But can the virtual environment itself influence estimation errors? The paper presents mathematical analyses and empirical studies of fixation-point determination and of the errors resulting from changes in pupil size in response to the intensity of the displayed scene. The article covers both static laboratory tests and tests on real users. Based on the results, optimization solutions are proposed to reduce gaze estimation errors. The studies show that errors in estimating the fixation point can be minimized both by improving the pupil-positioning algorithm in the video image and by using more precise methods to calibrate the eye tracking system in three-dimensional space.
Keywords: eye tracking, fixation point, pupil size, virtual reality
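Vergence-based 3D gaze estimation intersects the two eyes' gaze rays; because noisy rays rarely intersect exactly, a common estimate is the midpoint of the shortest segment between them. A minimal sketch (standard closest-point-between-lines geometry; the eye positions and units are illustrative, not the paper's setup):

```python
def gaze_point(origin_l, dir_l, origin_r, dir_r):
    """Estimate the 3D fixation point as the midpoint of the shortest segment
    between the left and right gaze rays."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def walk(p, t, d): return [pi + t * di for pi, di in zip(p, d)]
    w = sub(origin_l, origin_r)
    a, b, c = dot(dir_l, dir_l), dot(dir_l, dir_r), dot(dir_r, dir_r)
    d, e = dot(dir_l, w), dot(dir_r, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("gaze rays are parallel; no vergence point")
    t_l = (b * e - c * d) / denom      # parameter along the left ray
    t_r = (a * e - b * d) / denom      # parameter along the right ray
    p_l = walk(origin_l, t_l, dir_l)
    p_r = walk(origin_r, t_r, dir_r)
    return [(x + y) / 2 for x, y in zip(p_l, p_r)]

# Eyes 64 mm apart, both aimed at a target 500 mm straight ahead
pt = gaze_point([-32.0, 0.0, 0.0], [32.0, 0.0, 500.0],
                [32.0, 0.0, 0.0], [-32.0, 0.0, 500.0])
print(pt)
```

A pupil-size-dependent shift of the detected pupil center perturbs the ray directions, and this construction shows why small angular errors translate into large depth errors at far fixation distances.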
Procedia PDF Downloads 130
10994 NABERS Indoor Environment - a Rating Tool to Benchmark the IEQ of Australian Office Commercial Buildings
Authors: Kazi Hossain
Abstract:
The National Australian Built Environment Rating System (NABERS) is the key industry standard for measuring and benchmarking the environmental performance of existing buildings in Australia. Developed and run by the New South Wales government, NABERS measures the operational efficiency of different types of buildings using a set of tools that provide an easy-to-understand graphical rating outcome ranging from 0 to 6 stars. This set of tools includes NABERS IE, which enables tenants or building managers to benchmark their building's indoor environment quality against the national market. Launched in 2009, the number of NABERS IE ratings steadily increased from 10 certified ratings in 2011 to 43 in 2013, followed by a massive uptake of over 50 ratings in 2014 alone, bringing the total to over 100. This paper outlines the methodology used to create this tool, a statistical overview of the tool, and the driving factors that motivate building owners and managers to use it every year to rate their buildings.
Keywords: acoustic comfort, indoor air quality, indoor environment, NABERS, National Australian Built Environment Rating System, performance rating, rating system, thermal comfort, ventilation effectiveness, visual comfort
Procedia PDF Downloads 555
10993 A Compressor Map Optimizing Tool for Prediction of Compressor Off-Design Performance
Authors: Zhongzhi Hu, Jie Shen, Jiqiang Wang
Abstract:
A high-precision aeroengine model is needed when developing the engine control system. Compared with the other main components, the axial compressor is the most challenging component to simulate. In this paper, a compressor map optimizing tool based on the introduction of a modifiable β function is developed for FWorks (FADEC Works). Three parameters (d, density; f, fitting coefficient; k₀, slope of the line β=0) are introduced to make the β function modifiable. The traditional β function and the modifiable β function are compared for a certain type of compressor. The interpolation errors show that both methods meet the modeling requirements, while the modifiable β function can predict compressor performance more accurately in the areas of the compressor map that users are interested in.
Keywords: beta function, compressor map, interpolation error, map optimization tool
Procedia PDF Downloads 264
10992 Stagnation Point Flow Over a Stretching Cylinder with Variable Thermal Conductivity and Slip Conditions
Authors: M. Y. Malik, Farzana Khan
Abstract:
In this article, we discuss the behavior of a viscous fluid near the stagnation point over a stretching cylinder with variable thermal conductivity; the effects of slip conditions are also considered. Thermal conductivity is taken to be a linear function of temperature. Using the homotopy analysis method and the Fehlberg method, we compare the graphical results for both the momentum and energy equations. The effects of different parameters on the velocity and temperature fields are shown graphically.
Keywords: slip conditions, stretching cylinder, heat generation/absorption, stagnation point flow, variable thermal conductivity
Procedia PDF Downloads 420
10991 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
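A common first step in the feature-extraction stage described above is detecting planar elements (walls, floors) in the point cloud, often via RANSAC. A minimal sketch (a generic RANSAC plane detector on synthetic data, not the paper's algorithm; thresholds and the synthetic wall are illustrative):

```python
import random

def fit_plane(p1, p2, p3):
    """Plane through three points: unit normal n and offset d with n . x = d."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:
        return None                        # degenerate (collinear) sample
    n = [c / norm for c in n]
    return n, sum(a * b for a, b in zip(n, p1))

def ransac_plane(points, threshold=0.02, iterations=300, seed=1):
    """Detect the dominant plane (e.g. a wall) in a point cloud by RANSAC."""
    rng = random.Random(seed)
    best, best_count = None, 0
    for _ in range(iterations):
        model = fit_plane(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        count = sum(1 for p in points
                    if abs(sum(a * b for a, b in zip(n, p)) - d) < threshold)
        if count > best_count:
            best, best_count = model, count
    return best, best_count

# Synthetic scan: a noisy wall in the plane x = 1 plus scattered clutter (metres)
rng = random.Random(0)
wall = [[1 + rng.gauss(0, 0.005), rng.uniform(0, 4), rng.uniform(0, 3)]
        for _ in range(200)]
clutter = [[rng.uniform(0, 3) for _ in range(3)] for _ in range(50)]
(n, d), inliers = ransac_plane(wall + clutter)
print(n, d, inliers)   # normal close to (1, 0, 0), offset close to 1
```

In a full pipeline, each detected plane would then be classified (wall, floor, ceiling) and converted into a parametric BIM element carrying its thickness, extent, and semantic attributes.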
Procedia PDF Downloads 59
10990 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm
Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian
Abstract:
The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with the business model as its basis, and academia, with its focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization, and a case study is discussed to illustrate this methodology. An industrial case study is also described, in which the number of test cases is so large that they have to be grouped into test suites; in such situations, a genetic algorithm proposed by us can be used to reconfigure these test suites in each cycle of regression testing. A comparison is made between a proprietary tool and an open source tool using the above-mentioned metrics. Our approach is clarified through several tables.
Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, Selenium tool
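The APFD metric named in the keywords is the standard fitness measure for test-case prioritization: it rewards orderings that expose faults early. A minimal sketch of the textbook formula (the fault positions below are hypothetical, not the paper's case-study data):

```python
def apfd(fault_positions, n_tests):
    """Average Percentage of Faults Detected for a prioritized test order.

    fault_positions[i] is the 1-based position, in the executed order, of the
    first test case that detects fault i.  APFD = 1 - sum(TF)/(n*m) + 1/(2n),
    where n is the number of tests and m the number of faults.
    """
    m = len(fault_positions)
    n = n_tests
    return 1 - sum(fault_positions) / (n * m) + 1 / (2 * n)

# Hypothetical suite of 5 tests detecting 4 faults at positions 1, 2, 3, 4
print(apfd([1, 2, 3, 4], n_tests=5))   # 1 - 10/20 + 0.1 = 0.6

# A worse ordering that finds every fault only at the last test
print(apfd([5, 5, 5, 5], n_tests=5))   # 1 - 20/20 + 0.1 = 0.1
```

A genetic algorithm for suite reconfiguration can use APFD (estimated from the previous cycle's fault history) directly as its fitness function, favoring orderings with values closer to 1.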
Procedia PDF Downloads 433
10989 Realization of Soliton Phase Characteristics in 10 Gbps, Single Channel, Uncompensated Telecommunication System
Authors: A. Jawahar
Abstract:
In this paper, the dependence of soliton pulses on phase in a 10 Gbps, single-channel, dispersion-uncompensated telecommunication system is studied. The characteristic feature of periodic soliton interaction was noted at the interaction point (I = 6202.5 km) in one collision length of L = 12405.1 km. The interaction point is located for the 10 Gbps system with an initial relative soliton spacing of q₀ = 5.28 using perturbation theory. It is shown that when two in-phase solitons are launched, they interact at the point I = 6202.5 km, but the interaction can be restricted by introducing an initial phase difference. As the phase of the input solitons increases, the deviation of the soliton pulses at I also increases. We have successfully demonstrated this effect in a telecommunication setup in terms of the quality factor (Q), where Q = 0 for in-phase solitons. Q was noted to be 125.9, 38.63, 47.53, 59.60, 161.37, and 78.04 for phases of 10, 20, 30, 45, 60 and 90 degrees, respectively, at the interaction point I.
Keywords: soliton interaction, initial relative spacing, phase, perturbation theory, telecommunication system
Procedia PDF Downloads 469
10988 Learning Resource Management of the Royal Court Courtier in the Reign of King Rama V
Authors: Chanaphop Vannaolarn, Weena Eiamprapai
Abstract:
Thai noblewomen and ladies-in-waiting in the era of King Rama V stayed only inside the palace. King Rama V decided to build Dusit Palace in 1897, and another palace called Suan Sunandha in 1900, after his royal visit to Europe. This palace remained the residence of the noblewomen of the court until the change of political system in 1932. Studying the noblewomen of the palace can educate people about how the nation was affected by Western civilization in terms of architecture, food, dress, and recreation. It is a way of developing modern society through the great historical value of the past. A learning center about noblewomen would not only provide knowledge but also create bonds and patriotic feeling among Thais.
Keywords: noblewomen, palace, management, learning center
Procedia PDF Downloads 360
10987 Understanding Nanocarrier Efficacy in Drug Delivery Systems Using Molecular Dynamics
Authors: Maedeh Rahimnejad, Bahman Vahidi, Bahman Ebrahimi Hoseinzadeh, Fatemeh Yazdian, Puria Motamed Fath, Roghieh Jamjah
Abstract:
Introduction: The intensive labor and high cost of developing new vehicles for controlled drug delivery highlight the need for a change in their discovery process. Computational models can be used to accelerate experimental steps and control the high cost of experiments. Methods: In this work, to better understand the interaction of an anti-cancer drug and its nanocarrier with the cell membrane, we performed molecular dynamics simulations using NAMD. We chose paclitaxel as the drug molecule and dipalmitoylphosphatidylcholine (DPPC), a natural phospholipid, as the nanocarrier. Results: The center of mass (COM) distance between the molecules and the van der Waals interaction energy close to the cell membrane were analyzed. Furthermore, the simulation results for the interaction of paclitaxel alone with the cell membrane were compared with those for the drug-loaded DPPC nanocarrier interacting with the cell membrane. Discussion: The molecular dynamics (MD) analysis showed that not only is the interaction energy between the nanocarrier and the cell membrane low, but the center-of-mass distance between the nanocarrier and the cell membrane also decreases during the interaction; the loaded nanocarrier therefore interacts significantly better with the cell membrane than the drug alone.
Keywords: anti-cancer drug, center of mass, interaction energy, molecular dynamics simulation, nanocarrier
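The COM analysis described above reduces each molecular group to a single point, sum(m_i r_i) / sum(m_i), and tracks the distance between the two points over the trajectory. A minimal sketch of that computation follows; the masses and coordinates are invented toy values, not data from the NAMD simulation.

```python
# Hedged sketch: center-of-mass (COM) distance between two atom groups,
# the quantity tracked between nanocarrier and membrane in an MD trajectory.
# Masses (amu) and coordinates (angstroms) below are made up for illustration.
import math

def center_of_mass(masses, coords):
    """COM = sum(m_i * r_i) / sum(m_i) for 3D coordinates."""
    total = sum(masses)
    return tuple(sum(m * r[k] for m, r in zip(masses, coords)) / total
                 for k in range(3))

def com_distance(groups):
    """Distance between the COMs of two (masses, coords) groups."""
    (m1, r1), (m2, r2) = groups
    return math.dist(center_of_mass(m1, r1), center_of_mass(m2, r2))

# Two toy "molecules" standing in for drug and membrane patch
drug = ([12.0, 12.0, 16.0], [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 1.2, 0.0)])
membrane_patch = ([14.0, 12.0], [(5.0, 5.0, 5.0), (6.0, 5.0, 5.0)])
print(round(com_distance([drug, membrane_patch]), 3))
```

In practice, this is evaluated per frame of the trajectory, and a decreasing COM distance indicates the carrier approaching the membrane.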
Procedia PDF Downloads 294
10986 Study on the Model of Microscopic Contact Parameters for Grinding M300 Using Elastic Abrasive Tool
Authors: Wu Xiaojun, Liu Ruiping, Yu Xingzhan, Wu Qian
Abstract:
In precision grinding, an elastic matrix ball offers higher processing efficiency and better surface quality than traditional grinding. The varied conditions under which an elastic abrasive tool contacts a curved surface lead to irregular wear of the abrasive, and the machining state of the tool becomes complicated; there is no theoretical account of how the process parameters affect grinding accuracy. Owing to its corrosion resistance, wear resistance, and related properties, M300 is often used in aerospace precision components. This paper reports grinding and polishing experiments on M300 that relate the magnitude of contact stress to grinding efficiency and predict the optimal combination of grinding parameters for effective grinding. Focusing on the high abrasion resistance of M300, the micro-contact of the elastic ball abrasive tool (whetstone) is analyzed, and a functional relationship is derived mathematically between the residual-peak removal rate and the main parameters affecting grinding accuracy in the planar case, laying a foundation for the prediction and compensation of elastic abrasive machining.
Keywords: flexible abrasive tool, polishing parameters, Hertz theory, removal rate
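The Hertz theory named in the keywords gives the classical starting point for such micro-contact analysis: for a sphere pressed on a flat, the contact radius is a = (3FR / 4E*)^(1/3) and the peak pressure is p0 = 3F / (2πa²). The sketch below evaluates these textbook formulas; the material values are illustrative placeholders, not measured properties of M300 or of the elastic tool.

```python
# Hedged sketch: Hertzian sphere-on-flat contact, the theory invoked for the
# elastic ball abrasive tool. Inputs below are assumed, not from the paper.
import math

def hertz_sphere_on_flat(force_n, radius_m, e1, nu1, e2, nu2):
    """Return (contact_radius_m, max_pressure_pa) for a Hertzian contact."""
    # Effective contact modulus: 1/E* = (1 - nu1^2)/E1 + (1 - nu2^2)/E2
    e_star = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)
    a = (3 * force_n * radius_m / (4 * e_star)) ** (1 / 3)  # contact radius
    p0 = 3 * force_n / (2 * math.pi * a**2)                 # peak pressure
    return a, p0

# Illustrative: soft elastic ball tool against a hard steel workpiece
a, p0 = hertz_sphere_on_flat(force_n=5.0, radius_m=0.01,
                             e1=50e6, nu1=0.48,   # assumed elastic tool
                             e2=210e9, nu2=0.30)  # assumed steel workpiece
print(f"contact radius ~ {a*1e3:.2f} mm, peak pressure ~ {p0/1e6:.2f} MPa")
```

Because the soft tool dominates the effective modulus, the contact patch is large and the pressure low, which is the regime that makes elastic polishing gentle.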
Procedia PDF Downloads 544
10985 Globally Attractive Mild Solutions for Non-Local in Time Subdiffusion Equations of Neutral Type
Authors: Jorge Gonzalez Camus, Carlos Lizama
Abstract:
This work proves the existence of at least one globally attractive mild solution to the Cauchy problem for a fractional evolution equation of neutral type involving the fractional derivative in the Caputo sense. An almost sectorial operator on a Banach space X and a kernel belonging to a large class appear in the equation, covering many relevant cases from physics applications, in particular the important case of time-fractional evolution equations of neutral type. The main tools used in this work were the Hausdorff measure of noncompactness and fixed point theorems, specifically of Darbo type. Initially, the equation is posed as a Cauchy problem involving a fractional derivative in the Caputo sense. The equivalent integral version is then formulated and, by defining a convenient functional using the analytic integral resolvent operator and verifying the hypotheses of the Darbo-type fixed point theorem, the existence of a mild solution to the initial problem is obtained. Furthermore, each mild solution is globally attractive, a property desired in the asymptotic behavior of such solutions.
Keywords: attractive mild solutions, integral Volterra equations, neutral type equations, non-local in time equations
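For reference, the Caputo fractional derivative of order α that appears in such problems is commonly defined (for 0 < α < 1 and a suitably smooth u) as

\[
{}^{C}D_t^{\alpha} u(t) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-s)^{-\alpha}\, u'(s)\, \mathrm{d}s, \qquad 0 < \alpha < 1,
\]

which reduces to the classical first derivative as α → 1 and is the non-local-in-time operator that makes the subdiffusive behavior appear.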
Procedia PDF Downloads 154
10984 Design and Analysis of Deep Excavations
Authors: Barham J. Nareeman, Ilham I. Mohammed
Abstract:
Excavations in developed urban areas are generally supported by deep excavation walls such as diaphragm walls, bored piles, soldier piles, and sheet piles. In some cases, these walls may be braced by internal braces or tie-back anchors. Tie-back anchors are by far the predominant method of wall support; the large working space inside the excavation provided by a tie-back anchor system is a significant construction advantage. This paper analyzes the bracing system of a deep excavation, a contiguous pile wall braced by pre-stressed tie-back anchors, which is part of a large residential building project located in Gaziantep province, Turkey. The contiguous pile wall will be constructed with a length of 270 m and consists of 285 piles, each with a diameter of 80 cm and a center-to-center spacing of 95 cm. The deformation analysis was carried out with the finite element tool PLAXIS. In the analysis, the beam element method together with an elastic perfectly plastic soil model and the Hardening Soil model was used to design the contiguous pile wall, the tie-back anchor system, and the soil. The two soil clusters, a limestone and a fill soil, were modelled with both the Hardening Soil and Mohr-Coulomb models; in accordance with the basic design, both soil clusters are modelled under drained conditions. The simulation results show that the maximum horizontal movement of the walls and the maximum settlement of the ground are consistent with 300 individual case histories, ranging between 1.2 mm and 2.3 mm for the walls and between 6.5 mm and 15 mm for the settlements. It was concluded that a tied-back contiguous pile wall can be satisfactorily modelled using the Hardening Soil model.
Keywords: deep excavation, finite element, pre-stressed tie-back anchors, contiguous pile wall, PLAXIS, horizontal deflection, ground settlement
Procedia PDF Downloads 253
10983 Alloying Effect on Hot Workability of M42 High Speed Steel
Authors: Jung-Ho Moon, Tae Kwon Ha
Abstract:
In the present study, the effect of Si, Al, Ti, Zr, and Nb additions on the microstructure and hot workability of cast M42 tool steels, basically consisting of 1.0C, 0.2Mn, 3.8Cr, 1.5W, 8.5Co, 9.2Mo, and 1.0V in weight percent, has been investigated. Tool steels containing Si at 0.25 and 0.5 wt.%, Al at 0.06 and 0.12 wt.%, Ti at 0.3 wt.%, Zr at 0.3 wt.%, and Nb at 0.3 wt.% were cast into ingots of 140 mm × 140 mm × 330 mm by vacuum induction melting. After solution treatment at 1150 °C for 1.5 hrs followed by furnace cooling, hot rolling at 1180 °C was conducted on the ingots. Addition of titanium, zirconium, and niobium was found to retard the decomposition of the eutectic carbides and to deteriorate the hot workability of the tool steels, while addition of aluminium and silicon gave a relatively well-decomposed carbide structure and resulted in sound hot-rolled plates.
Keywords: high speed steels, alloying elements, eutectic carbides, microstructure, hot workability
Procedia PDF Downloads 349
10982 Development of Prediction Tool for Sound Absorption and Sound Insulation for Sound Proof Properties
Authors: Yoshio Kurosawa, Takao Yamaguchi
Abstract:
High-frequency automotive interior noise above 500 Hz considerably affects automotive passenger comfort. To reduce this noise, sound insulation material is often laminated on body panels or interior trim panels. For more effective noise reduction, the sound reduction properties of this laminated structure need to be estimated. We have developed a new calculation tool, handy for designers, that can roughly calculate the sound absorption and insulation properties of laminate structures. In this report, the outline of this tool and an analysis example applied to a floor mat are introduced.
Keywords: automobile, acoustics, porous material, transfer matrix method
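The transfer matrix method named in the keywords chains a 2×2 matrix per layer relating pressure and velocity at the layer faces. A minimal hedged sketch for a single fluid-equivalent layer at normal incidence follows; the paper's tool handles full porous laminate stacks, and the layer properties below are assumed, not taken from the paper.

```python
# Hedged sketch of the transfer matrix method (TMM): normal-incidence
# transmission loss of one homogeneous layer between two air half-spaces.
import cmath
import math

def layer_matrix(rho, c, d, freq):
    """2x2 transfer matrix of a fluid layer: density rho (kg/m^3),
    sound speed c (m/s), thickness d (m), at frequency freq (Hz)."""
    k = 2 * math.pi * freq / c      # wavenumber in the layer
    z = rho * c                     # characteristic impedance
    kd = k * d
    return [[cmath.cos(kd), 1j * z * cmath.sin(kd)],
            [1j * cmath.sin(kd) / z, cmath.cos(kd)]]

def transmission_loss(matrix, rho0=1.21, c0=343.0):
    """Normal-incidence TL (dB) for the layer between two air half-spaces:
    TL = 20 log10 |(T11 + T12/Z0 + Z0*T21 + T22) / 2|."""
    z0 = rho0 * c0
    (t11, t12), (t21, t22) = matrix
    return 20 * math.log10(abs(t11 + t12 / z0 + z0 * t21 + t22) / 2)

# Example: a dense 3 mm barrier layer (assumed properties) at 1 kHz
m = layer_matrix(rho=1200.0, c=150.0, d=0.003, freq=1000.0)
print(f"TL at 1 kHz ~ {transmission_loss(m):.1f} dB")
```

Multi-layer laminates are handled by multiplying the per-layer matrices in order before applying the same transmission-loss formula.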
Procedia PDF Downloads 507
10981 Autonomic Recovery Plan with Server Virtualization
Authors: S. Hameed, S. Anwer, M. Saad, M. Saady
Abstract:
For autonomic recovery with server virtualization, a cogent plan that includes recovery techniques and backups with virtualized servers can be developed instead of assigning an idle server to backup operations. In addition to reducing hardware cost and data center footprint, the disaster recovery plan can ensure system uptime and meet objectives for high availability, recovery time, recovery point, server provisioning, and quality of service. This autonomic solution would also support disaster management, testing, and development of the recovery site. In this research, a workflow plan is proposed for supporting disaster recovery with virtualization, providing virtual monitoring, requirements engineering, solution decision making, quality testing, and disaster management. This recovery model would make disaster recovery much easier, faster, and less error prone.
Keywords: autonomous intelligence, disaster recovery, cloud computing, server virtualization
Procedia PDF Downloads 159
10980 Proposal for an Inspection Tool for Damaged Structures after Disasters
Authors: Karim Akkouche, Amine Nekmouche, Leyla Bouzid
Abstract:
This study focuses on the development of a multifunctional expert system (ES) called the post-seismic damage inspection tool (PSDIT), a powerful tool that allows the evaluation, processing, and archiving of the data stock collected after earthquakes. PSDIT can be operated by two user types: an ordinary user (engineer, expert, or architect) for the visual inspection of damage, and an administrative user for updating the knowledge and/or adding or removing ordinary users. Knowledge acquisition is driven by a hierarchical knowledge model; information from investigation reports and information acquired through feedback from expert/engineer questionnaires both form part of it.
Keywords: disaster, damaged structures, damage assessment, expert system
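A hierarchical knowledge model of this kind is typically evaluated as ordered rules, with structural findings dominating non-structural ones. The sketch below is purely hypothetical: the grades, finding names, and precedence are invented for illustration and are not taken from the PSDIT knowledge base.

```python
# Hypothetical sketch of a hierarchical rule in a post-earthquake damage
# inspection expert system. Finding names and grades are invented.
def damage_grade(findings):
    """findings: dict of boolean answers from a visual inspection form."""
    if findings.get("partial_collapse"):
        return "severe"               # structural failure dominates all else
    if findings.get("structural_cracks") and findings.get("residual_drift"):
        return "moderate-to-severe"   # combined structural indicators escalate
    if findings.get("structural_cracks"):
        return "moderate"
    if findings.get("nonstructural_damage"):
        return "light"                # non-structural findings rank lowest
    return "none"

print(damage_grade({"structural_cracks": True}))  # → moderate
```

Ordering the rules from most to least severe is what encodes the hierarchy: the first matching rule wins, so a severe structural finding can never be downgraded by a benign non-structural one.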
Procedia PDF Downloads 80