Search results for: optimal shape design
13552 Loading Forces following Addition of 5% Cu in Nickel-Titanium Alloy Used for Orthodontics
Authors: Aphinan Phukaoluan, Surachai Dechkunakorn, Niwat Anuwongnukroh, Anak Khantachawana, Pongpan Kaewtathip, Julathep Kajornchaiyakul, Wassana Wichai
Abstract:
Aims: This study aims to determine the amount of force delivered by a NiTiCu orthodontic wire with a ternary composition ratio of 46.0 Ni: 49.0 Ti: 5.0 Cu and to compare the results with a commercial NiTiCu 35 °C orthodontic archwire. Materials and Methods: Nickel (purity 99.9%), titanium (purity 99.9%), and copper (purity 99.9%) were used in this study in the atomic weight ratio 46.0 Ni: 49.0 Ti: 5.0 Cu. The elements were melted into an alloy in an electrolytic arc furnace under an argon atmosphere and homogenized at 800 °C for 1 hr. The alloy was subsequently sliced into thin plates (1.5 mm) with an EDM wire-cutting machine to obtain the specimens, which were cold-rolled to a 30% reduction and then heat-treated in a furnace at 400 °C for 1 hour. The three newly fabricated NiTiCu specimens were then cut into nearly identical wire sizes of 0.016 inch × 0.022 inch. A commercial preformed Ormco NiTiCu 35 °C archwire of size 0.016 inch × 0.022 inch was used for comparison. A three-point bending test was performed on a Universal Testing Machine to investigate the force along the load-deflection curve at oral temperature (36 ± 1 °C) with deflection points at 0.25, 0.5, 0.75, 1.0, 1.25, and 1.5 mm. Descriptive statistics were used to evaluate each variable, and an independent t-test was used to analyze the differences between the groups. Results: Both NiTiCu wires presented typical superelastic properties, as observed from the load-deflection curve. For the 46.0 Ni: 49.0 Ti: 5.0 Cu wire, the average force was 341.70 g on loading and 264.18 g on unloading. The corresponding values for the Ormco NiTiCu 35 °C wire were 299.88 g on loading and 201.96 g on unloading. There were significant differences (p < 0.05) in mean loading and unloading forces between the two NiTiCu wires. The deflection forces of the Ormco NiTiCu wire in both loading and unloading were lower than those of the 46.0 Ni: 49.0 Ti: 5.0 Cu wire at every point except the 0.25 mm deflection point.
Regarding the force difference between successive deflection points, the Ormco NiTiCu 35 °C wire exerted less force than the 46.0 Ni: 49.0 Ti: 5.0 Cu wire in both loading and unloading, except over the 1.5–1.25 mm unloading interval. All values, however, were still within the acceptable limits for orthodontic use. Conclusion: Both the fabricated ternary alloy of 46.0 Ni: 49.0 Ti: 5.0 Cu (atomic weight), with 30% reduction and heat treatment at 400 °C for 1 hr, and the Ormco NiTiCu 35 °C wire presented shape-memory characteristics in wire form. The unloading forces of both NiTiCu wires were within the range suitable for orthodontic use. This should be a good foundation for further studies towards the development of new orthodontic NiTiCu archwires.
Keywords: loading force, ternary alloy, NiTiCu, shape memory, orthodontic wire
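To illustrate the statistical comparison described above, the following Python sketch runs an independent t-test on two small groups of loading forces. The replicate values are hypothetical (only the group means reported in the abstract come from the study), so this shows the method, not the study's actual data.

```python
# Hypothetical replicate loading forces (g) for the two wire groups.
# Only the reported group means (~341.7 g and ~299.9 g) come from the
# abstract; the individual values below are invented for illustration.
from statistics import mean
from scipy import stats

fabricated_niticu = [338.2, 345.1, 341.8]  # 46.0 Ni: 49.0 Ti: 5.0 Cu wire
ormco_niticu = [297.4, 301.9, 300.3]       # commercial Ormco NiTiCu 35 °C

t_stat, p_value = stats.ttest_ind(fabricated_niticu, ormco_niticu)

print(f"mean fabricated: {mean(fabricated_niticu):.2f} g")
print(f"mean Ormco:      {mean(ormco_niticu):.2f} g")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# With these invented values the difference is significant at p < 0.05,
# mirroring the abstract's reported finding.
```
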
Procedia PDF Downloads 286
13551 Designing Nanowire Based Honeycomb Photonic Crystal Surface Emitting Lasers
Authors: Balthazar Temu, Zhao Yan, Bogdan-Petrin Ratiu, Sang Soon Oh, Qiang Li
Abstract:
Photonic Crystal Surface Emitting Lasers (PCSELs) are structures made up of a periodically repeating pattern whose unit cell consists of a change in refractive index. The variation in refractive index can be achieved by etching air holes in a semiconductor material, giving hole-based PCSELs, or by growing nanowires, giving nanowire-based PCSELs. As opposed to hole-based PCSELs, nanowire-based PCSELs can be integrated on a silicon platform without threading dislocations, thanks to the small contact area between each nanowire and the silicon substrate, which relaxes the strain. Nanowire-based PCSELs reported in the literature have been designed using triangular, square, or honeycomb patterns. The triangular and square patterns offer limited degrees of freedom in tuning the design parameters, which hinders the ability to design high quality factor (Q-factor) and/or variable-wavelength devices. Nanowire-based PCSELs designed using triangular and square patterns have been reported with lasing thresholds of 130 kW/cm² and 7 kW/cm², respectively. On the other hand, the honeycomb pattern gives more degrees of freedom in tuning the design parameters, which allows one to design high-Q-factor devices. A deformed honeycomb pattern device was reported with a lasing threshold of 6.25 W/cm², corresponding to a simulated Q-factor of 5.84×10⁵. Despite this achievement, the design principles that could lead to the realization of even higher-Q-factor honeycomb pattern PCSELs have not yet been investigated. In this work, we study how the resonance wavelength and the Q-factor of three different resonance modes of the device vary as their design parameters are tuned. Through this study, we establish the design and simulation of devices operating in the 970 nm wavelength band, the O band, and the C band, with quality factors up to 7×10⁷.
We also investigate the quality factors of the undeformed device and establish that the band edge close to 970 nm can attain a high quality factor when the device is undeformed, and that the quality factor degrades as the device is deformed.
Keywords: honeycomb PCSEL, nanowire laser, photonic crystal laser, simulation of photonic crystal surface emitting laser
Procedia PDF Downloads 14
13550 Fashion Appropriation: A Study in Awareness of Crossing Cultural Boundaries in Design
Authors: Anahita Suri
Abstract:
Myriad cultures form the warp and weft of the fabric of this world. The last century saw mass migration of people across geographical boundaries, owing to industrialization and globalization. These people took with them their cultures, costumes, traditions, and folklore, which mingled with local cultures to create something new, placed in a different context to make it contemporary. With the surge in population and the growth of the fashion industry, there has been an increasing demand for innovative and individual fashion, from street markets to luxury brands. Having exhausted local influences, designers take inspiration from so-called 'low' culture, create artistic products, and place them in a different context, and the end product is categorized as 'high' culture. It is debatable why a design or a culture is 'high' or 'low'. Who decides which works, practices, and activities are 'high' and which are 'low'? The justification for this distinction is often found not in the design itself but in the context attached to it. Moreover, the concept of high and low is relative to time: what is 'high' today can be 'low' tomorrow and 'high' again the day after. This raises certain concerns. Firstly, it is regrettable that a culture which offers inspiration is looked down upon as 'low'. Secondly, it is ironic, because the so-designated 'high' culture is a manipulation of the truth drawn from the authentic 'low' culture, which is capable of true expression. When you borrow from a different culture, you pretend to be authentic precisely because you are not. Finally, it is important to be aware of crossing cultural boundaries and of the context attached to a design or product, so as to use it in a responsible way that communicates the design without offending anyone. Is it acceptable for one person's cultural identity to become another person's fashion accessory?
This essay explores the complex, multi-layered subject of fashion appropriation. It aims to provoke debate over cultural 'borrowing' and to create awareness that the commodification of cultural symbols and iconography in fashion is inappropriate and offensive, and is not the same as 'celebrating cultural differences'.
Keywords: context, culture, fashion appropriation, inoffensive, responsible
Procedia PDF Downloads 127
13549 Investigating the Effect of Handicrafts Recreation on the Interior Design of Traditional Arts Gallery
Authors: Amir Masoud Dabagh, Mahsa Khaleghi
Abstract:
Over the last two centuries, the world has entered a new cultural, social, and economic phase. Apart from its positive benefits and achievements, this has also incurred many costs, chief among them the destruction, or at least the diminished role, of the customs, traditions, and authentic cultures of past communities. Understanding what endures in the traditional arts is vital and worthy of study, because grasping it, and embracing the forms of artistic creation that last, removes the age-old wear from its face and makes it immortal and persistent in every age. This paper attempts to present traditional art concepts and solutions for interior design, taking the recreation of handicrafts as a symbol and manifestation of national identity and as evidence of ancient civilizations, which is at the center of tourists' attention today. The research method is descriptive-analytical: it first explores the theoretical foundations of the research, namely the concepts of recreation and the traditional arts, and then analyzes the process of recreation, which entails the recollection of past experiences as well as dynamism and creativity.
Keywords: recreation, handicrafts, interior design, concept, traditional arts
Procedia PDF Downloads 113
13548 Increase of Sensitivity in 3D Suspended Polymeric Microfluidic Platform through Lateral Misalignment
Authors: Ehsan Yazdanpanah Moghadam, Muthukumaran Packirisamy
Abstract:
In the present study, a design for a suspended polymeric microfluidic platform fabricated from three polymeric layers is introduced. Orienting the microchannel plane perpendicular to the microcantilever plane drastically decreases the moment of inertia in that direction. In addition, the platform is made of polymer, whose stiffness is around five orders of magnitude lower than that of silicon. Both factors cause a significant increase in the sensitivity of the cantilever deflection. Furthermore, although the dimensions of the platform are held constant, laterally misaligning the embedded microchannels in the suspended platform can greatly increase the sensitivity. The investigation covers four fluids, water, seawater, milk, and blood, over flow rates from 5 to 70 µl/min, in order to obtain the design with the highest sensitivity. The best design in this study shows a sensitivity increase of around 50% for water, seawater, milk, and blood at a flow rate of 70 µl/min, achieved simply by misaligning the embedded microchannels in the suspended polymeric microfluidic platform.
Keywords: microfluidic, MEMS, biosensor, microresonator
Procedia PDF Downloads 225
13547 Optimal Control of Generators and Series Compensators within Multi-Space-Time Frame
Authors: Qian Chen, Lin Xu, Ping Ju, Zhuoran Li, Yiping Yu, Yuqing Jin
Abstract:
The operation of the power grid is becoming more complex and difficult due to its rapid development towards high voltage, long distances, and large capacity. For instance, many large-scale wind farms have been connected to the grid, and their fluctuation and randomness are very likely to affect its stability and safety. Fortunately, much new power-electronics-based equipment has been applied to the grid, such as the UPFC (Unified Power Flow Controller), TCSC (Thyristor Controlled Series Compensation), and STATCOM (Static Synchronous Compensator), which can help deal with these problems. Compared with traditional equipment such as generators, the new controllable devices, represented by FACTS (Flexible AC Transmission System) technology, offer more accurate control and respond faster, but they are too expensive for wide deployment. Therefore, based on a comparison and analysis of the control characteristics of traditional equipment and the new controllable equipment on both time and space scales, a coordinated optimizing control method within a multi-space-time frame is proposed in this paper to exploit both kinds of advantages, improving both control capability and economic efficiency. Firstly, coordination across grid areas of different sizes is studied, focusing on the fluctuation caused by large-scale wind farms connected to the grid. With generators, FSC (Fixed Series Compensation), and TCSC, the coordination between a two-layer regional power grid and its subgrid is studied in detail. The coordination control model is built, the corresponding scheme is proposed, and the conclusion is verified by simulation. The analysis shows that the interface power flow can be controlled by the generators, while the power flow on specific lines between the two-layer regions can be adjusted by the FSC and TCSC.
The smaller the interface power flow adjusted by the generators, the larger the control margin of the TCSC; conversely, the total consumption of the generators is much higher. Secondly, coordination across different time scales is studied to further balance the total consumption of the generators against the control margin of the TCSC, so that the minimum control cost can be obtained. The coordination between two-layer ultra-short-term correction and AGC (Automatic Generation Control) is studied with generators, FSC, and TCSC. The optimal control model is formulated, a genetic algorithm is selected to solve the problem, and the conclusion is verified by simulation. Finally, the aforementioned multi-time-space method is analyzed with practical cases and simulated on the PSASP (Power System Analysis Software Package) platform. Its correctness and effectiveness are verified by the simulation results. Moreover, this coordinated optimizing control method can contribute to a decrease in control cost and will provide a reference for subsequent studies in this field.
Keywords: FACTS, multi-space-time frame, optimal control, TCSC
Procedia PDF Downloads 267
13546 Seismic Assessment of Non-Structural Component Using Floor Design Spectrum
Authors: Amin Asgarian, Ghyslaine McClure
Abstract:
Experience in past earthquakes has clearly demonstrated the necessity of seismic design and assessment of Non-Structural Components (NSCs), particularly in post-disaster structures such as hospitals and power plants, which have to remain functional and operational. Meeting this objective is contingent upon proper seismic performance of both structural and non-structural components. Proper seismic design, analysis, and assessment of NSCs can be attained through the generation of a Floor Design Spectrum (FDS), in a fashion similar to the target spectrum for structural components. This paper presents a methodology developed to generate the FDS directly from the corresponding Uniform Hazard Spectrum (UHS), i.e. the design spectrum for structural components. The methodology is based on experimental and numerical analysis of a database of 27 real Reinforced Concrete (RC) buildings located in Montreal, Canada. The buildings were tested by Ambient Vibration Measurements (AVM), and their dynamic properties were extracted and used as part of the approach. The database comprises 12 low-rise, 10 medium-rise, and 5 high-rise buildings, mostly designated as post-disaster/emergency shelters by the city of Montreal. The buildings are subjected to 20 seismic records compatible with the UHS of Montreal, and Floor Response Spectra (FRS) are developed for every floor in two horizontal directions considering four different damping ratios of NSCs (2, 5, 10, and 20% viscous damping). The generated FRS (approximately 132,000 curves) are studied statistically, and a methodology is proposed to generate the FDS directly from the corresponding UHS. The approach can generate the FDS for any floor level and NSC damping ratio. It captures the effects of dynamic interaction between the primary (structural) and secondary (NSC) systems, and of the higher and torsional modes of the primary structure.
These are important improvements of this approach over conventional methods and code recommendations. Applications of the proposed approach are presented through two real case-study buildings: one low-rise and one medium-rise. The proposed approach can serve as a practical and robust tool for the seismic assessment and design of NSCs, especially in existing post-disaster structures.
Keywords: earthquake engineering, operational and functional components, operational modal analysis, seismic assessment and design
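The core of generating one FRS curve can be sketched as follows: each spectrum point is the peak response of a damped single-degree-of-freedom oscillator (standing in for the NSC) driven by a floor acceleration history. The sketch below uses a synthetic 2 Hz sine excitation and a simple central-difference integrator purely to illustrate the procedure; it is not the authors' methodology, and the excitation is not a real floor motion from the study.

```python
# Minimal floor-response-spectrum sketch: for each oscillator period,
# integrate a damped SDOF equation driven by a floor acceleration
# history and record the peak pseudo-acceleration.
import math

def sdof_peak_accel(periods, zeta, accel, dt):
    """Peak pseudo-acceleration per period via central differences."""
    peaks = []
    for T in periods:
        wn = 2.0 * math.pi / T          # natural circular frequency
        u_prev, u, peak = 0.0, 0.0, 0.0
        for ag in accel:
            # u'' + 2*zeta*wn*u' + wn^2*u = -ag
            v = (u - u_prev) / dt                    # velocity estimate
            acc = -ag - 2.0 * zeta * wn * v - wn * wn * u
            u_prev, u = u, 2.0 * u - u_prev + acc * dt * dt
            peak = max(peak, abs(wn * wn * u))       # pseudo-acceleration
        peaks.append(peak)
    return peaks

dt = 0.005
# Synthetic 2 Hz floor acceleration, 10 s long (illustrative only).
floor_acc = [0.3 * math.sin(2 * math.pi * 2.0 * n * dt) for n in range(2000)]
spectrum = sdof_peak_accel([0.1, 0.5, 1.0], zeta=0.05, accel=floor_acc, dt=dt)
print([round(s, 3) for s in spectrum])
```

As expected for a resonant excitation, the oscillator whose period matches the 2 Hz input (T = 0.5 s) dominates the spectrum.
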
Procedia PDF Downloads 213
13545 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality
Authors: David Sinfield, Thomas Cochrane, Marcos Steagall
Abstract:
While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are ease of user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary technologies that failed to gain mainstream adoption or were quickly superseded; examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. We argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused on teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design education, in areas such as graphic design, illustration, photography, and design process, is heavily based on the traditional classroom environment, in which student interaction takes place both at peer level and through teacher feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and we consequently see a decline in the students' creative work. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper discusses the outcomes of a student project that considered the virtual classroom and the techniques involved. The atelier learning environment is especially suited to the visual communication model, as it deals with the creative processing of ideas that needs to be shared collaboratively.
This has proven to be a successful model over the years in the traditional form of design education, but it has more recently seen a shift in thinking as we move towards a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality, when integrated into the learning experience, can improve students' motivation and engagement. The paper outlines some of the processes used and the findings from the semester-long project.
Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications
Procedia PDF Downloads 240
13544 Enhancing the Performance of Automatic Logistic Centers by Optimizing the Assignment of Material Flows to Workstations and Flow Racks
Authors: Sharon Hovav, Ilya Levner, Oren Nahum, Istvan Szabo
Abstract:
In modern large-scale logistic centers (e.g., big automated warehouses), complex logistic operations performed by human staff (pickers) need to be coordinated with the operations of automated facilities (robots, conveyors, cranes, lifts, flow racks, etc.). The efficiency of advanced logistic centers strongly depends on optimizing picking technologies in sync with the facility and product layout, as well as on the optimal distribution of material flows (products) in the system. The challenge is to develop a mathematical operations research (OR) tool that optimizes system cost-effectiveness. In this work, we propose a model that describes an automatic logistic center consisting of a set of workstations located in several galleries (floors), each station containing a known number of flow racks. The requirements of each product and the working capacity of the stations served by a given set of pickers are assumed to be predetermined. The goal of the model is to maximize system efficiency. The proposed model has two echelons. The first determines the (optimal) number of workstations needed to form the total processing/logistic system, subject to picker capacities. The second deals with the assignment of products to workstations and flow racks, aimed at achieving maximal throughput of picked products over the entire system given picker capacities and budget constraints. The solutions to the problems at the two echelons interact to balance the overall load on the flow racks and maximize overall efficiency. We have developed an operations research model for each echelon. In the first echelon, the problem of calculating the optimal number of workstations is formulated as a non-standard bin-packing problem with capacity constraints on each bin.
The problem arising in the second echelon is posed as a constrained product-workstation-flow rack assignment problem with a non-standard minimax criterion, in which the inner maximum of the workload is taken across all workstations in the center and the outer minimum is taken across all feasible product-workstation-flow rack assignments. The OR problems arising in each echelon are proved to be NP-hard. Consequently, we develop heuristic and approximation algorithms based on exploiting and improving local optima. The logistic center model considered in this work is highly dynamic and is recalculated periodically based on updated demand forecasts that reflect market trends, technological changes, seasonality, and the introduction of new items. The suggested two-echelon approach and the min-max balancing scheme are shown to work effectively on illustrative examples and real-life logistic data.
Keywords: logistics center, product-workstation, assignment, maximum performance, load balancing, fast algorithm
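To make the first-echelon idea concrete, here is a minimal first-fit-decreasing (FFD) sketch: product workloads are packed into as few capacity-limited workstations ("bins") as the heuristic finds. FFD is a classic bin-packing heuristic, not the authors' algorithm, and the workloads and capacity below are invented for illustration.

```python
# First-fit-decreasing sketch of the first-echelon bin-packing idea:
# assign workloads to the fewest workstations, each with a fixed
# picker capacity. Illustrative only; not the paper's algorithm.
def ffd_workstations(workloads, capacity):
    """Return a list of workstations, each a list of assigned workloads."""
    stations = []  # each entry: [remaining_capacity, assigned_workloads]
    for w in sorted(workloads, reverse=True):
        for s in stations:
            if s[0] >= w:          # fits in the first open station
                s[0] -= w
                s[1].append(w)
                break
        else:                       # no station fits: open a new one
            stations.append([capacity - w, [w]])
    return [s[1] for s in stations]

demo = ffd_workstations([4, 8, 1, 4, 2, 1], capacity=10)
print(len(demo), "workstations:", demo)
# → 2 workstations: [[8, 2], [4, 4, 1, 1]]
```
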
Procedia PDF Downloads 228
13543 Contribution to Experiments of a Free Surface Supercritical Flow over an Uneven Bottom
Authors: M. Bougamouza, M. Bouhadef, T. Zitoun
Abstract:
The aim of this study is to examine, through laboratory experimentation, supercritical flow in the presence of an obstacle in a rectangular channel. The supercritical regime throughout the hydraulic channel is achieved by adding a convergent section. We observe the influence of the obstacle's shape and dimensions on the characteristics of the supercritical flow, mainly the free-surface elevation and the velocity profile. The velocity measurements were conducted with a one-dimensional laser anemometry technique.
Keywords: experiments, free-surface flow, hydraulic channel, uneven bottom, laser anemometry, supercritical regime
Procedia PDF Downloads 251
13542 Ramp Rate and Constriction Factor Based Dual Objective Economic Load Dispatch Using Particle Swarm Optimization
Authors: Himanshu Shekhar Maharana, S. K .Dash
Abstract:
Economic Load Dispatch (ELD) is a vital optimization process in electric power systems: it allocates generation among the various units so as to compute the cost of generation and the cost of emission of gases such as sulphur dioxide, nitrous oxide, and carbon monoxide. In this work, we employ ramp rate and constriction factor based particle swarm optimization (RRCPSO) to analyze several performance objectives, namely the cost of generation, the cost of emission, and a dual objective combining both, through simulated experimental results. A 6-unit, 30-bus IEEE test system is used to simulate the results, incorporating improved weight factors and advanced ramp rate limit constraints for optimizing the total cost of generation and emission. The method increases the tendency of particles to explore the solution space and thereby improves their convergence rates. Earlier works using dispersed PSO (DPSO) and constriction factor based PSO (CPSO) yield comparatively longer computation times and poorer optimal solutions than the present approach. This paper applies the ramp rate and constriction factor based PSO to compute the cost, emission, and combined objectives, and compares the results with the DPSO and weight improved PSO (WIPSO) techniques, demonstrating shorter computation times and better optimal solutions.
Keywords: economic load dispatch (ELD), constriction factor based particle swarm optimization (CPSO), dispersed particle swarm optimization (DPSO), weight improved particle swarm optimization (WIPSO), ramp rate and constriction factor based particle swarm optimization (RRCPSO)
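As a generic illustration of the constriction-factor mechanism (not the dissertation's RRCPSO), the following sketch runs Clerc-style constriction PSO on a toy three-unit dispatch: quadratic fuel costs, generation limits enforced by clamping, and a penalty for violating the power balance. All cost coefficients, limits, and the demand value are invented for this example.

```python
# Toy constriction-factor PSO for a 3-unit economic dispatch.
# Coefficients, unit limits, and demand are illustrative only; the
# clamping stands in for hard limits, not the actual RRCPSO model.
import math
import random

random.seed(0)

a = [0.008, 0.009, 0.007]      # quadratic fuel-cost coefficients
b = [7.0, 6.3, 6.8]            # linear coefficients
c = [200.0, 180.0, 140.0]      # constant coefficients
p_min = [10.0, 10.0, 10.0]     # unit lower limits (MW)
p_max = [125.0, 150.0, 100.0]  # unit upper limits (MW)
demand = 150.0                 # total demand (MW)

def cost(p):
    fuel = sum(a[i] * p[i] ** 2 + b[i] * p[i] + c[i] for i in range(3))
    return fuel + 1000.0 * abs(sum(p) - demand)  # power-balance penalty

c1 = c2 = 2.05
phi = c1 + c2
chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))  # Clerc, ~0.7298

swarm = [[random.uniform(p_min[i], p_max[i]) for i in range(3)] for _ in range(30)]
vel = [[0.0, 0.0, 0.0] for _ in range(30)]
pbest = [p[:] for p in swarm]
gbest = min(pbest, key=cost)

for _ in range(300):
    for k in range(30):
        for i in range(3):
            r1, r2 = random.random(), random.random()
            vel[k][i] = chi * (vel[k][i]
                               + c1 * r1 * (pbest[k][i] - swarm[k][i])
                               + c2 * r2 * (gbest[i] - swarm[k][i]))
            # clamp to unit limits (hard-limit handling)
            swarm[k][i] = min(p_max[i], max(p_min[i], swarm[k][i] + vel[k][i]))
        if cost(swarm[k]) < cost(pbest[k]):
            pbest[k] = swarm[k][:]
    gbest = min(pbest, key=cost)

print("dispatch (MW):", [round(p, 1) for p in gbest],
      "cost:", round(cost(gbest), 1))
```

The constriction coefficient replaces the inertia weight used in basic PSO and guarantees bounded velocities for phi > 4, which is why c1 = c2 = 2.05 is the conventional choice.
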
Procedia PDF Downloads 382
13541 Computer-Aided Depression Screening: A Literature Review on Optimal Methodologies for Mental Health Screening
Authors: Michelle Nighswander
Abstract:
Suicide can be a tragic response to mental illness, and it is difficult for people to disclose or discuss suicidal impulses. The stigma surrounding mental health can create a reluctance to seek help for mental illness. Patients may feel pressure to exhibit a socially desirable demeanor rather than reveal these issues, especially if they sense that their healthcare provider is pressed for time or if they do not have an extensive history with that provider. Overcoming these barriers can be challenging. Although there are several validated depression and suicide risk instruments, the varying processes used to administer these tools may affect the truthfulness of the responses. A literature review was conducted to find evidence of the impact of the screening environment on the accuracy of depression screening. Many investigations do not describe the environment, and fewer studies use a comparison design. However, three studies demonstrated that computerized self-reporting may be more likely to elicit truthful and accurate responses than a face-to-face interview, owing to the increased privacy while responding. These studies also showed that patients reported positive reactions to computerized screening for other stigmatizing health conditions, such as alcohol use during pregnancy. Computerized self-screening for depression offers the possibility of more privacy and patient reflection, which could then send a targeted message of risk to the healthcare provider. This could potentially increase accuracy while also increasing time efficiency for the clinic. Given the persistent effects of mental health stigma, how these screening questions are posed can influence patients' responses. This literature review analyzes trends in depression screening methodologies, the impact of the setting on the results, and how this may help overcome one barrier caused by stigma.
Keywords: computerized self-report, depression, mental health stigma, suicide risk
Procedia PDF Downloads 131
13540 Potentials of Additive Manufacturing: An Approach to Increase the Flexibility of Production Systems
Authors: A. Luft, S. Bremen, N. Balc
Abstract:
The task of flexibility planning and design, like factory planning, is to create the long-term systemic framework that constitutes the restriction on short-term operational management. This is a strategic challenge since, owing to the ill-structured (decision-defect) character of the underlying flexibility problem, multiple types of flexibility need to be considered over the course of various scenarios, production programs, and production system configurations. In this context, an evaluation model has been developed that integrates both conventional and additive resources at a basic task level and allows the quantification of flexibility enhancement in terms of mix and volume flexibility, complexity reduction, and machine capacity. The model helps companies decide, early in the decision-making process, on the potential gains of implementing additive manufacturing technologies at a strategic level. For companies, it is essential to consider both additive and conventional manufacturing beyond pure unit costs. What is needed is an integrative view of manufacturing that incorporates both additive and conventional resources and quantifies their potential with regard to flexibility and manufacturing complexity. This also requires a structured process for strategic production system design that spans the design of various scenarios and allows for multi-dimensional and comparative analysis. A corresponding guideline for the planning of additive resources at a strategic level is laid out in this paper.
Keywords: additive manufacturing, production system design, flexibility enhancement, strategic guideline
Procedia PDF Downloads 124
13539 Digital Homeostasis: Tangible Computing as a Multi-Sensory Installation
Authors: Andrea Macruz
Abstract:
This paper explores computation as a process for design by examining how computers can become more than an operative strategy in a designer's toolkit. It documents this by building upon concepts from neuroscience, in particular Antonio Damasio's theory of homeostasis: the control of bodily states through feedback, intended to keep conditions favorable for life. To do so, it follows a methodology of algorithmic drawing and discusses the outcomes of three multi-sensory design installations that culminated from a course in an academic setting. It explains both the studio process that produced the installations and the computational process that was developed, relating them to the fields of algorithmic design and tangible computing. It discusses how designers can use computational range to achieve homeostasis in response to sensory data in a multi-sensory installation. The outcomes show clearly how people and computers interact through different sensory modalities and affordances, and they propose using computers as meta-physical stabilizers rather than tools.
Keywords: algorithmic drawing, Antonio Damasio, emotion, homeostasis, multi-sensory installation, neuroscience
Procedia PDF Downloads 109
13538 Design and Analysis of a Clustered Nozzle Configuration and Comparison of Its Thrust
Authors: Abdul Hadi Butt, Asfandyar Arshad
Abstract:
The purpose of this paper is to study the variation of thrust across different configurations of clustered nozzles. It involves the design and analysis of clustered nozzle configurations using ANSYS Fluent. Clustered nozzles with different configurations are simulated and compared on the basis of effective exhaust thrust, and the mixing length for the flow interaction is also calculated. The clustered configurations are further analyzed at different altitudes, and an optimum-thrust configuration is proposed at the end of the comparison.
Keywords: CD nozzle, cluster, thrust, Fluent, ANSYS
Procedia PDF Downloads 402
13537 A Holistic Approach for Technical Product Optimization
Authors: Harald Lang, Michael Bader, A. Buchroithner
Abstract:
Holistic methods that cover the development process as a whole, such as systems engineering, have established themselves in product design. However, technical product optimization, in the sense of improving efficiency and/or minimizing losses, is usually applied to single components of a system. Here, a holistic approach is defined based on the hierarchical point of view of systems engineering and is subsequently demonstrated using the example of an electromechanical flywheel energy storage system for automotive applications.
Keywords: design, product development, product optimization, systems engineering
Procedia PDF Downloads 626
13536 The Role of Architectural Firms in Enhancing Building Energy Efficiency in Emerging Countries: Processes and Tools Evaluation of Architectural Firms in Egypt
Authors: Mahmoud F. Mohamadin, Ahmed Abdel Malek, Wessam Said
Abstract:
Achieving energy-efficient architecture in general, and in emerging countries in particular, is a challenging process that requires the contribution of various governmental, institutional, and individual entities. The role of architectural design is essential in this process, as it constitutes one of the earliest steps on the road to sustainability. Architectural firms have a moral and professional responsibility to respond to these challenges and deliver buildings that consume less energy. This study evaluates the design processes and tools used in practice by Egyptian architectural firms, based on a limited survey, to investigate whether their processes and methods can lead to projects that meet the Egyptian Code of Energy Efficiency Improvement. A case study of twenty architectural firms in Cairo was selected and categorized by scale: large, medium, and small. A questionnaire was designed and distributed to the firms, and personal meetings with the firms’ representatives took place. The questionnaire addressed three main points: the design processes adopted, the use of performance-based simulation tools, and the use of BIM tools for energy efficiency purposes. The results revealed that only a small percentage of the large-scale firms have clear strategies for building energy efficiency in their designs, and even then application is limited to certain project types or follows a client request. The percentage is much lower among medium-scale firms and almost zero among small-scale ones. This demonstrates the urgent need to raise the awareness of the Egyptian architectural design community of the importance of implementing these methods from the early stages of building design.
Finally, the study proposes recommendations to help such firms create a healthy built environment and improve the quality of life in emerging countries.
Keywords: architectural firms, emerging countries, energy efficiency, performance-based simulation tools
Procedia PDF Downloads 284
13535 Identification of Vehicle Dynamic Parameters by Using Optimized Exciting Trajectory on 3-DOF Parallel Manipulator
Authors: Di Yao, Gunther Prokop, Kay Buttner
Abstract:
Dynamic parameters, including the center of gravity, mass, and moments of inertia of a vehicle, play an essential role in vehicle simulation, collision testing, and real-time control of vehicle active systems. To identify these important vehicle dynamic parameters, a systematic parameter identification procedure is studied in this work. In the first step of the procedure, a conceptual parallel manipulator (virtual test rig) possessing three rotational degrees of freedom is proposed. To establish the kinematic characteristics of this conceptual parallel manipulator, a kinematic analysis consisting of inverse kinematics and singularity analysis is carried out. Based on Euler's rotation equations for rigid-body dynamics, the dynamic model of the parallel manipulator and the derivation of the measurement matrix for parameter identification are presented subsequently. In order to reduce the sensitivity of the parameter identification to measurement noise and other unexpected disturbances, an optimization process that searches for an optimal exciting trajectory of the parallel manipulator is then conducted. For this purpose, 321-Euler angles defined by a parameterized finite Fourier series are used to describe the general exciting trajectory of the parallel manipulator. To minimize the condition number of the measurement matrix and thereby achieve better parameter identification accuracy, the unknown coefficients of the parameterized finite Fourier series are estimated by an iterative algorithm implemented in MATLAB®. Meanwhile, the iterative algorithm ensures that the parallel manipulator remains in an achievable working state during the execution of the optimal exciting trajectory.
It is shown that the proposed procedure and methods can effectively identify the vehicle dynamic parameters and could constitute an important application of parallel manipulators in the fields of parameter identification and test rig development.
Keywords: parameter identification, parallel manipulator, singularity architecture, dynamic modelling, exciting trajectory
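The trajectory-optimization step can be illustrated in miniature. The sketch below is illustrative only, not the authors' MATLAB code: it parameterizes a toy 1-DOF trajectory by a finite Fourier series and searches for coefficients that minimize the condition number of the resulting measurement (regressor) matrix. The paper's procedure does the analogue for 321-Euler angles of a 3-DOF manipulator, under workspace constraints, with an iterative algorithm rather than the naive random search used here.

```python
import numpy as np

# Illustrative sketch: pick finite-Fourier-series coefficients for an
# exciting trajectory so the measurement matrix is well conditioned.
# A toy 1-DOF model y = a*q + b*q_dot stands in for the full dynamics.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)       # time samples [s]
omega = 2.0 * np.pi * 0.5             # base frequency of the series [rad/s]

def trajectory(coeffs):
    """q(t) and q_dot(t) from a finite Fourier series with rows (a_k, b_k)."""
    q = np.zeros_like(t)
    qd = np.zeros_like(t)
    for k, (a_k, b_k) in enumerate(coeffs, start=1):
        q += a_k * np.sin(k * omega * t) + b_k * np.cos(k * omega * t)
        qd += k * omega * (a_k * np.cos(k * omega * t) - b_k * np.sin(k * omega * t))
    return q, qd

def condition_number(coeffs):
    q, qd = trajectory(coeffs)
    W = np.column_stack([q, qd])      # measurement / regressor matrix
    return np.linalg.cond(W)

# Naive random search standing in for the paper's iterative algorithm.
best, best_cond = None, np.inf
for _ in range(500):
    cand = rng.uniform(-1.0, 1.0, size=(3, 2))   # 3 harmonics, (a_k, b_k)
    c = condition_number(cand)
    if c < best_cond:
        best, best_cond = cand, c

print(f"best condition number: {best_cond:.2f}")
```

A lower condition number means the identified parameters are less sensitive to measurement noise, which is exactly the property the optimized exciting trajectory is chosen for.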
Procedia PDF Downloads 267
13534 The Tramway in French Cities: Complication of Public Spaces and Complexity of the Design Process
Authors: Elisa Maître
Abstract:
The redeployment of tram networks in French cities has considerably modified public spaces and the way citizens use them. Beyond the image of trams as contributing to sustainable urban development, the question of user safety in these spaces has received little study. This study analyzes the use of public spaces laid out for trams, from the standpoint of legibility and safety concerns, and examines to what extent the complexity of the design process, with its many interactions between numerous and varied players, plays a role in the genesis of these problems. The work is mainly based on linking the uses of these redesigned public spaces (through observations, user interviews, and accident studies) with the design conditions and processes of the projects studied (mainly through interviews with the actors of those projects). The practical analyses were conducted from three points of view: that of the planner, that of the user (based on observations and interviews), and that of the road safety expert. The cities of Montpellier, Marseille, and Nice are the three fields of study on which the demonstration of this thesis is based. On the one hand, the results show that inserting a tram complicates the public spaces of French cities: the restructuring of public spaces for the tram creates difficulties of use and safety concerns. On the other hand, in-depth analysis of the fully transcribed interviews led us to develop specific scenarios of dysfunction in the design process. These elements call into question the way the legibility and safety of these new forms of public space are taken into account.
An in-depth analysis of the design processes of public spaces with tram systems would then also be a way to better understand the choices made, the compromises accepted, and the conflicts and constraints at work that weigh on the layout of these spaces. The results concerning the impact that spaces laid out for trams have on difficulty of use suggest different possibilities for improving the way safety for all users is taken into account when designing public spaces.
Keywords: public spaces, road layout, users, design process of urban projects
Procedia PDF Downloads 230
13533 A Hybrid-Evolutionary Optimizer for Modeling the Process of Obtaining Bricks
Authors: Marius Gavrilescu, Sabina-Adriana Floria, Florin Leon, Silvia Curteanu, Costel Anton
Abstract:
Natural sciences provide a wide range of experimental data whose related problems require study and modeling beyond the capabilities of conventional methodologies. Such problems have solution spaces whose complexity and high dimensionality require correspondingly complex regression methods for proper characterization. In this context, we propose an optimization method consisting of a hybrid dual-optimizer setup: a global optimizer based on a modified variant of the popular Imperialist Competitive Algorithm (ICA) and a local optimizer based on gradient descent. The ICA is modified so that intermediate solution populations are more quickly and efficiently pruned of low-fitness individuals by appropriately altering the assimilation, revolution, and competition phases; combined with an initialization strategy based on low-discrepancy sampling, this allows a more effective exploration of the solution space. Gradient-based optimization is then used locally to seek the optimal solution in the neighborhoods of the solutions found by the modified ICA. We use this combined approach to find the optimal configuration and weights of a fully connected neural network, resulting in regression models that characterize the process of obtaining bricks from silicon-based materials. Installations in the raw ceramics (brick) industry are characterized by significant energy consumption and large quantities of emissions. The purpose of our approach is therefore to determine by simulation the working conditions, including the manufacturing mix recipe with the addition of different materials, that minimize the emissions of CO and CH4.
Our approach yields regression models that perform significantly better on this problem than those found using the traditional ICA, with better convergence and a substantially lower error.
Keywords: optimization, biologically inspired algorithm, regression models, bricks, emissions
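The hybrid global/local scheme can be sketched on a toy problem. The code below is an illustration of the general idea, not the authors' implementation: a crude ICA-flavored population phase (colonies assimilated toward the best "imperialist" solutions, plus random "revolution" restarts) followed by gradient-descent refinement of the best solution found.

```python
import numpy as np

# Illustrative hybrid optimizer: global population search loosely inspired
# by the Imperialist Competitive Algorithm, then local gradient descent.
# The objective is a toy multimodal function with minimum 0 at the origin.

rng = np.random.default_rng(1)

def f(x):
    return np.sum(x**2 - np.cos(3.0 * x)) + x.shape[-1]

def grad_f(x):
    return 2.0 * x + 3.0 * np.sin(3.0 * x)

# --- global phase: imperialists pull colonies toward themselves ---------
pop = rng.uniform(-4.0, 4.0, size=(40, 2))
for _ in range(100):
    costs = np.array([f(p) for p in pop])
    order = np.argsort(costs)
    imperialists, colonies = pop[order[:5]], pop[order[5:]]
    # assimilation: each colony moves toward a randomly chosen imperialist
    targets = imperialists[rng.integers(0, 5, size=len(colonies))]
    colonies = colonies + rng.uniform(0, 1, (len(colonies), 1)) * (targets - colonies)
    # revolution: occasional random restarts maintain diversity
    kick = rng.random(len(colonies)) < 0.1
    colonies[kick] = rng.uniform(-4.0, 4.0, size=(kick.sum(), 2))
    pop = np.vstack([imperialists, colonies])

best = min(pop, key=f)

# --- local phase: gradient descent refinement ---------------------------
x = best.copy()
for _ in range(500):
    x -= 0.01 * grad_f(x)

print(f(best), f(x))  # the local phase should not make the solution worse
```

In the paper the objective is far more expensive (fitness of a neural-network regression model), which is precisely why reserving the gradient-based search for the neighborhoods of promising global candidates pays off.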
Procedia PDF Downloads 82
13532 Optimizing Fire Tube Boiler Design for Efficient Saturated Steam Production: A Cost-Minimization Approach
Authors: Yoftahe Nigussie Worku
Abstract:
This report presents a detailed design project for a fire tube boiler tailored to the efficient generation of saturated steam. The overarching objective is to produce 2000 kg/h of saturated steam at a design pressure of 12 bar. The design balances cost-effectiveness and parameter refinement, with particular emphasis on the selection of materials for component parts, construction materials, and production methods throughout the analytical phases. The analysis involves iterative calculations using the pertinent formulas to optimize design parameters, including the selection of tube diameters and overall heat transfer coefficients. The boiler uses two passes, a choice driven by tube and shell size considerations. Burning heavy fuel oil no. 6, with a higher heating value of 44000 kJ/kg and a lower heating value of 41300 kJ/kg, results in a fuel consumption of 140.37 kg/hr; the boiler achieves a heat output of 1610 kW at an efficiency of 85.25%. The fluid flow within the boiler follows a cross-flow arrangement chosen for its inherent advantages. Internally, the tube sheet is welded to the shell and secured by gaskets and welds to ensure structural integrity. The shell design adheres to the European Standard code sections for pressure vessels, covering weight, supplementary accessories (lifting lugs, openings, ends, manhole), and detailed assembly drawings. This work represents a significant stride in optimizing fire tube boiler technology, balancing efficiency and safety in the pursuit of enhanced saturated steam production.
Keywords: fire tube, saturated steam, material selection, efficiency
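The fuel-consumption figure in such sizing calculations follows from a standard boiler energy balance, m_fuel = m_steam·(h_steam − h_feed)/(η·LHV). A minimal sketch follows, with assumed enthalpy values (the saturation and feedwater enthalpies below are illustrative, not taken from the paper).

```python
# Minimal sketch of the boiler energy balance used in sizing calculations.
# Enthalpy values below are assumptions for illustration.

def fuel_consumption(m_steam, h_steam, h_feed, lhv, efficiency):
    """Fuel mass flow [kg/h] needed to raise m_steam [kg/h] of steam.

    m_steam    : steam demand [kg/h]
    h_steam    : enthalpy of saturated steam at drum pressure [kJ/kg]
    h_feed     : feedwater enthalpy [kJ/kg]
    lhv        : lower heating value of the fuel [kJ/kg]
    efficiency : boiler thermal efficiency (0..1)
    """
    q_useful = m_steam * (h_steam - h_feed)   # kJ/h absorbed by the water
    return q_useful / (lhv * efficiency)

# Assumed values: saturated steam near 12 bar (~2784 kJ/kg), feedwater
# around 80 degC (~335 kJ/kg), heavy fuel oil LHV 41300 kJ/kg, eta 85.25 %.
m_fuel = fuel_consumption(2000.0, 2784.0, 335.0, 41300.0, 0.8525)
print(f"{m_fuel:.1f} kg/h")
```

Under these assumed enthalpies the balance lands in the same range as the 140.37 kg/hr figure the report quotes, which is a useful sanity check on the stated duty and efficiency.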
Procedia PDF Downloads 84
13531 Optimized Passive Heating for Multifamily Dwellings
Authors: Joseph Bostick
Abstract:
This paper presents a method of decreasing the heating load of the HVAC system in a single-dwelling model of a multifamily building by controlling movable insulation through the optimization of flux, time, surface incident solar radiation, and temperature thresholds. Simulations are carried out by co-simulation between EnergyPlus and MATLAB, with MATLAB serving as the optimization tool to find the optimal control thresholds. Optimizing the control thresholds leads to a significant decrease in total heating energy expenditure.
Keywords: energy plus, MATLAB, simulation, energy efficiency
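The threshold-based control of movable insulation can be sketched as a simple rule. The logic and threshold values below are illustrative assumptions, not the authors' controller or the optimized thresholds the co-simulation finds; they only show the kind of decision the optimization tunes.

```python
# Hedged sketch of a threshold rule for movable window insulation:
# deploy when the surface is losing heat with little solar gain available,
# retract otherwise to collect passive solar gains. Thresholds are
# placeholders; in the paper they are optimized via EnergyPlus/MATLAB.

def insulation_deployed(heat_flux_out, solar_incident, t_out, t_in,
                        flux_thresh=5.0, solar_thresh=150.0, temp_thresh=2.0):
    """Return True when the insulation should cover the glazing."""
    losing_heat = heat_flux_out > flux_thresh      # W/m^2 leaving the zone
    little_sun = solar_incident < solar_thresh     # W/m^2 on the surface
    colder_outside = (t_in - t_out) > temp_thresh  # K
    return losing_heat and little_sun and colder_outside

# Winter night: deploy. Sunny winter day: retract for passive gains.
print(insulation_deployed(25.0, 0.0, -5.0, 21.0))    # True
print(insulation_deployed(25.0, 600.0, 5.0, 21.0))   # False
```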
Procedia PDF Downloads 176
13530 Interaction Between Task Complexity and Collaborative Learning on Virtual Patient Design: The Effects on Students’ Performance, Cognitive Load, and Task Time
Authors: Fatemeh Jannesarvatan, Ghazaal Parastooei, Jimmy Frerejan, Saedeh Mokhtari, Peter Van Rosmalen
Abstract:
Medical and dental education increasingly emphasizes the acquisition, integration, and coordination of complex knowledge, skills, and attitudes that can be applied in practical situations. Instructional design approaches have focused on using real-life tasks in order to facilitate complex learning in both real and simulated environments. The four-component instructional design (4C/ID) model has become a useful guideline for designing instructional materials that improve learning transfer, especially in health professions education. The objective of this study was to apply the 4C/ID model to the creation of virtual patients (VPs) that dental students can use to practice their clinical management and clinical reasoning skills. The study first explored the context and concept of complicating factors and common errors for novices and how they can affect the design of a virtual patient program. It then selected key dental information and considered the content needs of dental students. The design of the virtual patients was based on the fundamental principles of the 4C/ID model: designing learning tasks that reflect real patient scenarios, with different levels of task complexity to challenge students to apply their knowledge and skills in different contexts; creating varied learning materials that support students during the VP program and are closely integrated with the learning tasks and the students' curricula; providing cognitive feedback at different levels of the program; and providing procedural information, with students following a step-by-step process from history taking to writing a comprehensive treatment plan. Four virtual patients were designed using these principles, and an experimental design was used to test their effectiveness in achieving the intended educational outcomes.
The 4C/ID model provides an effective framework for designing engaging and successful virtual patients that support the transfer of knowledge and skills to dental students. However, there are challenges and pitfalls that instructional designers should take into account when developing these educational tools.
Keywords: 4C/ID model, virtual patients, education, dental, instructional design
Procedia PDF Downloads 82
13529 Alvaro Siza’s Design Strategy: An Insight into Critical Regionalism
Authors: Rahmatollah Amirjani
Abstract:
With the emergence of the debate over the failure of Regionalism in the late 1970s, Critical Regionalism was introduced as a different way to respond to the state of architecture in the post-war era. Critical Regionalism is most often understood as a discourse that not only mediates the language of modern architecture with local cultures but also revives the relation between architecture and spectator as indexed by capitalism. Since the inception of Critical Regionalism, a large number of architectural practices have emerged around the globe; among works associated with the discourse, however, that of the well-known Portuguese architect Álvaro Siza is considered a unique case. This paper responds to a number of questions, including: What are the origins of Critical Regionalism? How does Siza’s design strategy correspond to the themes of Critical Regionalism? How does Siza recover the relation between object and subject in most of his projects? Using Siza’s housing project for the Malagueira district in Évora, Portugal, the article attempts to answer these questions and to highlight Álvaro Siza’s design procedure, which goes beyond the existing discourse of Critical Regionalism and contributes to our understanding of this practice.
Keywords: Alvaro Siza, critical regionalism, Malagueira housing, placelessness
Procedia PDF Downloads 180
13528 Effects of Four Dietary Oils on Cholesterol and Fatty Acid Composition of Egg Yolk in Layers
Authors: A. F. Agboola, B. R. O. Omidiwura, A. Oyeyemi, E. A. Iyayi, A. S. Adelani
Abstract:
Dietary cholesterol has elicited great public interest because of its association with coronary heart disease; consumers have accordingly paid more attention to health and reduced their consumption of cholesterol-rich foods. Eggs are considered one of the major sources of dietary cholesterol in humans. An alternative way to reduce the potential cholesterolemic effect of eggs, however, is to modify the fatty acid composition of the yolk. The effects of supplementing layers' diets with palm oil (PO), soybean oil (SO), sesame seed oil (SSO), and fish oil (FO) on egg yolk fatty acids, cholesterol, egg production, and egg quality parameters were evaluated in a 42-day feeding trial. One hundred and five Isa Brown laying hens, 34 weeks of age, were randomly distributed into seven groups of five replicates with three birds per replicate in a completely randomized design. Seven corn-soybean basal diets (BD) were formulated: BD + no oil (T1), BD + 1.5% PO (T2), BD + 1.5% SO (T3), BD + 1.5% SSO (T4), BD + 1.5% FO (T5), BD + 0.75% SO + 0.75% FO (T6), and BD + 0.75% SSO + 0.75% FO (T7). Five eggs were randomly sampled at day 42 from each replicate to assay the cholesterol and fatty acid profile of the egg yolk and to assess egg quality. The results showed no significant (P>0.05) differences in production performance, egg cholesterol, or egg quality parameters except for yolk height, albumen height, yolk index, egg shape index, Haugh unit, and yolk colour. No significant differences (P>0.05) were observed in the total cholesterol, high-density lipoprotein, or low-density lipoprotein levels of egg yolk across the treatments. However, the diets affected (P<0.05) the TAG (triacylglycerol) and VLDL (very-low-density lipoprotein) content of the egg yolk. The highest TAG (603.78 mg/dl) and VLDL (120.76 mg/dl) values were recorded in eggs of hens on T4 (1.5% sesame seed oil) and were similar to those on T3 (1.5% soybean oil), T5 (1.5% fish oil), and T6 (0.75% soybean oil + 0.75% fish oil).
The results also revealed significant (P<0.05) variation in the eggs’ total polyunsaturated fatty acids (PUFA). In conclusion, dietary oils could be included in layers’ diets to produce designer eggs low in cholesterol and high in PUFA, especially omega-3 fatty acids.
Keywords: dietary oils, egg cholesterol, egg fatty acid profile, egg quality parameters
Procedia PDF Downloads 311
13527 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance
Authors: Aleksandra Czubek
Abstract:
As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. 
AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.
Keywords: ip, technology, copyright, data, infringement, comparative analysis
Procedia PDF Downloads 20
13526 Evaluation of Inceptor Design for Manned Multicopter
Authors: Jędrzej Minda
Abstract:
In aviation, only a narrow spectrum of control inceptors exists: centre sticks, side-sticks, pedals, and yokes. However, new types of aircraft are emerging, and with them a need for new inceptors. In the manned multicopter created at AGH University of Science and Technology, the pilot adopts a specific orientation in which classical inceptors may be impractical to use. This paper describes a unique kind of control inceptor that aims to provide handling qualities comparable to standard solutions while giving the pilot a firm grip point without the risk of involuntary stick movement. Simulations of the pilot-inceptor model were performed to compare the dynamic amplification factors of the proposed design with those of a classical one. A functional prototype was built, on which drone pilots carried out a comfort-of-use evaluation. The paper provides a general overview of the project, including a literature review and the reasoning behind component selection and mechanism design, finalized by conclusions.
Keywords: mechanisms, mechatronics, embedded control, serious gaming for training rescue missions, rescue robotics
Procedia PDF Downloads 83
13525 Analyzing the Ancient Islamic Architectural Theories: Role of Geometric Proportionality as a Principle of Islamic Design
Authors: Vamsi G.
Abstract:
The majority of modern-day structures have little aesthetic value, meeting only minimum requirements set by foreign tribes. Numerous elements of traditional architecture can be incorporated into modern designs, using appropriate principles, to improve and enhance the functionality, aesthetics, and usability of a space. This paper reviews the diminishing ancient values of traditional Islamic architecture. By reintroducing them into modern structures such as commercial, residential, and recreational spaces, at least in the Islamic states, the functionality of those spaces can be improved. To this end, aspects such as space planning, aesthetics, scale, hierarchy, value, and pattern are examined in modern-day structures. Case studies of a few ancient Islamic architectural marvels are presented to elaborate the argument, and a brief analysis of materials and execution strategies is also included. The analysis is formulated so that spaces can be designed or redesigned using traditional Islamic principles and elements of design, improving the quality of modern architecture through the study of ancient Islamic architectural theories. For this, sources on the history and evolution of this architecture have been studied, and elements and principles of design from case studies of various mosques, forts, tombs, and palaces have been tabulated. The accumulated data will help revive, in functional and aesthetic ways, the elements shaped by ancient principles. In this way, one of the most astonishing architectural styles can be conserved, reinstated in modern buildings, and remembered.
Keywords: ancient architecture, architectural history, Islamic architecture, principles and elements
Procedia PDF Downloads 213
13524 Optimization of a Flux Switching Permanent Magnet Machine Using Laminated Segmented Rotor
Authors: Seyedmilad Kazemisangdehi, Seyedmehdi Kazemisangdehi
Abstract:
Flux switching permanent magnet machines are considered for a wide range of applications because of their outstanding merits, including high torque/power density, high efficiency, and a simple, robust rotor structure. Several topologies have therefore been proposed, such as the PM-excited flux switching machine and the hybrid-excited flux switching type. Recently, a novel laminated segmented rotor flux switching permanent magnet machine was introduced. It features flux barriers in the rotor structure that enhance machine performance, reducing torque ripple while simultaneously improving torque and efficiency. However, the design of the barriers was not optimized by its authors. Therefore, in this paper, three coefficients describing the position of the barriers are considered for optimization. The effect of each coefficient on the performance of the machine is investigated by the finite element method, and an optimized design of the flux barriers based on these three coefficients is finally proposed from different points of view, including electromagnetic torque maximization and cogging torque/torque ripple minimization. At the optimum design for maximum developed torque, the machine generates 0.65 Nm more torque than the non-optimized design, with an almost 0.4% improvement in efficiency.
Keywords: finite element analysis, FSPM, laminated segmented rotor flux switching permanent magnet machine, optimization
Procedia PDF Downloads 231
13523 Customized Design of Amorphous Solids by Generative Deep Learning
Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang
Abstract:
The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigmatic shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates the state-of-the-art unsupervised generative adversarial network model with supervised models, allowing the incorporation of general prior knowledge derived from thousands of data points across a vast range of alloy compositions, into the creation of data points for a specific type of composition, which overcame the common issue of data scarcity typically encountered in the design of a given type of metallic glasses. Using our generative model, we have successfully designed copper-based metallic glasses, which display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions in the targeted compositional space but also permits self-improvement after experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, hence paving the way for the customized design of amorphous solids without human intervention.
Keywords: metallic glass, artificial intelligence, mechanical property, automated generation
Procedia PDF Downloads 57