Search results for: equivalent transformation algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4376

3416 Continuous Improvement as an Organizational Capability in the Industry 4.0 Era

Authors: Lodgaard Eirin, Myklebust Odd, Eleftheriadis Ragnhild

Abstract:

Continuous improvement is increasingly a prerequisite for manufacturing companies to remain competitive in a global market. In addition, future survival and success will depend on the ability to manage the forthcoming digital transformation of the Industry 4.0 era. Industry 4.0 promises substantially increased operational effectiveness, where all equipment is equipped with integrated processing and communication capabilities. Subsequently, the interplay of human and technology will evolve and influence the range of worker tasks and demands. Taking these changes into account, the concept of continuous improvement must evolve accordingly. Based on a case study from the manufacturing industry, the purpose of this paper is to point out what the concept of continuous improvement will encounter and has to take into consideration when entering the fourth industrial revolution. In the past, continuous improvement focused on a culture of sustained improvement targeting the elimination of waste in all systems and processes of an organization by involving everyone. Today, it has to evolve with the forthcoming digital transformation and the increased interplay of human and digital communication systems to reach its full potential. One main finding of this study is how digital communication systems will act as an enabler to strengthen the continuous improvement process, by moving from collaboration within individual teams to interconnection of teams along the product value chain. For academics and practitioners, it will help them to identify and prioritize their steps towards an Industry 4.0 implementation integrated with a focus on continuous improvement.

Keywords: continuous improvement, digital communication system, human-machine-interaction, industry 4.0, team performance

Procedia PDF Downloads 184
3415 A Survey on Intelligent Traffic Management with Cooperative Driving in Urban Roads

Authors: B. Karabuluter, O. Karaduman

Abstract:

Traffic management and traffic planning are important issues, especially in big cities. Due to the increase in personal vehicles and the physical constraints of urban roads, the problem of transportation, especially in crowded cities, becomes more severe over time. This situation reduces living standards and can put human life at risk, because vehicles such as ambulances and fire engines are prevented from reaching their targets. Even if city planners take these problems into account, emergency planning and traffic management are needed to avoid cases such as traffic congestion at intersections and traffic jams caused by traffic accidents or roadworks. In this study, proposed solutions for smart traffic management using intelligent vehicles acting in cooperation with urban roads are examined. Traffic management is becoming more difficult due to factors such as fatigue, carelessness, sleeplessness, social behavior patterns, and lack of education. However, autonomous vehicles, which remove the problems caused by human weaknesses by providing driving control, are increasing the success of the algorithms developed for city traffic management. Such intelligent vehicles have become an important solution in urban life by using 'swarm intelligence' algorithms and cooperative driving methods to maintain traffic flow, prevent traffic accidents, and increase living standards. In this study, work conducted in this area is reviewed in terms of traffic jams, intersections, regulation of traffic flow, signaling, prevention of traffic accidents, cooperation and communication techniques of vehicles, fleet management, and the transportation of emergency vehicles. From these concepts, several taxonomies are derived. This work helps to develop new solutions and algorithms for cities in which intelligent vehicles capable of cooperative driving can operate, and at the same time emphasizes the trends in this area.

Keywords: intelligent traffic management, cooperative driving, smart driving, urban road, swarm intelligence, connected vehicles

Procedia PDF Downloads 316
3414 Multiaxial Stress Based High Cycle Fatigue Model for Adhesive Joint Interfaces

Authors: Martin Alexander Eder, Sergei Semenov

Abstract:

Many glass-epoxy composite structures, such as large utility wind turbine rotor blades (WTBs), comprise adhesive joints with typically thick bond lines used to connect the different components during assembly. Performance optimization of rotor blades to increase power output while simultaneously maintaining high stiffness-to-low-mass ratios entails intricate geometries in conjunction with complex anisotropic material behavior. Consequently, adhesive joints in WTBs are subject to multiaxial stress states with significant stress gradients depending on the local joint geometry. Moreover, the dynamic aero-elastic interaction of the WTB with the airflow generates non-proportional, variable amplitude stress histories in the material. Experience shows that a prominent failure type in WTBs is high cycle fatigue failure of adhesive bond line interfaces, which has over time developed into a design driver as WTB sizes increase rapidly. Structural optimization employed at an early design stage therefore sets high demands on computationally efficient interface fatigue models capable of predicting the critical locations prone to interface failure. The numerical stress-based interface fatigue model presented in this work uses the Drucker-Prager criterion to compute three different damage indices corresponding to the two interface shear tractions and the outward normal traction. The two-parameter Drucker-Prager model was chosen because of its ability to consider shear strength enhancement under compression and shear strength reduction under tension. The governing interface damage index is taken as the maximum of the three. The damage indices are computed through the well-known linear Palmgren-Miner rule after separate rainflow counting of the equivalent shear stress history and the equivalent pure normal stress history. The equivalent stress signals are obtained by self-similar scaling of the Drucker-Prager surface, whose shape is defined by the uniaxial tensile strength and the shear strength, such that it intersects the stress point at every time step. This approach implicitly assumes that the damage caused by the prevailing multiaxial stress state is the same as the damage caused by an amplified equivalent uniaxial stress state in the three interface directions. The model was implemented as a Python plug-in for the commercially available finite element code Abaqus for use with solid elements. The model was used to predict the interface damage of an adhesively bonded, tapered glass-epoxy composite cantilever I-beam tested by LM Wind Power under constant amplitude compression-compression tip load in the high cycle fatigue regime. Results show that the model was able to predict the location of debonding in the adhesive interface between the webfoot and the cap. Moreover, with a set of two different constant life diagrams, namely in shear and tension, it was possible to predict both the fatigue lifetime and the failure mode of the sub-component with reasonable accuracy. It can be concluded that the fidelity, robustness and computational efficiency of the proposed model make it especially suitable for rapid fatigue damage screening of large 3D finite element models subject to complex dynamic load histories.
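
As a minimal illustration of the damage-accumulation step described above (the linear Palmgren-Miner rule applied to rainflow-counted equivalent stress cycles), the sketch below assumes a Basquin-type S-N curve and pre-counted cycles; it does not implement the Drucker-Prager scaling, the constant life diagrams, or the Abaqus plug-in of the actual model, and all numbers are placeholders.

```python
import numpy as np

def basquin_cycles_to_failure(stress_range, A=400.0, m=8.0):
    """Assumed Basquin-type S-N curve: N = (A / stress_range)^m.
    A and m are placeholder constants, not calibrated adhesive data."""
    return (A / stress_range) ** m

def miner_damage(counted_cycles):
    """Linear Palmgren-Miner damage sum over rainflow-counted cycles.
    `counted_cycles` is a list of (equivalent stress range in MPa, cycle count) pairs,
    e.g. the output of a rainflow count of an equivalent stress history."""
    return sum(n / basquin_cycles_to_failure(s) for s, n in counted_cycles)

# Hypothetical rainflow-counted equivalent shear and normal stress histories
shear_cycles = [(35.0, 2.0e5), (50.0, 1.0e4), (80.0, 2.0e2)]
normal_cycles = [(25.0, 3.0e5), (40.0, 2.0e4)]

damage_indices = {
    "shear": miner_damage(shear_cycles),
    "normal": miner_damage(normal_cycles),
}
# Governing interface damage index = maximum over the individual indices
print(damage_indices, "-> governing index:", max(damage_indices.values()))
```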

Keywords: adhesive, fatigue, interface, multiaxial stress

Procedia PDF Downloads 154
3413 Implementation of Successive Interference Cancellation Algorithms in the 5G Downlink

Authors: Mokrani Mohamed Amine

Abstract:

In this paper, we have implemented successive interference cancellation algorithms in the 5G downlink. We have calculated the maximum throughput in Frequency Division Duplex (FDD) mode in the downlink, where we obtained a value equal to 836932 b/ms. The transmitter is of Multiple Input Multiple Output (MIMO) type with eight transmitting and receiving antennas. Each of the eight antennas simultaneously transmits a data rate of 104616 b/ms that contains the binary messages of the three users; in this case, the Cyclic Redundancy Check (CRC) is negligible, and the MIMO category is spatial diversity. The technology used for this is called Non-Orthogonal Multiple Access (NOMA) with Quadrature Phase Shift Keying (QPSK) modulation. The transmission is done in a Rayleigh fading channel with the presence of obstacles. The MIMO Successive Interference Cancellation (SIC) receiver with two transmitting and receiving antennas recovers its binary message without errors for certain values of transmission power such as 50 dBm, with 0.054485% errors when the transmitted power is 20 dBm and with 0.00286763% errors for a transmitted power of 32 dBm (in the case of user 1), as well as with 0.0114705% errors when the transmitted power is 20 dBm and with 0.00286763% errors for a power of 24 dBm (in the case of user 2), by applying the steps involved in SIC.
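
As an illustration of the SIC receiver steps described above, the following is a minimal sketch of power-domain NOMA with successive interference cancellation for two users over a flat Rayleigh fading channel with QPSK symbols. The power split, noise level, single-antenna setup and bit counts are illustrative assumptions and do not reproduce the paper's 5G MIMO configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def qpsk_mod(bits):
    # Gray-mapped QPSK: pairs of bits -> unit-energy complex symbols
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def qpsk_demod(sym):
    b0 = (sym.real < 0).astype(int)
    b1 = (sym.imag < 0).astype(int)
    return np.column_stack((b0, b1)).reshape(-1)

n_bits = 20000                      # per user (illustrative)
p1, p2 = 0.8, 0.2                   # power split: user 1 (far) gets more power
noise_var = 0.01

bits1 = rng.integers(0, 2, n_bits)
bits2 = rng.integers(0, 2, n_bits)
x = np.sqrt(p1) * qpsk_mod(bits1) + np.sqrt(p2) * qpsk_mod(bits2)  # superposition coding

# Flat Rayleigh fading plus AWGN (single antenna for simplicity)
h = (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size)) / np.sqrt(2)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))
y = h * x + noise

# --- SIC at the near user's receiver ---
y_eq = y / h                                        # zero-forcing equalization
bits1_hat = qpsk_demod(y_eq)                        # 1) decode the strong (high-power) signal
s1_hat = np.sqrt(p1) * qpsk_mod(bits1_hat)
bits2_hat = qpsk_demod(y_eq - s1_hat)               # 2) subtract it, then decode own signal

print("BER user 1:", np.mean(bits1 != bits1_hat))
print("BER user 2:", np.mean(bits2 != bits2_hat))
```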

Keywords: 5G, NOMA, QPSK, TBS, LDPC, SIC, capacity

Procedia PDF Downloads 88
3412 Comparative Study on Hydrothermal Carbonization as Pre- and Post-treatment of Anaerobic Digestion of Dairy Sludge: Focus on Energy Recovery, Resources Transformation and Hydrochar Utilization

Authors: Mahmood Al Ramahi, G. Keszthelyi-Szabo, S. Beszedes

Abstract:

Hydrothermal carbonization (HTC) is a thermochemical reaction that utilizes saturated water and vapor pressure to convert waste biomass into C-rich products. This work evaluated the effect of HTC as a pre- and post-treatment technique for anaerobic digestion (AD) of dairy sludge, as information in this field is still in its infancy, with many research and methodological gaps. The HTC effect was evaluated based on energy recovery, nutrient transformation, and sludge biodegradability. The first treatment approach applied HTC under a range of temperatures prior to mesophilic AD of dairy sludge. Results suggested an optimal pretreatment temperature of 210 °C for 30 min. HTC pretreatment increased the methane yield and chemical oxygen demand removal. The theoretical model based on Boyle's equation matched the experimental results very closely. On the other hand, applying HTC subsequent to AD increased total energy production, as an additional energy yield was obtained from the solid fuel (hydrochar) besides the produced biogas. Furthermore, hydrothermal carbonization of the AD digestate generated liquid products (HTC digestate) with improved chemical characteristics, suggesting their use as liquid fertilizers.
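
For context, the theoretical methane yield mentioned above is commonly estimated from a substrate's elemental composition using the Buswell-Boyle stoichiometry. The sketch below illustrates that generic calculation only; the elemental composition shown is a made-up placeholder, not the characterization of the dairy sludge studied.

```python
def boyle_methane_yield(a, b, c, d, e=0.0):
    """Theoretical methane yield (L CH4 per g of substrate, at STP) for
    an organic substrate CaHbOcNdSe, using the Buswell-Boyle equation."""
    ch4_mol = a / 2 + b / 8 - c / 4 - 3 * d / 8 - e / 4   # mol CH4 per mol substrate
    molar_mass = 12 * a + 1 * b + 16 * c + 14 * d + 32 * e
    return 22.4 * ch4_mol / molar_mass                     # 22.4 L/mol at STP

# Hypothetical composition, for illustration only (not the studied sludge)
print(round(boyle_methane_yield(a=5, b=9, c=2.5, d=0.7), 3), "L CH4/g")
```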

Keywords: hydrothermal carbonization, anaerobic digestion, energy balance, sludge biodegradability, biogas

Procedia PDF Downloads 169
3411 Perfectly Matched Layer Boundary Stabilized Using Multiaxial Stretching Functions

Authors: Adriano Trono, Federico Pinto, Diego Turello, Marcelo A. Ceballos

Abstract:

Numerical modeling of dynamic soil-structure interaction problems requires an adequate representation of the unbounded characteristics of the ground, the material non-linearity of soils, and geometrical non-linearities such as large displacements due to rocking of the structure. In order to account for these effects simultaneously, it is often required that the equations of motion be solved in the time domain. However, boundary conditions in conventional finite element codes generally present shortcomings in fully absorbing the energy of outgoing waves. In this sense, the Perfectly Matched Layer (PML) technique allows a satisfactory absorption of inclined body waves, as well as surface waves. However, the PML domain is inherently unstable, meaning that its instability does not depend upon the discretization considered. One way to stabilize the PML domain is to use multiaxial stretching functions. This development is questionable, however, because some Jacobian terms of the coordinate transformation are not accounted for. For this reason, the resulting absorbing layer element is often referred to as the "uncorrected M-PML" in the literature. In this work, the strong formulation of the "corrected M-PML" absorbing layer is proposed using multiaxial stretching functions that incorporate all terms of the coordinate transformation. The results of the stable model are compared with reference solutions obtained from extended domain models.

Keywords: mixed finite elements, multiaxial stretching functions, perfectly matched layer, soil-structure interaction

Procedia PDF Downloads 57
3410 Estimation of the Exergy-Aggregated Value Generated by a Manufacturing Process Using the Theory of the Exergetic Cost

Authors: German Osma, Gabriel Ordonez

Abstract:

The production of metal-rubber spare parts for vehicles is a sequential process that consists of the transformation of raw material through cutting activities and chemical and thermal treatments, which demand electricity and fossil fuels. Energy efficiency analysis for these cases is mostly focused on the study of each machine or production step, but it is not common to study the quality that the production process achieves from an aggregated-value viewpoint, which can be used as a quality measurement for determining the impact on the environment. In this paper, the theory of exergetic cost is used to determine the aggregated exergy of three metal-rubber spare parts, based on an exergy analysis and a thermoeconomic analysis. The manufacturing of these parts is based on a batch production technique; therefore, the use of this theory for discontinuous flows is proposed, starting from single models of workstations; subsequently, the complete exergy model of each product is built using flowcharts. These models are a representation of the exergy flows between components of the machines according to electrical, mechanical and/or thermal expressions; they determine the exergy demanded to produce the effective transformation in raw materials (aggregated exergy value) and the exergy losses caused by equipment and irreversibilities. The energy resources of the manufacturing process are electricity and natural gas. The workstations considered are lathes, punching presses, cutters, a zinc machine, chemical treatment tanks, hydraulic vulcanizing presses and a rubber mixer. The thermoeconomic analysis was done by workstation and by spare part; the first describes the operation of the components of each machine and where the exergy losses occur, while the second estimates the exergy-aggregated value for the finished product and the wasted feedstock. Results indicate that the exergy efficiency of a mechanical workstation is between 10% and 60%, while this value in the thermal workstations is less than 5%; also, each effective exergy-aggregated value is about one-thirtieth of the total exergy required for operation of the manufacturing process, which amounts to approximately 2 MJ. These losses are caused mainly by technical limitations of the machines, oversizing of metal feedstock that demands more mechanical transformation work, and low thermal insulation of the chemical treatment tanks and hydraulic vulcanizing presses. From this information, it is possible to appreciate the usefulness of the theory of exergetic cost for analyzing aggregated value in manufacturing processes.

Keywords: exergy-aggregated value, exergy efficiency, thermoeconomics, exergy modeling

Procedia PDF Downloads 155
3409 Physical Aspects of Shape Memory and Reversibility in Shape Memory Alloys

Authors: Osman Adiguzel

Abstract:

Shape memory alloys belong to a class of smart materials, exhibiting a peculiar property called the shape memory effect. This property is characterized by the recoverability of two particular shapes of the material at different temperatures. These materials are often called smart materials due to their functionality and their capacity to respond to changes in the environment. Shape memory materials are used as shape memory devices in many interdisciplinary fields such as medicine, bioengineering, metallurgy, the building industry and many engineering fields. The shape memory effect is performed thermally by heating and cooling after first cooling and stressing treatments, and this behavior is called thermoelasticity. This effect is based on martensitic transformations characterized by changes in the crystal structure of the material. The shape memory effect is the result of successive thermally and stress-induced martensitic transformations. Shape memory alloys exhibit thermoelasticity and superelasticity by means of deformation in the low-temperature product phase and the high-temperature parent phase region, respectively. Superelasticity is performed by stressing and releasing the material in the parent phase region. Loading and unloading paths are different in the stress-strain diagram, and the cycling loop reveals energy dissipation. The strain energy is stored after releasing, and these alloys are mainly used as deformation-absorbent materials in the control of civil structures subjected to seismic events, due to the absorption of strain energy during any disaster or earthquake. Thermally induced martensitic transformation occurs on cooling, along with lattice twinning with cooperative movements of atoms by means of lattice invariant shears, and ordered parent phase structures turn into twinned martensite structures; twinned structures then turn into detwinned structures by means of stress-induced martensitic transformation upon stressing the material in the martensitic condition. Thermally induced transformation occurs with the cooperative movements of atoms in two opposite <110>-type directions on the {110}-type planes of the austenite matrix, which is the basal plane of martensite. Copper-based alloys exhibit this property in the metastable β-phase region, which has bcc-based structures in the high-temperature parent phase field. Lattice invariant shear and twinning are not uniform in copper-based ternary alloys and give rise to the formation of complex layered structures, depending on the stacking sequences on the close-packed planes of the ordered parent phase lattice. In the present contribution, X-ray diffraction and transmission electron microscopy (TEM) studies were carried out on two copper-based CuAlMn and CuZnAl alloys. X-ray diffraction profiles and electron diffraction patterns reveal that both alloys exhibit superlattice reflections inherited from the parent phase due to the displacive character of the martensitic transformation. X-ray diffractograms taken over a long time interval show that the diffraction angles and intensities of the diffraction peaks change with the aging duration at room temperature. In particular, some of the successive peak pairs providing a special relation between Miller indices come close to each other. This result points to a rearrangement of atoms in a diffusive manner.

Keywords: shape memory effect, martensitic transformation, reversibility, superelasticity, twinning, detwinning

Procedia PDF Downloads 172
3408 The Human Rights of Women in Brazilian Territory: A Literature Review of the Axes of the National Human Rights Program III

Authors: Ana Luiza Casasanta Garcia, Maria Del Carmen Cortizo

Abstract:

From the classic contractualists and the early declarations of modern rights, discussions on policies for the protection and promotion of human rights have been highlighted in an attempt to ensure the realization of human dignity and its values, which are (re)negotiated according to the needs evidenced in each historical and contextual moment. Aiming at guaranteeing human rights to Brazilian citizens, the Third National Human Rights Program (PNDH III), created in 2009, updated in 2010 and currently in force, highlights guidelines and recommendations to guarantee human rights, among them the rights of women in Brazil. Based on this document, this article aims to locate historically and culturally the understanding of human rights related to the rights of women in Brazilian territory, from the analysis of the guiding axes on women's rights of the PNDH III. In methodological terms, a qualitative approach and documentary research were used to analyze the data according to critical discourse analysis. As a result, it has been found that the process of building and maintaining the guarantee of women's human rights needs a reformulation that also implies a social revolution. This is justified by the fact that, even with the provision in the PNDH III that, in order to guarantee the rights of women, it is necessary, for example, to adapt the Penal Code to the decriminalization of abortion and the professionalization of prostitution, these points are still very controversial and are not put into practice by the State. Finally, the importance of the critique of politics and of the current system of production of understandings in favor of this social transformation is emphasized.

Keywords: human rights of women, social transformation, national human rights program III, public politics

Procedia PDF Downloads 116
3407 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization

Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva

Abstract:

This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is an NP-hard problem concerning unrelated parallel machines with sequence-dependent setup times. This research uses real data regarding a weaving industry located in the North of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. Besides, this study includes a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested, to offer a solution that can generate reliable results regarding due dates. All the approaches will be tested in the operational environment and the KPIs monitored, to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that allows the automatic generation of optimized production plans, aiming at tardiness minimization.
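
The sketch below is a minimal, self-contained illustration of a permutation-based genetic algorithm of the kind described above: a chromosome is decoded onto unrelated parallel machines with sequence-dependent setups and scored by total tardiness. The toy instance, operators, and parameter values are assumptions for illustration and do not reproduce the weaving factory data or the authors' actual encoding.

```python
import random

random.seed(1)

# Toy instance (illustrative, not the factory data): unrelated parallel looms,
# sequence-dependent setups, processing times and due dates in hours.
N_JOBS, N_MACHINES = 30, 4
proc = [[random.randint(5, 40) for _ in range(N_MACHINES)] for _ in range(N_JOBS)]
due = [random.randint(40, 300) for _ in range(N_JOBS)]
setup = [[random.randint(1, 8) for _ in range(N_JOBS)] for _ in range(N_JOBS)]

def total_tardiness(perm):
    """Decode a job permutation: each job goes to the machine that can finish it earliest."""
    m_time = [0.0] * N_MACHINES
    m_last = [None] * N_MACHINES
    tardiness = 0.0
    for j in perm:
        best_m = min(
            range(N_MACHINES),
            key=lambda m: m_time[m] + (setup[m_last[m]][j] if m_last[m] is not None else 0) + proc[j][m],
        )
        s = setup[m_last[best_m]][j] if m_last[best_m] is not None else 0
        m_time[best_m] += s + proc[j][best_m]
        m_last[best_m] = j
        tardiness += max(0.0, m_time[best_m] - due[j])
    return tardiness

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(N_JOBS), 2))
    child = [None] * N_JOBS
    child[a:b] = p1[a:b]
    fill = [j for j in p2 if j not in child]
    for i in range(N_JOBS):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(perm, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(N_JOBS), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

pop = [random.sample(range(N_JOBS), N_JOBS) for _ in range(60)]
for gen in range(200):
    pop.sort(key=total_tardiness)           # rank by fitness (lower tardiness is better)
    elite = pop[:10]
    offspring = []
    while len(offspring) < 50:
        p1, p2 = random.sample(elite + pop[10:30], 2)   # simple truncation selection
        offspring.append(mutate(order_crossover(p1, p2)))
    pop = elite + offspring

best = min(pop, key=total_tardiness)
print("Best total tardiness found:", total_tardiness(best))
```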

Keywords: genetic algorithms, textile industry, job scheduling, optimization

Procedia PDF Downloads 140
3406 The Impacts of the COVID-19 Pandemic on Social Activities and Residential Areas

Authors: Asghar Motea Noparvar

Abstract:

According to the World Health Organization (WHO), the coronavirus disease (COVID-19), which emerged in December 2019 and has since been characterized as a pandemic, is attacking societies in different ways. This is much more than a health crisis; it is a human, economic and social crisis. Since December 2019, not only have significant transformations happened in human life, but there have also been changes in mental health, daily life activities, and even urban space. The purpose of this study is to describe some tangible transformations in society brought about by two main restrictions, 'lockdown' and 'social distancing', and how people took refuge in their homes and fit every activity there. How this pandemic has been transforming human life and social activities is the main issue of this study. The impacts of COVID-19 on social life are reviewed through the literature and considered in light of the 'risk society' theory, credited to the German sociologist Ulrich Beck. Additionally, COVID-19 not only had a direct impact on health but also had significant impacts on the economy, education, tourism, the environment, and the construction industry. Therefore, the pandemic caused a disruption in the whole urban system. In this study, the main focus is the transformation of activities and residential areas. To this end, the literature on COVID-19 and its impacts on social life is analyzed. To sum up, it can be concluded that a pandemic can change social life along with the other transformations it is able to bring about.

Keywords: infectious disease, COVID-19, social activities, residential areas, transformation

Procedia PDF Downloads 66
3405 Comparative Performance of Standing Whole Body Monitor and Shielded Chair Counter for In-vivo Measurements

Authors: M. Manohari, S. Priyadharshini, K. Bajeer Sulthan, R. Santhanam, S. Chandrasekaran, B. Venkatraman

Abstract:

The in-vivo monitoring facility at the Indira Gandhi Centre for Atomic Research (IGCAR), Kalpakkam, caters to the monitoring of internal exposure of occupational radiation workers from various radioactive facilities of IGCAR. Internal exposure measurement is done using NaI(Tl) based scintillation detectors. Two types of whole-body counters, namely a Shielded Chair Counter (SC) and a Standing Whole-Body Monitor (SWBM), are being used. The Shielded Chair is based on a NaI detector of 20.3 cm diameter and 10.15 cm thickness. The chair of the system is shielded using lead shots of 10 cm lead equivalent and the detector with 8 cm lead bricks. The counting geometry is sitting geometry. Calibration is done using a 95th percentile BOMAB phantom. The Minimum Detectable Activity (MDA) for 137Cs for 60 s is 1150 Bq. The Standing Whole-Body Monitor (SWBM) has two NaI(Tl) detectors of size 10.16 x 10.16 x 40.64 cm3 positioned serially, one over the other. It has a shielding thickness of 5 cm lead equivalent. Counting is done in stand-up geometry. Calibration is done with the help of an Ortec phantom having a uniform distribution of mixed radionuclides for the thyroid, thorax and pelvis. The efficiency of the SWBM is 2.4 to 3.5 times higher than that of the shielded chair in the energy range of 279 to 1332 keV. An MDA of 250 Bq for 137Cs can be achieved with a counting time of 60 s. The MDA for 131I in the thyroid was estimated as 100 Bq from the whole-body MDA for a one-day post-intake measurement. The standing whole-body monitor is better in terms of efficiency, MDA and ease of positioning. In case of emergency situations, the optimal MDAs for an in-vivo monitoring service are 1000 Bq for 137Cs and 100 Bq for 131I. Hence, the SWBM is more suitable for the rapid screening of workers as well as the public in the case of an emergency. When a person reports for counting, there is a potential for external contamination. In the SWBM, it is feasible to discriminate this, as the subject can be counted in anterior or posterior geometry, which is not possible in the SC.

Keywords: minimum detectable activity, shielded chair, shielding thickness, standing whole body monitor

Procedia PDF Downloads 29
3404 The Mechanism Study on the Difference between High and Low Voltage Performance of Li3V2(PO4)3

Authors: Enhui Wang, Qingzhu Ou, Yan Tang, Xiaodong Guo

Abstract:

As one of the most popular polyanionic compounds among lithium-ion cathode materials, Li3V2(PO4)3 has always suffered from low rate capability, especially during cycling between 3 and 4.8 V, which is considered to be related to the ion diffusion resistance and the structural transformation during Li+ de/intercalation. Here, as functions of the cut-off voltage, cycle number and current density, the formation-growth-destruction-repair process of the SEI interfacial film on the surface of the cathode, the structural transformation during charge and discharge, and the de/intercalation kinetics reflected by the electrochemical impedance and the diffusion coefficient have been investigated in detail. The impact of current density, cycle number and cut-off voltage on the interfacial film and the structure was studied specifically. Firstly, the matching between electrolyte and material was investigated; it turned out that the batteries with the high-voltage electrolyte showed the best electrochemical performance. Secondly, AC impedance spectroscopy was used to study the changes in interface impedance and lithium-ion diffusion coefficient; the results showed that current density, cycle number and cut-off voltage influence the interfacial film together, and the factor that changed the interfacial properties most was the key factor. Scanning electron microscopy (SEM) analysis confirmed that the attenuation of the discharge specific capacity was associated with the destruction and repair process of the SEI film. Thirdly, X-ray diffraction was used to study the structural changes, which were also impacted by current density, cycle number and cut-off voltage. The results indicated that the cell volume of Li3V2(PO4)3 increased as the current density increased; cycle number merely influenced the structure of the material; the cell volume decreased first and moved back gradually after two Li-ions had been deintercalated as the charging cut-off voltage increased, and it increased as the number of intercalated Li-ions increased during the discharging process. The changes in interface impedance and lithium-ion diffusion coefficient showed that both increased when the cut-off voltage passed the voltage plateaus and decreased when the cut-off voltage lay between plateaus. Finally, a three-electrode system was adopted for the first time to measure the activation energy of the system; the results indicated that the activation energy of the three-electrode system (22.385 kJ/mol) was much smaller than that of the two-electrode system (40.064 kJ/mol).
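
For reference, activation energies of the kind quoted above are often extracted from the temperature dependence of the charge-transfer resistance obtained by AC impedance, via an Arrhenius fit. The sketch below shows that generic procedure; the resistance and temperature values are invented placeholders, not the authors' three- or two-electrode measurements.

```python
import numpy as np

R_GAS = 8.314  # J/(mol K)

# Hypothetical charge-transfer resistances (ohm) at several temperatures (K);
# placeholders for illustration only.
T = np.array([283.0, 293.0, 303.0, 313.0, 323.0])
R_ct = np.array([210.0, 130.0, 85.0, 58.0, 41.0])

# Arrhenius form: exchange rate ~ 1/R_ct = A * exp(-Ea / (R*T)),
# so ln(1/R_ct) is linear in 1/T with slope -Ea/R.
slope, intercept = np.polyfit(1.0 / T, np.log(1.0 / R_ct), 1)
Ea = -slope * R_GAS
print(f"Apparent activation energy: {Ea / 1000:.1f} kJ/mol")
```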

Keywords: cut-off voltage, de/intercalation kinetics, solid electrolyte interphase film, structural transformation

Procedia PDF Downloads 284
3403 Models, Resources and Activities of Project Scheduling Problems

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, José J. Hernández-Flores, Edith Olaco Garcia

Abstract:

The Project Scheduling Problem (PSP) is a generic name given to a whole class of problems in which the best form, time, resources and costs for project scheduling must be determined. The PSP is an application area related to project management. This paper aims at being a guide to understanding the PSP by presenting a survey of its general parameters: the resources (those elements that realize the activities of a project) and the activities (sets of operations or tasks of a person or organization), the mathematical models of the main variants of the PSP, and the algorithms used to solve those variants. Project scheduling is an important task in project management. This paper contains mathematical models, resources, activities, and algorithms of project scheduling problems. The project scheduling problem has attracted researchers from the automotive industry, steel manufacturing, medical research, pharmaceutical research, telecommunications, aviation, software development, manufacturing management, innovation and technology management, the construction industry, government project management, financial services, machine scheduling, transportation management, and others. Project managers need to finish a project with the minimum cost and the maximum quality.

Keywords: PSP, combinatorial optimization problems, project management, manufacturing management, technology management

Procedia PDF Downloads 402
3402 Blockchain-Based Decentralized Architecture for Secure Medical Records Management

Authors: Saeed M. Alshahrani

Abstract:

This research integrated blockchain technology to reform medical records management in healthcare informatics. It was aimed at resolving the limitations of centralized systems by establishing a secure, decentralized, and user-centric platform. The system was architected with a sophisticated three-tiered structure, integrating advanced cryptographic methodologies, consensus algorithms, and the Fast Healthcare Interoperability Resources (HL7 FHIR) standard to ensure data security, transaction validity, and semantic interoperability. The research has profound implications for healthcare delivery, patient care, legal compliance, operational efficiency, and academic advancements in blockchain technology and the healthcare IT sector. The methodology adopted in this research comprises a preliminary feasibility study, a literature review, design and development, cryptographic algorithm integration, data modeling, and system testing. The research employed a permissioned blockchain with a Practical Byzantine Fault Tolerance (PBFT) consensus algorithm and Ethereum-based smart contracts. It integrated advanced cryptographic algorithms, role-based access control, multi-factor authentication, and RESTful APIs to ensure security, regulate access, authenticate user identities, and facilitate seamless data exchange between the blockchain and legacy healthcare systems. The research contributed to the development of a secure, interoperable, and decentralized system for managing medical records, addressing the limitations of the centralized systems that were previously in place. Future work will delve into optimizing the system further, exploring additional blockchain use cases in healthcare, and expanding the adoption of the system globally, contributing to the evolution of global healthcare practices and policies.
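
The toy sketch below illustrates two of the ingredients discussed above, tamper-evident hash chaining and role-based access control, in plain Python. It is a didactic stand-in only: the system described in the abstract runs on a permissioned blockchain with PBFT consensus, Ethereum-based smart contracts and HL7 FHIR interfaces, none of which are reproduced here.

```python
import hashlib
import json
import time

class MedicalRecordLedger:
    """Toy append-only, hash-chained ledger with a simple role-based access check.
    Illustrative only: a real deployment would sit on a permissioned blockchain
    with a consensus protocol (e.g. PBFT) and smart contracts, not a local list."""

    ROLES_ALLOWED_TO_WRITE = {"clinician"}
    ROLES_ALLOWED_TO_READ = {"clinician", "patient", "auditor"}

    def __init__(self):
        self.chain = [self._block(index=0, prev_hash="0" * 64, payload={"genesis": True})]

    def _block(self, index, prev_hash, payload):
        block = {"index": index, "timestamp": time.time(), "prev_hash": prev_hash, "payload": payload}
        block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    def append_record(self, role, patient_id, record):
        if role not in self.ROLES_ALLOWED_TO_WRITE:
            raise PermissionError(f"role '{role}' may not write records")
        prev = self.chain[-1]
        self.chain.append(self._block(len(self.chain), prev["hash"], {"patient": patient_id, "record": record}))

    def read_records(self, role, patient_id):
        if role not in self.ROLES_ALLOWED_TO_READ:
            raise PermissionError(f"role '{role}' may not read records")
        return [b["payload"] for b in self.chain[1:] if b["payload"]["patient"] == patient_id]

    def verify(self):
        """Tamper evidence: recompute every hash and check the chain links."""
        for prev, cur in zip(self.chain, self.chain[1:]):
            body = {k: cur[k] for k in ("index", "timestamp", "prev_hash", "payload")}
            if cur["prev_hash"] != prev["hash"]:
                return False
            if cur["hash"] != hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest():
                return False
        return True

ledger = MedicalRecordLedger()
ledger.append_record("clinician", "patient-42", {"note": "annual check-up", "bp": "120/80"})
print(ledger.read_records("patient", "patient-42"))
print("chain intact:", ledger.verify())
```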

Keywords: healthcare informatics, blockchain, medical records management, decentralized architecture, data security, cryptographic algorithms

Procedia PDF Downloads 44
3401 Meta-Learning for Hierarchical Classification and Applications in Bioinformatics

Authors: Fabio Fabris, Alex A. Freitas

Abstract:

Hierarchical classification is a special type of classification task where the class labels are organised into a hierarchy, with more generic class labels being ancestors of more specific ones. Meta-learning for classification-algorithm recommendation consists of recommending to the user a classification algorithm, from a pool of candidate algorithms, for a dataset, based on the past performance of the candidate algorithms on other datasets. Meta-learning is normally used in conventional, non-hierarchical classification. By contrast, this paper proposes a meta-learning approach for the more challenging task of hierarchical classification, and evaluates it on a large number of bioinformatics datasets. Hierarchical classification is especially relevant for bioinformatics problems, as protein and gene functions tend to be organised into a hierarchy of class labels. This work proposes a meta-learning approach for recommending the best hierarchical classification algorithm for a hierarchical classification dataset. This work's contributions are: 1) proposing an algorithm for splitting hierarchical datasets into new datasets to increase the number of meta-instances, 2) proposing meta-features for hierarchical classification, and 3) interpreting decision-tree meta-models for hierarchical classification algorithm recommendation.
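
As a generic illustration of the meta-learning setup described above (not the authors' meta-features, candidate algorithms, or bioinformatics datasets), the sketch below trains a decision-tree meta-model that maps dataset-level meta-features to the best-performing algorithm, using synthetic meta-data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical meta-dataset: one row per (hierarchical) dataset.
# Meta-features are placeholders, e.g. number of class labels in the hierarchy,
# hierarchy depth, number of instances, class imbalance ratio.
n_meta_instances = 120
meta_X = np.column_stack([
    rng.integers(10, 500, n_meta_instances),      # number of class labels
    rng.integers(2, 8, n_meta_instances),          # hierarchy depth
    rng.integers(100, 20000, n_meta_instances),    # number of instances
    rng.uniform(1.0, 50.0, n_meta_instances),      # imbalance ratio
])
# Label = identifier of the candidate algorithm that performed best on that dataset
# (generated here by an arbitrary synthetic rule, purely for illustration).
meta_y = np.where(meta_X[:, 1] > 4, "global-model", "local-per-node")

meta_model = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(meta_model, meta_X, meta_y, cv=5)
print("meta-model CV accuracy:", scores.mean().round(3))

# Recommend an algorithm for a new, unseen dataset described by its meta-features
meta_model.fit(meta_X, meta_y)
new_dataset = np.array([[250, 6, 5000, 12.0]])
print("recommended algorithm:", meta_model.predict(new_dataset)[0])
```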

Keywords: algorithm recommendation, meta-learning, bioinformatics, hierarchical classification

Procedia PDF Downloads 293
3400 Striking a Balance between Certainty and Flexibility: The Role of Ubuntu in South African Contract Law

Authors: Yeukai Mupangavanhu

Abstract:

The paper examines the concept of ubuntu and the extent to which it can play a role in ensuring fairness and justice in contractual relationships. Courts are expected to balance the sanctity of contract and fairness. Public policy is currently the mechanism used by courts when balancing these two competing interests; it, however, generally favours the freedom and sanctity of contract. The question addressed in this paper is whether the concept of ubuntu is an alternative mechanism that may be used to mitigate the sometimes harsh and unfair consequences of the doctrine of freedom and sanctity of contract. A comparative study and case analysis form the methodology used in this article. Unfairness in contracts is generally related to the problem of inequality in bargaining power, underscored by deeply entrenched social and economic inequalities that are a consequence of apartheid and patriarchy. The transformative nature of the constitution demands the inclusion of African legal ideas and values in the legal order. There is a need for the harmonisation of Western ideals, which are based on the classical model of the law of contract, with relevant African principles. In order to attain a transformative legal order that promotes societal transformation and enhances the lives of everyone, courts cannot continue to frown upon African values. Ubuntu has the potential to steer the law of contract in a more equitable direction. The substantive rules of contract law undoubtedly need to be infused with the notion of ubuntu. The reconciliation of Western and African values is at the heart of legal transformation.

Keywords: fairness, sanctity of contract, contractual justice, transformative constitutionalism

Procedia PDF Downloads 235
3399 Numerical Simulation and Analysis of Axially Restrained Steel Cellular Beams in Fire

Authors: Asal Pournaghshband

Abstract:

This paper presents the development of a finite element model to study the large-deflection behavior of restrained stainless steel cellular beams at elevated temperature. Cellular beams are widely used for the efficient utilization of raw materials to facilitate long spans with faster construction, resulting in sustainable design solutions that can enhance the performance and merit of any construction project. However, their load-carrying capacity is less than that of equivalent beams without openings, due to the shear-moment interaction that develops at the openings. In structural frames, owing to element continuity, such beams are restrained by their adjoining members, which has a substantial effect on their behavior in fire. Stainless steel has also become an integral part of the built environment due to its excellent corrosion resistance, whole life-cycle costs, and sustainability. This paper reports numerical investigations into the effect of structural continuity on the thermo-mechanical performance of restrained steel beams with circular and elongated circular web openings in fire. The numerical model is first validated using existing numerical results from the literature and then employed to perform a parametric study. Structural continuity is evaluated through the application of different levels of axial restraint on the response of carbon steel and stainless steel cellular beams in fire. The transit temperature of the stainless steel cellular beam is shown to be less affected by the level of axial stiffness than that of the equivalent carbon steel cellular beam. Overall, it was established that whereas stainless steel cellular beams show stages of behavior similar to those of carbon steel cellular beams in fire, they are capable of withstanding higher temperatures prior to the onset of catenary action at large deflection, despite the higher thermal expansion of stainless steel.

Keywords: axial restraint, catenary action, cellular beam, fire, numerical modeling, stainless steel, transit temperature

Procedia PDF Downloads 51
3398 Decentralized Peak-Shaving Strategies for Integrated Domestic Batteries

Authors: Corentin Jankowiak, Aggelos Zacharopoulos, Caterina Brandoni

Abstract:

In a context of increasing stress put on the electricity network by the decarbonization of many sectors, energy storage is likely to be the key mitigating element, acting as a buffer between production and demand. In particular, the potential of storage is highest when it is connected closer to the loads. Yet, low voltage storage struggles to penetrate the market at a large scale due to the novelty and complexity of the solution, and the competitive advantage of fossil fuel-based technologies regarding regulations. Strong and reliable numerical simulations are required to show the benefits of storage located near loads and promote its development. The scope of the present study excludes aggregated control of storage: it is assumed that the storage units operate independently of one another without exchanging information, as is currently mostly the case. A computationally light battery model is presented in detail and validated by direct comparison with a domestic battery operating in real conditions. This model is then used to develop Peak-Shaving (PS) control strategies, as this is the decentralized service from which beneficial impacts are most likely to emerge. The aggregation of flatter, peak-shaved consumption profiles is likely to lead to flatter and arbitraged profiles at higher voltage layers. Furthermore, voltage fluctuations can be expected to decrease if spikes of individual consumption are reduced. The crucial part of achieving PS lies in the charging pattern: peaks depend on the switching on and off of appliances in the dwelling by the occupants and are therefore impossible to predict accurately. A performant PS strategy must, therefore, include a smart charge recovery algorithm that can ensure enough energy is present in the battery in case it is needed, without generating new peaks by charging the unit. Three categories of PS algorithms are introduced in detail. The first uses a constant threshold or power rate for charge recovery; the second uses the State of Charge (SOC) as a decision variable; and the third uses a load forecast, the impact of whose accuracy is discussed, to generate PS. A set of performance metrics was defined in order to quantitatively evaluate their operation regarding peak reduction, total energy consumption, and self-consumption of domestic photovoltaic generation. The algorithms were tested on load profiles with a 1-minute granularity over a 1-year period, and their performance was assessed with respect to these metrics. The results show that a constant charging threshold or power rate is far from optimal: a single fixed value is unlikely to fit the variability of a residential profile. As could be expected, forecast-based algorithms show the highest performance; however, they depend on the accuracy of the forecast. On the other hand, SOC-based algorithms also present satisfying performance, making them a strong alternative when a reliable forecast is not available.
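
A minimal sketch of an SOC-based peak-shaving strategy of the kind described above is given below: the battery discharges whenever the load exceeds a threshold and recovers charge at a rate that shrinks as the SOC approaches a target, so that charging does not create new peaks. The threshold, battery parameters and synthetic load profile are illustrative assumptions, not the study's validated battery model or measured profiles.

```python
import numpy as np

def soc_peak_shaving(load_kw, dt_h=1/60, capacity_kwh=10.0, p_max_kw=3.0,
                     peak_threshold_kw=2.5, soc_target=0.6):
    """Greedy SOC-based peak-shaving sketch.
    Discharge whenever the dwelling load exceeds the threshold; recover charge
    at a rate that shrinks as the SOC approaches its target, so that charging
    itself does not create a new peak."""
    soc = 0.5 * capacity_kwh                      # start half full (kWh stored)
    net = np.empty_like(load_kw)
    for i, p_load in enumerate(load_kw):
        if p_load > peak_threshold_kw:
            # Discharge to clip the peak, limited by power rating and stored energy
            p_bat = -min(p_load - peak_threshold_kw, p_max_kw, soc / dt_h)
        elif soc < soc_target * capacity_kwh:
            # SOC-dependent charge recovery: headroom below the threshold scaled by
            # how far the SOC still is from its target
            headroom = peak_threshold_kw - p_load
            deficit = 1.0 - soc / (soc_target * capacity_kwh)
            p_bat = min(max(headroom, 0.0) * deficit, p_max_kw,
                        (capacity_kwh - soc) / dt_h)
        else:
            p_bat = 0.0
        soc += p_bat * dt_h
        net[i] = p_load + p_bat                    # grid import seen at the meter
    return net

# Synthetic 1-minute residential profile for one day (illustrative only)
rng = np.random.default_rng(3)
t = np.arange(24 * 60) / 60.0
base = 0.4 + 0.3 * np.sin((t - 7) / 24 * 2 * np.pi) ** 2
spikes = (rng.random(t.size) < 0.01) * rng.uniform(1.5, 3.0, t.size)  # appliance switching
load = base + spikes

net = soc_peak_shaving(load)
print(f"original peak: {load.max():.2f} kW, shaved peak: {net.max():.2f} kW")
```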

Keywords: decentralised control, domestic integrated batteries, electricity network performance, peak-shaving algorithm

Procedia PDF Downloads 103
3397 From Government-Led to Collective Action: A Case Study of the Transformation of Urban Renewal Governance in Nanjing, China

Authors: Hanjun Hu, Jinxiang Zhang

Abstract:

With the decline of "growthism", China's urbanization process has shifted from the stage of spatial expansion to the stage of optimization of built-up spaces, and urban renewal has gradually become a new wave of China's urban movement in recent years. The ongoing urban renewal movement in China not only needs to generate new momentum for urban development but also to solve the backlog of social problems caused by rapid urbanization, which provides an opportunity for the transformation of China's urban governance model. Unlike previous approaches that focused on physical space and functional renewal, such as urban reconstruction, redevelopment, and reuse, the key challenge of urban renewal in the post-growth era lies in coordinating the complex interest relationships between multiple stakeholders. The traditional theoretical frameworks that focus on the structural relations between social groups are insufficient to explain the behavioral logic and mutual cooperation mechanisms of the various groups and individuals in current urban renewal practices. Therefore, based on long-term tracking of the urban renewal practices in the Old City of Nanjing (OCN), this paper introduces "collective action" theory to analyze in depth the changes in the urban renewal governance model in OCN and tries to summarize, at a micro scale, the governance strategies that have promoted the formation of collective action in recent practices. The study found that the practice in OCN experienced three different stages: "government-led", "growth coalition" and "asymmetric game". With the transformation of government governance concepts, the rise of residents' consciousness of their rights, and the wider participation of social organizations in recent years, urban renewal in OCN is entering a new stage of "collective renewal action". Through the establishment of a renewal organization model, incentive policies, and a dynamic negotiation mechanism, urban renewal in OCN not only achieves a relative balance between individual interests and collective interests but also makes the willingness of residents the dominant factor in formulating urban renewal policies. However, the evidence for "collective renewal action" in OCN is still mainly based on typical cases. Although the government no longer plays the dominant role, a large number of resident-led collective actions have not yet emerged, which puts forward new research needs for sustainable governance policy innovation in this area.

Keywords: urban renewal, collective action theory, governance, cooperation mechanism, China

Procedia PDF Downloads 33
3396 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units

Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani

Abstract:

There are many computationally demanding applications in science and engineering which need efficient algorithms implemented on high performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention as compared to traditional CPU-based hardware and have opened up new improvement venues in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models via both CPU and GPU are discussed and compared. Test problems include an Eulerian model of a two-dimensional incompressible laminar flow case and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a C++ serial code is run on a single core of an Intel quad-core CPU. Up to two orders of magnitude speed-up is observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited for parallel computations on the GPU, although the Eulerian formulation shows significant speed-up too.

Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation

Procedia PDF Downloads 390
3395 Biographical Learning and Its Impact on the Democratization Processes of Post War Societies

Authors: Rudolf Egger

Abstract:

This article presents some results of an ongoing project in Kosova. The project deals with the meaning of social transformation processes in the life courses of the people of Kosova. One goal is to create an oral history archive in this country. Over the last seven years, we have done interpretative work (using narrative interviews) concerning the experiences and meanings of social changes from the perspective of the life course. We want to reconstruct the individual possibilities of creating one's life in new social structures. After the terrible massacres of ethnic-territorially defined nationalism in the former Yugoslavia, the main focus is to find out something about the many small daily steps which must be taken to build up a kind of "normality" in this country. These steps can be reconstructed very well through narrations, through life stories, because personal experiences are naturally linked with social orders. Each individual story is connected with further stories, in which the collective history is negotiated and reflected. The view of biographical narration opens the possibility of analyzing the concreteness of the "individual case" within the complexity of collective history. Life stories thereby have a kind of transitional character, which is why they can be used for the reconstruction of periods of political transformation. For example, in the individual stories we can clearly find the national or mythological character of the Albanian people in Kosova. The narrations shown can also be read as narrative lines relating to the (re-)interpretation of the past, in which lived life is fixed into history in the so-called collective memory of Kosova.

Keywords: biographical learning, adult education, social change, post war societies

Procedia PDF Downloads 402
3394 Development of Digital Twin Concept to Detect Abnormal Changes in Structural Behaviour

Authors: Shady Adib, Vladimir Vinogradov, Peter Gosling

Abstract:

Digital Twin (DT) technology is a new technology that appeared in the early 21st century. A DT is defined as the digital representation of a living or non-living physical asset. By connecting the physical and virtual assets, data are transmitted smoothly, allowing the virtual asset to fully represent the physical asset. Although there are many studies on the DT concept, there is still limited information about the ability of DT models to monitor and detect unexpected changes in structural behaviour in real time. This is due to the large computational effort required for the analysis and the excessively large amount of data transferred from sensors. This paper aims to develop the DT concept to be able to detect abnormal changes in structural behaviour in real time using advanced modelling techniques, deep learning algorithms, and data acquisition systems, taking model uncertainties into consideration. Finite element (FE) models were first developed offline to be used with a reduced basis (RB) model order reduction technique for the construction of a low-dimensional space to speed up the analysis during the online stage. The RB model was validated against experimental test results for the establishment of a DT model of a two-dimensional truss. The established DT model and deep learning algorithms were used to identify the location of damage once it appeared during the online stage. Finally, the RB model was used again to identify the damage severity. It was found that using the RB model, constructed offline, speeds up the FE analysis during the online stage. The constructed RB model showed higher accuracy for predicting the damage severity, while deep learning algorithms were found to be useful for estimating the location of damage with small severity.
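
As a generic illustration of the offline/online reduced-basis idea mentioned above (not the authors' truss model, snapshot data, or deep learning components), the sketch below builds a low-dimensional basis from full-order solution snapshots with an SVD (proper orthogonal decomposition) and uses a Galerkin projection to solve a small reduced system online.

```python
import numpy as np

# --- Offline stage: full-order "FE-like" system K(mu) u = f, sampled over parameters mu.
n = 400                                            # full-order dimension (illustrative)
K0 = np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
K1 = np.diag(np.linspace(0.5, 1.5, n))             # parameter-dependent stiffness contribution
f = np.ones(n)

def solve_full(mu):
    return np.linalg.solve(K0 + mu * K1, f)

snapshots = np.column_stack([solve_full(mu) for mu in np.linspace(0.1, 2.0, 20)])

# Proper Orthogonal Decomposition: keep the leading left singular vectors as the reduced basis
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.9999) + 1
V = U[:, :r]                                       # reduced basis, n x r with r << n
print("reduced dimension:", r)

# --- Online stage: Galerkin projection gives a small r x r system for a new parameter.
def solve_reduced(mu):
    Kr = V.T @ (K0 + mu * K1) @ V
    fr = V.T @ f
    return V @ np.linalg.solve(Kr, fr)             # lift back to the full space

mu_new = 0.77
err = np.linalg.norm(solve_full(mu_new) - solve_reduced(mu_new)) / np.linalg.norm(solve_full(mu_new))
print(f"relative error of RB approximation: {err:.2e}")
```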

Keywords: data acquisition system, deep learning, digital twin, model uncertainties, reduced basis, reduced order model

Procedia PDF Downloads 81
3393 Application of Machine Learning Models to Predict Couchsurfers on Free Homestay Platform Couchsurfing

Authors: Yuanxiang Miao

Abstract:

Couchsurfing is a free homestay and social networking service accessible via a website and mobile app. Couchsurfers can directly request free accommodation from others and receive offers from each other. However, it is typically difficult for people to decide whether to accept or decline a request when they receive it, because they do not know each other at all. People expect to meet Couchsurfers who are kind, generous, and interesting, while it is unavoidable to sometimes meet someone unfriendly. This paper utilized classification algorithms from machine learning to help people distinguish good Couchsurfers from not-so-good ones on the Couchsurfing website. By knowing prior information, such as a Couchsurfer's profile, the latest references, and other factors, it becomes possible to recognize what kind of Couchsurfer one is dealing with, which in turn helps people decide whether to host that Couchsurfer or not. The value of this research lies in a case study in Kyoto, Japan, where the author has hosted 54 Couchsurfers; the author collected relevant data from these 54 Couchsurfers and finally built a model based on classification algorithms for predicting Couchsurfers. Lastly, the author offers some feasible suggestions for future research.
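
A minimal sketch of the kind of profile-based classifier described above is shown next. The feature names, synthetic data, and the choice of logistic regression are illustrative assumptions; they do not reproduce the author's 54-host dataset or modelling choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Hypothetical profile features for each requesting Couchsurfer:
# [number of positive references, profile completeness (0-1), years on platform,
#  length of the request message in words]
n = 200
X = np.column_stack([
    rng.poisson(8, n),
    rng.uniform(0, 1, n),
    rng.uniform(0, 10, n),
    rng.integers(5, 300, n),
])
# Synthetic "good guest" label generated from an arbitrary rule, for illustration only
y = ((X[:, 0] > 5) & (X[:, 1] > 0.4)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("hold-out accuracy:", round(clf.score(X_test, y_test), 3))

# Decide whether to host a new requester described by their profile features
new_request = np.array([[12, 0.9, 3.5, 120]])
print("predicted class (1 = host):", int(clf.predict(new_request)[0]))
```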

Keywords: Couchsurfing, Couchsurfers prediction, classification algorithm, hospitality tourism platform, hospitality sciences, machine learning

Procedia PDF Downloads 107
3392 Alternative Computational Arrangements on g-Group (g > 2) Profile Analysis

Authors: Emmanuel U. Ohaegbulem, Felix N. Nwobi

Abstract:

Alternative and simple computational arrangements for carrying out multivariate profile analysis when more than two groups (populations) are involved are presented. These arrangements have been demonstrated not only to yield equivalent results for the test statistics (the Wilks lambdas), but also to require less computational effort than other arrangements so far presented in the literature, in addition to being quite simple and easy to apply.
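
As one common computational arrangement (not necessarily the authors' proposed one), the sketch below computes Wilks' lambda for the parallelism hypothesis in a g-group profile analysis by transforming the repeated measures into successive differences and running a one-way MANOVA on them.

```python
import numpy as np

def wilks_lambda_parallelism(groups):
    """Wilks' lambda for the parallelism hypothesis in a g-group profile analysis.
    `groups` is a list of (n_i x p) arrays of repeated measures, one per group.
    The p measures are turned into p-1 successive differences and a one-way
    MANOVA is run on them: lambda = det(E) / det(E + H)."""
    p = groups[0].shape[1]
    C = np.eye(p - 1, p) - np.eye(p - 1, p, k=1)        # successive-difference contrasts
    D = [y @ C.T for y in groups]                        # transformed scores per group

    grand_mean = np.vstack(D).mean(axis=0)
    E = np.zeros((p - 1, p - 1))                         # within-group (error) SSCP
    H = np.zeros((p - 1, p - 1))                         # between-group (hypothesis) SSCP
    for d in D:
        dev = d - d.mean(axis=0)
        E += dev.T @ dev
        diff = (d.mean(axis=0) - grand_mean).reshape(-1, 1)
        H += d.shape[0] * (diff @ diff.T)
    return np.linalg.det(E) / np.linalg.det(E + H)

# Illustrative data: 3 groups, 4 repeated measures, roughly parallel profiles
rng = np.random.default_rng(1)
base_profile = np.array([10.0, 12.0, 11.0, 13.0])
groups = [base_profile + shift + rng.normal(0, 1, (25, 4)) for shift in (0.0, 1.5, 3.0)]
print("Wilks' lambda (parallelism):", round(wilks_lambda_parallelism(groups), 4))
```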

Keywords: coincident profiles, g-group profile analysis, level profiles, parallel profiles, repeated measures MANOVA

Procedia PDF Downloads 433
3391 Relay Node Placement for Connectivity Restoration in Wireless Sensor Networks Using Genetic Algorithms

Authors: Hanieh Tarbiat Khosrowshahi, Mojtaba Shakeri

Abstract:

Wireless Sensor Networks (WSNs) consist of a set of sensor nodes with limited capabilities. WSNs may suffer from multiple node failures when they are exposed to harsh environments such as military zones or disaster locations, and lose connectivity by getting partitioned into disjoint segments. Relay nodes (RNs) are then introduced to restore connectivity. They cost more than sensors, as they benefit from mobility, more power and a larger transmission range, which enforces using a minimum number of them. This paper addresses the problem of RN placement in a network with multiple disjoint segments by developing a genetic algorithm (GA). The problem is reformulated as the Steiner tree problem (which is known to be NP-hard), with the aim of finding the minimum number of Steiner points where RNs are to be placed to restore connectivity. An upper bound on the number of RNs is first computed to set the length of the initial chromosomes. The GA then iteratively reduces the number of RNs and determines their locations at the same time. Experimental results indicate that the proposed GA is capable of establishing network connectivity using a reasonable number of RNs compared to the best existing work.
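
The sketch below illustrates one simple way to obtain an upper bound on the relay-node count of the kind mentioned above: connect one representative point per disjoint segment with a Euclidean minimum spanning tree and place relays along its edges so that no hop exceeds the communication radius. The coordinates and radius are illustrative assumptions, and this is not the authors' method; their GA then searches for Steiner-point placements that need fewer relays.

```python
import math

def mst_edges(points):
    """Prim's algorithm on the complete Euclidean graph of segment representatives."""
    n = len(points)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        i, j = min(
            ((a, b) for a in in_tree for b in range(n) if b not in in_tree),
            key=lambda e: math.dist(points[e[0]], points[e[1]]),
        )
        in_tree.add(j)
        edges.append((i, j))
    return edges

def relay_upper_bound(segment_reps, radius):
    """Relays needed along each MST edge so that no hop exceeds the radio radius."""
    total = 0
    for i, j in mst_edges(segment_reps):
        d = math.dist(segment_reps[i], segment_reps[j])
        total += max(0, math.ceil(d / radius) - 1)
    return total

# Representative coordinates of four disjoint segments (illustrative), radio radius 25 m
segments = [(0.0, 0.0), (90.0, 10.0), (40.0, 80.0), (120.0, 70.0)]
print("upper bound on relay nodes:", relay_upper_bound(segments, radius=25.0))
```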

Keywords: connectivity restoration, genetic algorithms, multiple-node failure, relay nodes, wireless sensor networks

Procedia PDF Downloads 223
3390 Research Action Fields at the Nexus of Digital Transformation and Supply Chain Management: Findings from Practitioner Focus Group Workshops

Authors: Brandtner Patrick, Staberhofer Franz

Abstract:

Logistics and Supply Chain Management are of crucial importance for organisational success. In the era of digitalization, several implications and improvement potentials for these domains arise, which at the same time could lead to decreased competitiveness and could endanger long-term company success if ignored or neglected. However, empirical research on the issue of digitalization and the benefits attributed to it by practitioners is scarce and mainly focused on single technologies or separate, isolated supply chain blocks such as distribution logistics or procurement only. The current paper applies a holistic focus group approach to elaborate practitioner use cases at the nexus of the concepts of Supply Chain Management (SCM) and digitalization. In the course of three focus group workshops with over 45 participants from more than 20 organisations, a comprehensive set of benefit entitlements and areas for improvement in terms of applying digitalization to SCM was developed. The main results of the paper indicate the relevance of digitalization being realized in practice. In the form of seventeen concrete research action fields, the benefit entitlements are aggregated and transformed into potential starting points for future research projects in this area. The main contribution of this paper is an empirically grounded basis for future research projects and an overview of actual research action fields from the practitioners' point of view.

Keywords: digital supply chain, digital transformation, supply chain management, value networks

Procedia PDF Downloads 150
3389 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance behavior different from that of regular applications. Parallelizing graph algorithms is therefore a hard and challenging task. Initial evidence shows that standard computer architectures do not perform very well on graph algorithms, yet little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of several example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed the factors that affect performance in order to identify possible changes that would improve it. Results are discussed in relation to which factors contribute to performance degradation.
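As a concrete illustration of why such workloads are irregular, the following minimal Python sketch mimics the level-synchronous BFS kernel that Graph500 times; it is not the benchmark's reference code, and the simple random generator stands in for Graph500's Kronecker graph generator.

```python
# A small illustrative sketch (not Graph500's reference code) of the
# level-synchronous BFS kernel the benchmark times. The traversal is driven
# entirely by the frontier's neighbour lists, so memory accesses are
# irregular and hard for caches and prefetchers to predict.
from collections import defaultdict
import random

def random_graph(num_vertices=1 << 12, avg_degree=16, seed=1):
    """Build a random adjacency list; Graph500 itself uses a Kronecker generator."""
    rng = random.Random(seed)
    adj = defaultdict(list)
    for _ in range(num_vertices * avg_degree // 2):
        u, v = rng.randrange(num_vertices), rng.randrange(num_vertices)
        adj[u].append(v)
        adj[v].append(u)
    return adj

def bfs_levels(adj, root):
    """Level-synchronous BFS; returns the parent map, as Graph500 validates."""
    parent = {root: root}
    frontier = [root]
    while frontier:
        next_frontier = []
        for u in frontier:                 # scattered reads over adjacency lists
            for v in adj[u]:
                if v not in parent:        # fine-grained, data-dependent updates
                    parent[v] = u
                    next_frontier.append(v)
        frontier = next_frontier
    return parent

adj = random_graph()
parent = bfs_levels(adj, root=0)
print("vertices reached:", len(parent))
```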

Keywords: graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization

Procedia PDF Downloads 128
3388 Probability Modeling and Genetic Algorithms in Small Wind Turbine Design Optimization: Mentored Interdisciplinary Undergraduate Research at LaGuardia Community College

Authors: Marina Nechayeva, Malgorzata Marciniak, Vladimir Przhebelskiy, A. Dragutan, S. Lamichhane, S. Oikawa

Abstract:

This presentation is a progress report on a faculty-student research collaboration at CUNY LaGuardia Community College (LaGCC) aimed at designing a small horizontal axis wind turbine optimized for the wind patterns on the roof of our campus. Our project combines statistical and engineering research. Our wind modeling protocol is based upon a recent wind study by a faculty-student research group at MIT, and some of our blade design methods are adapted from a senior engineering project at CUNY City College. Our use of genetic algorithms has been inspired by David Wood's work on small wind turbine design. We combine these diverse approaches in our interdisciplinary project in a way that has not been done before and improve upon certain techniques used by our predecessors. We employ several estimation methods to determine the best-fitting parametric probability distribution model for the local wind speed data, obtained by correlating short-term on-site measurements with a long-term time series at the nearby airport. The model serves as a foundation for engineering research that focuses on adapting and implementing genetic algorithms (GAs) for the engineering optimization of the wind turbine design using Blade Element Momentum Theory. GAs are used to create new airfoils with desirable aerodynamic specifications. Small-scale models of the best-performing designs are 3D printed and tested in the wind tunnel to verify the accuracy of relevant calculations. Genetic algorithms are applied to selected airfoils to determine the blade design (radial chord and pitch distribution) that would optimize the turbine's power coefficient profile. Our approach improves upon traditional blade design methods in that it lets us dispense with the assumptions necessary to simplify the system of Blade Element Momentum Theory equations, thus resulting in more accurate aerodynamic performance calculations. Furthermore, it enables us to design blades optimized for a whole range of wind speeds rather than a single value. Lastly, we improve upon known GA-based methods in that our algorithms are constructed to work with XFoil-generated airfoil data, which enables us to optimize blades using our own high-glide-ratio airfoil designs without having to rely upon available empirical data from existing airfoils, such as the NACA series. Beyond its immediate goal, this ongoing project serves as a training and selection platform for the CUNY Research Scholars Program (CRSP) through its annual Aerodynamics and Wind Energy Research Seminar (AWERS), an undergraduate summer research boot camp designed to introduce prospective researchers to the relevant theoretical background and methodology, get them up to speed with the current state of our research, and test their abilities and commitment to the program. Furthermore, several aspects of the research (e.g., writing code for 3D printing of airfoils) are adapted into classroom research activities to enhance Calculus sequence instruction at LaGCC.
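As an illustration of the statistical step only, the sketch below fits a Weibull distribution, a common parametric model for wind speed, to synthetic data using SciPy and checks the fit with a Kolmogorov-Smirnov test. It does not reproduce the team's actual estimation protocol or on-site measurements; the parameters and data are illustrative.

```python
# A minimal sketch of fitting a parametric wind speed model (here Weibull).
# The data below are synthetic stand-ins for rooftop measurements, and the
# estimation protocol is not necessarily the one used by the LaGuardia team.
from scipy import stats

# Synthetic "measured" wind speeds drawn from a known Weibull distribution
wind_speeds = stats.weibull_min.rvs(c=2.0, scale=6.0, size=1000, random_state=42)

# Maximum-likelihood fit with the location fixed at zero (wind speed >= 0)
shape, loc, scale = stats.weibull_min.fit(wind_speeds, floc=0)
print(f"fitted Weibull shape k = {shape:.2f}, scale c = {scale:.2f} m/s")

# Goodness of fit via the Kolmogorov-Smirnov test
stat, p_value = stats.kstest(wind_speeds, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
```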

Keywords: engineering design optimization, genetic algorithms, horizontal axis wind turbine, wind modeling

Procedia PDF Downloads 213
3387 Spatial Design Transformation of Mount Merapi's Dwellings Using Diachronic Approach

Authors: Catharina Dwi Astuti Depari, Gregorius Agung Setyonugroho

Abstract:

From the standpoint of human safety, living in disaster-prone areas is twofold: it is profoundly cataclysmic yet perceptibly contributive. This paradox can be identified in the Kalitengah Lor Sub-village community, who inhabit Mount Merapi's most hazardous area and are therefore most exposed to the cataclysmic impacts of eruptions. After the devastating incident in 2010, the National Government, through the Action Plan for Rehabilitation and Reconstruction and with immediate aid from humanitarian agencies, initiated a relocation program by establishing nearly 2,613 temporary shelters throughout the mountain's region. The problem arose as some of the most affected communities, including those in Kalitengah Lor Sub-village, persistently refused to relocate. The unpleasant experience of those living in temporary shelters, resulting from the program's failure to support long-term living, was assumed to have instigated this rejection. From the psychological standpoint, this phenomenon reflects the emotional bond between the affected communities and their former dwellings. Accordingly, the paper aims to reveal the factors influencing the emotional attachment of the Kalitengah Lor community to their former dwellings, including the dwellings' spatial design transformation before and after the 2010 eruption. The research adopted a five-point Likert-scale questionnaire comprising a range of responses from strongly agree to strongly disagree. The responses were then statistically analyzed, leading to a consensus that provides a basis for further interpretation of the locals' characteristics. Using a purposive unit sampling technique, 50 respondents from 217 local households were randomly selected. The questionnaire items were developed around the aspects of the place attachment concept: affection, cognition, behavior, and perception. Combined with the quantitative method, the research adopted a diachronic method aimed at analyzing the spatial design transformation of each dwelling in relation to the inhabitants' daily activities and personal preferences. The research found that access to natural resources such as sand mines, agricultural farms, and wood forests, social relationships, and the physical proximity of the house to personal assets such as the cattle shed are the dominant factors encouraging the locals' emotional attachment to their former dwellings. Consequently, each dwelling's spatial design has undergone changes: the current house is typically larger in dimension, and the bathroom has been replaced by a public toilet located outside the house's backyard. Other elements remain relatively unchanged: the cattle shed is still located in front of the house, the continuous visual relationship, particularly between the living room and family room, is maintained, and so is the main orientation of the house towards the local street.

Keywords: diachronic method, former dwellings, locals' characteristics, place attachment, spatial design transformation

Procedia PDF Downloads 150