Search results for: cluster model approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26813

21563 Design and Implementation of Security Middleware for Data Warehouse Signature Framework

Authors: Mayada Al Meghari

Abstract:

Recently, grid middleware has provided large-scale, integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using the middleware in our DWS framework is to achieve high performance through parallel computing. This middleware is developed on the Alchemi.Net framework to increase security among the network nodes through an authentication and group-key distribution model. This model secures the keys and prevents intermediary attacks on the middleware. This paper presents the flow process structures of the middleware design. In addition, the paper describes the security implementation of the DWS middleware, enhanced with the authentication and group-key distribution model. Finally, an analysis of other middleware approaches shows that the developed DWS framework middleware provides the most complete coverage of the security issues considered.
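
The group-key distribution idea described above can be illustrated with a minimal sketch: a coordinator generates a symmetric group key and hands it to each authenticated node encrypted under that node's RSA public key, so the key never travels in the clear. This is an illustrative assumption, not the DWS framework's actual protocol (which is built on Alchemi.Net); the node names and helper code are hypothetical.

```python
# Illustrative sketch of symmetric group-key distribution (not the DWS/Alchemi.Net code).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Each worker node owns an RSA key pair; the coordinator knows the public keys.
node_keys = {name: rsa.generate_private_key(public_exponent=65537, key_size=2048)
             for name in ("node_a", "node_b", "node_c")}

group_key = Fernet.generate_key()  # fresh symmetric group key for this session

# Coordinator wraps the group key for every authenticated node.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped = {name: key.public_key().encrypt(group_key, oaep)
           for name, key in node_keys.items()}

# A node unwraps its copy and can then exchange messages encrypted under the group key.
recovered = node_keys["node_a"].decrypt(wrapped["node_a"], oaep)
token = Fernet(group_key).encrypt(b"parallel task payload")
print(Fernet(recovered).decrypt(token))
```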

Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance

Procedia PDF Downloads 98
21562 Efficient Prediction of Surface Roughness Using Box Behnken Design

Authors: Ajay Kumar Sarathe, Abhinay Kumar

Abstract:

Production of the quality products required for specific engineering applications is an important issue. Surface roughness plays an important role in product quality, and appropriate machining parameters help eliminate wastage due to over-machining. To increase surface quality, the optimum machining parameter setting is crucial during the machining operation. The effect of the key machining parameters (spindle speed, feed rate, and depth of cut) on surface roughness has been evaluated. Experimental work was carried out using a high-speed steel tool and AISI 1018 as the workpiece material. In this study, a predictive model has been developed using Box-Behnken Design (BBD). An experimental investigation was carried out using a BBD for three factors, and it was observed that the Ra values predicted by the model are close to the experimental values, with a marginal error of 2.8648%. The developed model establishes a correlation between the selected key machining parameters that influence the surface roughness of AISI 1018.
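
As a rough illustration of the response-surface step described above, the sketch below fits a full quadratic model of Ra to the three machining factors over a three-factor Box-Behnken design. The design matrix (in coded levels) and the Ra values are made-up placeholders, not the paper's data; the point is only to show how a BBD-based predictive model can be fitted and queried.

```python
# Hedged sketch: quadratic response-surface fit over a 3-factor Box-Behnken design.
# Coded levels (-1, 0, +1) and Ra responses below are illustrative, not the study's data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# 12 edge-midpoint runs of a 3-factor BBD plus 3 centre points (speed, feed, depth of cut).
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)
ra = np.array([2.9, 2.1, 3.4, 2.6, 2.7, 2.0, 3.1, 2.4,
               2.5, 3.0, 2.8, 3.3, 2.6, 2.7, 2.6])  # placeholder Ra values (µm)

quad = PolynomialFeatures(degree=2, include_bias=False)   # linear, square and interaction terms
model = LinearRegression().fit(quad.fit_transform(X), ra)

pred = model.predict(quad.transform([[0.5, -0.5, 0.0]]))  # Ra at an intermediate setting
error_pct = np.mean(np.abs(model.predict(quad.transform(X)) - ra) / ra) * 100
print(f"predicted Ra: {pred[0]:.3f} µm, mean fit error: {error_pct:.2f} %")
```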

Keywords: ANOVA, BBD, optimisation, response surface methodology

Procedia PDF Downloads 143
21561 Quantification and Thermal Behavior of Rice Bran Oil, Sunflower Oil and Their Model Blends

Authors: Harish Kumar Sharma, Garima Sengar

Abstract:

Rice bran oil is considered nutritionally superior to other fats/oils. Therefore, model blends prepared from pure rice bran oil (RBO) and sunflower oil (SFO) were explored for changes in different physicochemical parameters. A repeated deep-fat frying process was carried out using dried potato in order to study the thermal behaviour of pure rice bran oil, sunflower oil and their model blends. Pure rice bran oil and sunflower oil showed good thermal stability during the repeated deep-fat frying cycles. Among the blends, the model blend constituting 60% RBO + 40% SFO showed better suitability during repeated deep-fat frying than the remaining blended oils. The quantification of pure rice bran oil in the blended oils, physically refined rice bran oil (PRBO): SnF (sunflower oil), was carried out by different methods. The study revealed that regression equations based on the oryzanol content, palmitic acid composition and iodine value can be used for the quantification. Rice bran oil can easily be quantified in the blended oils based on the oryzanol content by HPLC, even at the 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils, whereas the method based on ultrasonic velocity, acoustic impedance and relative association showed initial promise for quantification.
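
The oryzanol-based quantification mentioned above amounts to a simple calibration regression: the oryzanol concentration measured by HPLC is fitted against the known RBO fraction of prepared blends, and the fitted line is inverted to estimate the RBO content of an unknown blend. The sketch below uses invented calibration numbers purely for illustration and does not reproduce the study's actual regression coefficients.

```python
# Illustrative calibration: % rice bran oil in a blend from HPLC oryzanol content.
# Calibration points are invented placeholders, not the paper's measurements.
import numpy as np

rbo_percent = np.array([0, 1, 5, 10, 20, 40, 60, 80, 100], dtype=float)
oryzanol_mg_per_100g = np.array([0, 16, 78, 155, 310, 640, 950, 1280, 1600], dtype=float)

slope, intercept = np.polyfit(rbo_percent, oryzanol_mg_per_100g, deg=1)

def estimate_rbo(oryzanol):
    """Invert the calibration line to estimate % RBO in an unknown blend."""
    return (oryzanol - intercept) / slope

print(f"oryzanol = {slope:.1f} * %RBO + {intercept:.1f}")
print(f"blend with 500 mg/100 g oryzanol is about {estimate_rbo(500):.1f} % RBO")
```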

Keywords: rice bran oil, sunflower oil, frying, quantification

Procedia PDF Downloads 295
21560 The Relevance of Corporate Governance Disclosure in Spanish Public Universities

Authors: Yolanda Ramirez, Angel Tejada, Agustin Baidez

Abstract:

There is currently a growing interest in the improvement of university governance and in the disclosure of information on corporate governance processes as an essential part of the transparency and accountability of universities. This paper aims to determine the importance given by Spanish university stakeholders to the disclosure of information about the structure and mechanisms of corporate governance. To meet this objective, we propose a model for disclosing information on the main aspects of university governance in Spanish universities. This model was validated using a questionnaire sent to members of the Social Councils of public universities in Spain. Our results show that Spanish university stakeholders attach great importance to the disclosure of specific information on aspects of corporate governance, which would result in improved transparency and accountability. According to the results of this study, it may be concluded that university stakeholders feel that it is relevant to publish information on corporate governance in the university accounting information model.

Keywords: corporate governance, transparency, accountability, universities, Spain

Procedia PDF Downloads 295
21559 Wind Power Forecast Error Simulation Model

Authors: Josip Vasilj, Petar Sarajcev, Damir Jakus

Abstract:

One of the major difficulties introduced by wind power penetration is the inherent uncertainty in production originating from uncertain wind conditions. This uncertainty impacts many different aspects of power system operation, especially the balancing power requirements. For this reason, in power system development planning, it is necessary to evaluate the potential uncertainty in future wind power generation. For this purpose, simulation models are required which reproduce the performance of wind power forecasts. This paper presents wind power forecast error simulation models based on stochastic process simulation. The proposed models capture the most important statistical parameters recognized in wind power forecast error time series. Furthermore, two distinct models are presented based on data availability. The first model uses wind speed measurements at potential or existing wind power plant locations, while the second model uses a statistical distribution of wind speeds.
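
A common way to realise the kind of stochastic forecast-error simulation described above is a Monte Carlo sweep over an autocorrelated (AR(1)) error process whose standard deviation and lag-one correlation are matched to observed forecast-error statistics. The sketch below is a generic illustration under that assumption; it does not reproduce the paper's two models, and all parameter values are placeholders.

```python
# Hedged sketch: Monte Carlo simulation of wind power forecast error as an AR(1) process.
# sigma and rho would be estimated from observed forecast-error series (placeholders here).
import numpy as np

rng = np.random.default_rng(42)
n_hours, n_scenarios = 24 * 7, 1000
sigma = 0.08        # stationary std of forecast error (p.u. of installed capacity)
rho = 0.9           # lag-one autocorrelation of the error process

innov_std = sigma * np.sqrt(1.0 - rho**2)   # keeps the stationary std equal to sigma
errors = np.zeros((n_scenarios, n_hours))
for t in range(1, n_hours):
    errors[:, t] = rho * errors[:, t - 1] + rng.normal(0.0, innov_std, n_scenarios)

forecast = 0.5 + 0.3 * np.sin(np.linspace(0, 6 * np.pi, n_hours))  # placeholder forecast (p.u.)
realised = np.clip(forecast + errors, 0.0, 1.0)                    # simulated "actual" output

# Balancing-reserve style statistic: 95th percentile of absolute hourly error.
print(f"95th percentile |error|: {np.percentile(np.abs(realised - forecast), 95):.3f} p.u.")
```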

Keywords: wind power, uncertainty, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 464
21558 3D Building Model Utilizing Airborne LiDAR Dataset and Terrestrial Photographic Images

Authors: J. Jasmee, I. Roslina, A. Mohammed Yaziz & A.H Juazer Rizal

Abstract:

The need for an effective building information collection method is vital to support a diversity of land development activities. At present, advances in remote sensing such as airborne LiDAR (Light Detection and Ranging) provide an established technology for building information collection, capturing the location and elevation of the reflected laser points used in the construction of 3D building models. In this study, the use of LiDAR datasets and terrestrial photographic images of buildings for the construction of 3D building models is explored. It is found that the quantitative accuracy of the constructed 3D building model in the horizontal and vertical components was ±0.31 m (RMSEx,y) and ±0.145 m (RMSEz), respectively. The accuracies were computed based on sixty-nine (69) horizontal and twenty (20) vertical surveyed points. As for the qualitative assessment, it is shown that the appearance of the 3D building model is adequate to support the requirements of LOD3 presentation based on the OGC (Open Geospatial Consortium) CityGML standard.
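
The horizontal and vertical accuracy figures quoted above are plain root-mean-square errors over the surveyed check points; a minimal sketch of that computation is given below. The coordinate arrays are placeholders standing in for the 69 horizontal and 20 vertical surveyed points.

```python
# Minimal sketch of the RMSE(x,y) and RMSE(z) accuracy check (placeholder coordinates, metres).
import numpy as np

# Surveyed (reference) vs. model-derived coordinates for a few check points.
xy_survey = np.array([[100.00, 200.00], [150.25, 180.40], [130.10, 210.55]])
xy_model  = np.array([[100.28, 200.15], [150.02, 180.71], [129.83, 210.30]])
z_survey  = np.array([12.00, 15.30, 9.85])
z_model   = np.array([12.10, 15.18, 9.98])

rmse_xy = np.sqrt(np.mean(np.sum((xy_model - xy_survey) ** 2, axis=1)))  # planimetric RMSE
rmse_z = np.sqrt(np.mean((z_model - z_survey) ** 2))                     # vertical RMSE
print(f"RMSE(x,y) = ±{rmse_xy:.3f} m, RMSE(z) = ±{rmse_z:.3f} m")
```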

Keywords: LiDAR datasets, DSM, DTM, 3D building models

Procedia PDF Downloads 307
21557 The Impact of a Model's Skin Tone and Ethnic Identification on Consumer Decision Making

Authors: Shanika Y. Koreshi

Abstract:

Sri Lanka has housed the lingerie product development and manufacturing subsidiaries of renowned brands such as La Senza, Marks & Spencer, H&M, Etam, Lane Bryant, and George. Over the last few years, they have produced local brands such as Amante to cater to local and regional customers. Past research has identified factors such as quality, price, and design as vital when marketing lingerie to consumers. However, there has been minimal research that looks into the ethnically targeted market and skin colour within the Asian population. Therefore, the main aim of the research was to identify whether consumer preference for lingerie is influenced by the skin tone of the model wearing it. The secondary aim was to investigate whether consumer preference for lingerie is influenced by the consumer's ethnic identification with the skin tone of the model. An experimental design was used to explore these aims. The participants were 66 females residing in the Western Province of Sri Lanka, recruited via convenience sampling. Six computerized images of a real model were used in the study, and her skin tone was digitally manipulated to express three different skin tones (light, tan and dark). Consumer preferences were measured through a rank-order scale constructed via a focus group discussion, and ethnic identity was measured by the Multigroup Ethnic Identity Measure-Revised. The Wilcoxon signed-rank test, Friedman test, and chi-square test of independence were carried out using SPSS version 20. The results indicated that the majority of consumers ethnically identified with and preferred the tan skin tone over the light and dark skin tones. The findings support the existing literature stating that consumers prefer models with a medium skin tone over those with a lighter skin tone. The preference for a tan skin tone in a model is consistent with the ethnic identification of the Sri Lankan sample. The study implies that lingerie brands should consider the model's skin tone when marketing the brand to different ethnic backgrounds.
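
The ranking data described above lend themselves directly to the nonparametric tests named in the abstract. The sketch below shows how such an analysis might look in SciPy rather than SPSS (which the study actually used); the preference ranks and the contingency table are invented placeholders for a handful of respondents.

```python
# Illustrative re-run of the abstract's tests in SciPy (the study used SPSS v20).
# Preference ranks (1 = most preferred) for light / tan / dark skin tones are placeholders.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon, chi2_contingency

light = np.array([3, 2, 3, 3, 2, 3, 3, 2, 3, 3])
tan   = np.array([1, 1, 1, 1, 1, 1, 2, 1, 1, 1])
dark  = np.array([2, 3, 2, 2, 3, 2, 1, 3, 2, 2])

# Friedman test: do the three skin tones receive different preference ranks overall?
print(friedmanchisquare(light, tan, dark))

# Wilcoxon signed-rank test: pairwise comparison of tan vs. light ranks.
print(wilcoxon(tan, light))

# Chi-square test of independence: ethnic identification (low/high) vs. preferred tone.
contingency = np.array([[4, 10, 6],    # low ethnic identification
                        [3, 30, 13]])  # high ethnic identification
chi2, p, dof, _ = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```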

Keywords: consumer preference, ethnic identification, lingerie, skin tone

Procedia PDF Downloads 243
21556 Projection of Solar Radiation for the Extreme South of Brazil

Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Rafael Haag, Elton Rossini

Abstract:

This work aims to validate and produce projections of solar radiation for Brazil for the period from 2025 to 2100. The projections are based on the HadGEM2-AO (Hadley Global Environment Model 2 - Atmosphere-Ocean) general circulation model of the UK Met Office Hadley Centre, which belongs to Phase 5 of the Coupled Model Intercomparison Project (CMIP5). The simulation results of the model are compared with monthly data from 2006 to 2013 measured by a network of meteorological stations of the National Institute of Meteorology (INMET). The performance of HadGEM2-AO is evaluated by the efficiency coefficient (CEF) and bias. The results are presented in tables and maps. In the most pessimistic scenario, RCP 8.5, HadGEM2-AO showed very good accuracy, presenting efficiency coefficients between 0.94 and 0.98, close to the perfect fit. The projected solar radiation, which indicates a horizontal trend, remains a viable climatic alternative for some regions of Brazil, especially in spring.
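
The efficiency coefficient (CEF) and bias used to validate HadGEM2-AO against the INMET observations are standard skill scores; a minimal sketch of how they can be computed is shown below, assuming the CEF takes the Nash-Sutcliffe form (an assumption, since the abstract does not spell out the formula). The monthly radiation values are placeholders, not the study's data.

```python
# Sketch of the validation metrics, assuming a Nash-Sutcliffe efficiency coefficient (CEF) and mean bias.
# Monthly solar radiation values (MJ/m^2/day) are placeholders, not INMET/HadGEM2-AO data.
import numpy as np

observed  = np.array([22.1, 20.4, 18.9, 15.7, 12.8, 11.5, 12.2, 14.6, 16.9, 19.3, 21.0, 22.5])
simulated = np.array([21.7, 20.9, 18.4, 15.9, 13.1, 11.2, 12.5, 14.2, 17.3, 19.0, 21.6, 22.1])

cef = 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)
bias = np.mean(simulated - observed)

print(f"CEF (Nash-Sutcliffe) = {cef:.3f}, bias = {bias:+.3f} MJ/m^2/day")
```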

Keywords: climate change, projections, solar radiation, climate change scenarios

Procedia PDF Downloads 138
21555 Mastering Digitization: A Quality-Adapted Digital Transformation Model

Authors: Franziska Schaefer, Marlene Kuhn, Heiner Otten

Abstract:

In the very near future, digitization will be the main challenge a company has to master to survive in a highly competitive market. Developing the right transformation strategy by considering all relevant aspects determines the success or failure of a company. The digital focus on the customer in particular plays a key role in creating sustainable competitive advantages, and it also leads to new tasks within quality management. Therefore, quality management needs to be particularly addressed to support the upcoming digital change. In this paper, we present an analysis of existing digital transformation approaches and derive a transformation strategy from a quality management perspective. We identify and classify different transformation dimensions and assess their relevance to quality management tasks, resulting in a quality-adapted digital transformation model. Furthermore, we introduce applicable and customized quality management methods to support the presented digital transformation tasks. With our developed model, we provide a digital transformation guideline from a quality perspective to master future disruptive changes.

Keywords: digital transformation, digitization, quality management, strategy

Procedia PDF Downloads 461
21554 Magnetohemodynamic of Blood Flow Having Impact of Radiative Flux Due to Infrared Magnetic Hyperthermia: Spectral Relaxation Approach

Authors: Ebenezer O. Ige, Funmilayo H. Oyelami, Joshua Olutayo-Irheren, Joseph T. Okunlola

Abstract:

Hyperthermia therapy is an adjuvant procedure during which perfused body tissue is subjected to an elevated range of temperatures in a bid to achieve improved drug potency and efficacy in cancer treatment. While one class of hyperthermia techniques rests on thermal radiation derived from a single-source electro-radiation measure, there are deliberations on conjugating dual radiation field sources in an attempt to improve the delivery of the therapy procedure. This paper numerically explores the thermal effectiveness of combined infrared hyperthermia with nanoparticle recirculation in the vicinity of an imposed magnetic field on the subcutaneous strata of a model lesion as an ablation scheme. An elaborate spectral relaxation method (SRM) was formulated to handle the coupled momentum and thermal equilibrium equations in the blood-perfused domain of a spongy fibrous tissue. Thermal diffusion regimes in the presence of an imposed external magnetic field were described by leveraging the well-known Rosseland diffusion approximation to delineate the impact of radiative flux within the computational domain. The contribution of tissue sponginess was examined using the mechanics of pore-scale porosity over a selection of clinically informed scenarios. Our observations showed that, for a substantial depth of the spongy lesion, the magnetic field architecture constitutes the controlling regime of hemodynamics at the blood-tissue interface while facilitating thermal transport across the depth of the model lesion. This parameter-indicator could be utilized to control the dispensing of hyperthermia treatment in intravenously perfused tissue.
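
For readers unfamiliar with the radiative term, the Rosseland diffusion approximation referred to above closes the energy equation by writing the radiative heat flux as a diffusion term in T⁴; a standard statement of it, together with the usual linearisation about an ambient temperature (an assumption about the exact form used here), is:

```latex
% Rosseland diffusion approximation for the radiative flux q_r,
% with the common linearisation T^4 \approx 4 T_\infty^3 T - 3 T_\infty^4.
q_r = -\frac{4\sigma^{*}}{3k^{*}}\,\frac{\partial T^{4}}{\partial y},
\qquad
\frac{\partial q_r}{\partial y} \approx
-\frac{16\,\sigma^{*} T_{\infty}^{3}}{3k^{*}}\,\frac{\partial^{2} T}{\partial y^{2}},
```

where σ* is the Stefan-Boltzmann constant and k* is the mean absorption coefficient, so the radiative contribution enters the thermal equilibrium equation as an additional conduction-like term.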

Keywords: spectral relaxation scheme, thermal equilibrium, Rosseland diffusion approximation, hyperthermia therapy

Procedia PDF Downloads 97
21553 Health Risk Assessment of Exposing to Benzene in Office Building around a Chemical Industry Based on Numerical Simulation

Authors: Majid Bayatian, Mohammadreza Ashouri

Abstract:

The release of hazardous chemicals is one of the major problems for office buildings in the chemical industry, and environmental risks are therefore inherent to these environments. The adverse health effects of airborne concentrations of benzene have been a matter of significant concern, especially in oil refineries. The chronic and acute adverse health effects caused by benzene exposure have attracted wide attention. Acute exposure to benzene through inhalation can cause headaches, dizziness, drowsiness, and irritation of the skin. Chronic exposure has been reported to cause aplastic anemia and leukemia in occupational settings. The association between chronic occupational exposure to benzene and the development of aplastic anemia and leukemia has been documented by several epidemiological studies. Numerous research works have investigated benzene emissions, determined benzene concentrations at different locations of refinery plants, and reported considerable health risks. The high cost of industrial control measures requires justification through lifetime health risk assessment of exposed workers and the public. In the present study, a Computational Fluid Dynamics (CFD) model is proposed to assess the exposure risk of an office building near a refinery due to the refinery's release of benzene. For the simulation, GAMBIT, FLUENT, and CFD Post software were used as the pre-processor, processor, and post-processor, and the model was validated by comparison with experimental results for benzene concentration and wind speed. The validation results showed good agreement, so the model can be used for health risk assessment. The simulation and risk assessment results showed that benzene could disperse to a nearby office building and that the exposure risk is unacceptable. According to the results of this study, a validated CFD model could be very useful for decision-makers when designing control measures and could support emergency planning for probable accidents. The model can also be used to assess exposure in various types of accidents, as well as to other pollutants such as toluene, xylene, and ethylbenzene, under different atmospheric conditions.
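
Once the CFD model supplies a time-averaged benzene concentration at the office building, the health-risk step typically follows the standard inhalation exposure equations (chronic daily intake and excess lifetime cancer risk). The sketch below illustrates that calculation with generic EPA-style exposure factors and a placeholder concentration; it is not the risk model or the values used in this study, and the slope factor is only an indicative figure.

```python
# Hedged sketch of an inhalation cancer-risk estimate from a modelled benzene concentration.
# Exposure factors are generic defaults; the concentration is a placeholder, not a CFD result.
benzene_conc = 0.05        # mg/m^3, time-averaged concentration at the office building (placeholder)
inhalation_rate = 20.0     # m^3/day
exposure_freq = 250        # working days/year
exposure_duration = 25     # years of occupational exposure
body_weight = 70.0         # kg
averaging_time = 70 * 365  # days (lifetime averaging for carcinogens)
slope_factor = 0.029       # (mg/kg-day)^-1, indicative inhalation slope factor for benzene

cdi = (benzene_conc * inhalation_rate * exposure_freq * exposure_duration) / (
    body_weight * averaging_time
)  # chronic daily intake, mg/kg-day
risk = cdi * slope_factor  # excess lifetime cancer risk

print(f"CDI = {cdi:.2e} mg/kg-day, lifetime cancer risk = {risk:.2e}")
print("above the common 1e-6 to 1e-4 acceptability band" if risk > 1e-4
      else "within or below the common acceptability band")
```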

Keywords: health risk assessment, office building, benzene, numerical simulation, CFD

Procedia PDF Downloads 114
21552 Transforming Data Science Curriculum Through Design Thinking

Authors: Samar Swaid

Abstract:

Today, corporations are moving toward the adoption of Design-Thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in Design-Thinking, IDEO (Innovation, Design, Engineering Organization), defines Design-Thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has proven to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect for consumers. The Association for Computing Machinery task force on Data Science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection and model interpretation. Thus, the Data Science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of Design-Thinking and teaching modules for data science programs.

Keywords: data science, design thinking, AI, curriculum, transformation

Procedia PDF Downloads 63
21551 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma

Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu

Abstract:

The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy to decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyzing visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations in the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. Our most successful model developed during this study showed that selecting a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of ~83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter vs. darker skin tones. Training set image transformations did not result in superior model performance in our study.
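
The momentum and batch-size choices reported above map directly onto standard training-loop hyperparameters. The sketch below shows where those two values plug in for a small Keras CNN; the tiny architecture and the random stand-in data are placeholders and are not the model or dataset trained in this study.

```python
# Hedged sketch: where momentum = 0.25 and batch_size = 2 enter a CNN training run (Keras).
# The architecture and random stand-in data are placeholders, not the study's model or dataset.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # melanoma vs. benign
])

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.25)  # momentum hyperparameter
model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])

x = np.random.rand(16, 64, 64, 3).astype("float32")   # stand-in dermoscopy images
y = np.random.randint(0, 2, size=(16, 1))

model.fit(x, y, batch_size=2, epochs=3, verbose=0)     # batch-size hyperparameter
print(model.evaluate(x, y, verbose=0))
```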

Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter

Procedia PDF Downloads 91
21550 Educational Leadership for Social Justice: Meeting UK Muslim Expectation

Authors: Mochammad Thalut

Abstract:

This essay discusses how educational leadership responds to Muslim pupils' problems and their expectations about education in the UK. The Muslim community in the country is growing. However, the debate about educational leadership is still limited to the separation between religion and academia in Western approaches. It is found that there are four major problems of Muslim pupils that educational leaders need to solve in order to provide social justice in education. The leader-teacher, an Islamic concept of the educational leader, is an alternative approach that educational leaders can use to overcome these problems. Finally, it is strongly recommended that this issue be brought into leadership development programmes in the UK to give all aspiring heads an understanding of Muslim expectations about education.

Keywords: Muslim, education, leadership, identity

Procedia PDF Downloads 239
21549 Nonlinear Homogenized Continuum Approach for Determining Peak Horizontal Floor Acceleration of Old Masonry Buildings

Authors: Andreas Rudisch, Ralf Lampert, Andreas Kolbitsch

Abstract:

It is a well-known fact among the engineering community that earthquakes with comparatively low magnitudes can cause serious damage to the nonstructural components (NSCs) of buildings, even when the supporting structure performs relatively well. Past research focused mainly on the NSCs of nuclear power plants and industrial plants. Particular attention should also be given to the architectural façade elements of old masonry buildings (e.g. ornamental figures, balustrades, vases), which are very vulnerable under seismic excitation. Large numbers of these historical nonstructural components (HiNSCs) can be found in highly frequented historical city centers, and in the event of failure, they pose a significant danger to persons. In order to estimate the vulnerability of acceleration-sensitive HiNSCs, the peak horizontal floor acceleration (PHFA) is used. The PHFA depends on the dynamic characteristics of the building, the ground excitation, and induced nonlinearities. Consequently, the PHFA cannot be generalized as a simple function of height. In the present research work, an extensive case study was conducted to investigate the influence of induced nonlinearity on the PHFA for old masonry buildings. Probabilistic nonlinear FE time-history analyses considering three different hazard levels were performed. A set of eighteen synthetically generated ground motions was used as input to the structure models. An elastoplastic macro-model (multiPlas) for nonlinear homogenized continuum FE calculation was calibrated at multiple scales and applied, taking specific failure mechanisms of masonry into account. The macro-model was calibrated according to the results of specific laboratory tests and cyclic in situ shear tests. The nonlinear macro-model is based on the concept of multi-surface rate-independent plasticity. Material damage or crack formation is detected by reducing the initial strength after failure due to shear or tensile stress. As a result, shear forces can only be transmitted to a limited extent by friction once cracking begins. The tensile strength is reduced to zero. The first goal of the calibration was consistency of the load-displacement curves between experiment and simulation. The calibrated macro-model matches well with regard to the initial stiffness and the maximum horizontal load. Another goal was the correct reproduction of the observed crack pattern and the plastic strain activities. Again, the macro-model proved to work well in this case and shows very good correlation. The results of the case study show that there is significant scatter in the absolute distribution of the PHFA between the applied ground excitations. An absolute distribution along the normalized building height was determined in the framework of probability theory. It can be observed that the extent of nonlinear behavior varies for the three hazard levels. Due to the detailed scope of the present research work, a robust comparison with code recommendations and simplified PHFA distributions is possible. The chosen methodology offers a chance to determine the distribution of the PHFA along the building height of old masonry structures. This permits a proper hazard assessment of HiNSCs under seismic loads.
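
Once the nonlinear time-history runs are available, the PHFA itself is a simple post-processing quantity: the peak absolute horizontal acceleration recorded at each floor, often normalised by the peak ground acceleration and reported against normalised building height. The sketch below shows only that post-processing step for placeholder floor acceleration histories; it is not the FE model or the probabilistic framework used in the study.

```python
# Sketch of PHFA post-processing from floor acceleration time histories (placeholder data).
import numpy as np

rng = np.random.default_rng(0)
n_floors, n_steps, dt = 5, 4000, 0.005
t = np.arange(n_steps) * dt

# Stand-in horizontal acceleration histories (m/s^2): ground record plus one row per floor.
ground = 1.5 * np.sin(2 * np.pi * 1.2 * t) * np.exp(-0.05 * t) + 0.1 * rng.standard_normal(n_steps)
floors = np.array([(1.0 + 0.25 * k) * ground + 0.05 * k * rng.standard_normal(n_steps)
                   for k in range(1, n_floors + 1)])

pga = np.max(np.abs(ground))
phfa = np.max(np.abs(floors), axis=1)          # peak horizontal floor acceleration per floor
height_ratio = np.arange(1, n_floors + 1) / n_floors

for z, a in zip(height_ratio, phfa):
    print(f"z/H = {z:.2f}: PHFA = {a:.2f} m/s^2, PHFA/PGA = {a / pga:.2f}")
```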

Keywords: nonlinear macro-model, nonstructural components, time-history analysis, unreinforced masonry

Procedia PDF Downloads 152
21548 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression

Authors: Issam Aouari, Abdelmalek Abdelhamid

Abstract:

For seismologists, the characterization of seismic demand should include the amplitude and duration of strong shaking in the system. The duration of ground shaking is one of the key parameters in the earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature have been applied. Through a comparative study, we select the most significant definition to use for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance and site conditions. The data set is taken from the PEER strong motion databank and contains shallow earthquakes from different regions of the world: America, Turkey, London, China, Italy, Chile, Mexico, etc. The main emphasis is placed on soft site conditions. The predictive relationship has been developed based on 600 records and three input indicators. Results have been compared with other published models. It has been found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions.
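
The nonlinear regression step described above can be sketched with a generic functional form in which duration grows with moment magnitude and source-to-site distance. The form, coefficients, and synthetic "records" below are placeholders fitted with SciPy, not the relationship actually derived from the 600 PEER records.

```python
# Hedged sketch of fitting a nonlinear ground-motion-duration model with scipy.optimize.curve_fit.
# Functional form and synthetic records are illustrative; they are not the paper's model or data.
import numpy as np
from scipy.optimize import curve_fit

def duration_model(X, a, b, c):
    """Duration (s) as an exponential function of magnitude Mw plus a linear distance term (km)."""
    mw, r = X
    return a * np.exp(b * mw) + c * r

rng = np.random.default_rng(1)
mw = rng.uniform(5.0, 7.5, 600)                 # moment magnitudes of soft-soil records
r = rng.uniform(5.0, 150.0, 600)                # source-to-site distances (km)
duration = 0.02 * np.exp(1.0 * mw) + 0.08 * r + rng.normal(0.0, 2.0, 600)  # synthetic target

params, _ = curve_fit(duration_model, (mw, r), duration, p0=[0.01, 1.0, 0.1])
a, b, c = params
print(f"D = {a:.3f} * exp({b:.3f} * Mw) + {c:.3f} * R")
print(f"predicted duration for Mw 6.5 at 30 km: {duration_model((6.5, 30.0), *params):.1f} s")
```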

Keywords: duration, earthquake, prediction, regression, soft soil

Procedia PDF Downloads 140
21547 Cultural Statistics in Governance: A Comparative Analysis between the UK and Finland

Authors: Sandra Toledo

Abstract:

There is an increasing tendency in governments toward more evidence-based policy-making and stricter auditing of the public sphere. Especially when budgets are tight and taxpayers demand greater scrutiny over the use of available resources, statistics and numbers appear as an effective tool to produce data that supports the investments made, as well as to evaluate public policy performance. This pressure has not exempted the cultural and arts fields. Finland, like the rest of the Nordic countries, has kept its welfare-state principles, whilst the UK seems to be going in the opposite direction, relying more and more on the private sector and foundations as the state pulls back. The boom of the creative industries, along with a managerial trend introduced by Thatcher in the UK, brought about a commodification of the arts within a market logic, where sponsorship and commercial viability were the keynotes. Finland, for its part, in spite of following a more protectionist approach to the arts, seems to be heading in a similar direction. Additionally, there is a growing international interest in the application of cultural participation studies and in the comparability of their results between countries. Nonetheless, the standardization of the application of cultural surveys has not happened yet. Not only are there differences in the application of these types of surveys in terms of timing and frequency, but also regarding who conducts them. Therefore, one hypothesis considered in this research is that the cultural policy model adopted by the government lies behind the differences between countries in the application of cultural surveys and the production and utilization of cultural statistics. In other words, the main goal of this research is to answer the following: What are the differences and similarities between Finland and the UK regarding the role cultural surveys have in cultural policy making? This is accompanied by secondary questions such as: How does the cultural policy model followed by each country influence the role of cultural surveys in cultural policy making? And what are the differences at the local level? In order to answer these questions, strategic cultural policy documents and interviews with key informants will be used and analyzed as source data, using content analysis methods. Cultural statistics per se will not be compared, but rather their use as instruments of governing and its relation to the cultural policy model. Aspects such as the execution of cultural surveys, funding, periodicity, and the use of statistics in formal reports and publications will be studied in the written documents, while the interviews will target other elements such as the perceptions of those involved in collecting cultural statistics or policy making, the distribution of tasks and hierarchies among cultural and statistical institutions, and a general overview. A limitation identified beforehand, and one expected to be encountered throughout the process, is the language barrier in the case of Finland when it comes to official documents; this will be tackled by interviewing the authors of such documents and choosing key extracts of them for translation.

Keywords: Finland, cultural statistics, cultural surveys, United Kingdom

Procedia PDF Downloads 218
21546 Current Design Approach for Seismic Resistant Automated Rack Supported Warehouses: Strong Points and Critical Aspects

Authors: Agnese Natali, Francesco Morelli, Walter Salvatore

Abstract:

Automated Rack Supported Warehouses (ARSWs) are structures currently designed as steel racks. Even if there are common characteristics, there are differences that do not allow the same design approach to be adopted. Aiming to highlight the factors influencing the design and the behavior of ARSWs, a set of 5 structures designed by 5 European companies specialized in this field is used to perform both a critical analysis of the design approaches and an assessment of the seismic performance, which is used to point out the criticalities and the necessity of a new design philosophy.

Keywords: steel racks, automated rack supported warehouse, thin walled cold-formed elements, seismic assessment

Procedia PDF Downloads 149
21545 Exploring the Dualistic Nature of Design: Integrative Perspectives and Methodological Approaches in Design Research

Authors: Joni Agung Sudarmanto

Abstract:

The concept of design has historically been elusive and characterized by its fluidity, leading to divergent viewpoints on its fundamental nature. Guy Julier views design as inherent in material culture, while Sanders sees it as a collective endeavor focusing on the outcome. Design's dualistic nature, procedural and outcome-oriented, spans various domains, including objects, individuals, and the environment. This comprehensive view of design challenges the notion that design practice is distinct from research, highlighting their shared exploratory nature. The article explores methodological techniques in design research and the three prevalent approaches: "into design," "through design," and "for design." The contradictory meanings of design arise from its etymology and its duality as both process and result, leading to its integrative nature across objects, humans, and the environment. The parallels between design and research activities, underscoring their exploratory and knowledge-generating nature, are situated within creative research, challenging the perception of design practice as separate from research endeavors. The "into design" approach encourages interdisciplinary collaboration, enriching design research with diverse perspectives. The "through design" approach bridges theory and practice, producing more practical outcomes. The "for design" approach supports specific design solutions, providing designers with valuable guidance.

Keywords: dualistic nature of design, integrative perspectives, methodological approaches, design research

Procedia PDF Downloads 53
21544 Bio-Hub Ecosystems: Expansion of Traditional Life Cycle Analysis Metrics to Include Zero-Waste Circularity Measures

Authors: Kimberly Samaha

Abstract:

In order to attract new types of investors into the emerging bio-economy, a new set of metrics and a new measurement system are needed to better quantify the environmental, social and economic impacts of circular zero-waste design. The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding the use of biomass as a feedstock for power plants. The lack of an economically viable business model for bioenergy facilities has resulted in the continued idling and decommissioning of plants, in particular the forestry-based plants, which have been an invaluable outlet for surplus woody biomass, forest health improvement, timber production enhancement, and especially reduction of wildfire risk. This study looked at repurposing existing biomass-energy plants into circular zero-waste Bio-Hub Ecosystems. The Bio-Hub model first targets a 'whole-tree' approach and then looks at the circular economics of co-hosting diverse industries (wood processing, aquaculture, agriculture) in the vicinity of biomass power plant facilities. It proposes not only models for the integration of forestry, aquaculture, and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allows for the early measurement of circularity, the impact of resource use, and investment risk mitigation for these systems. Typically, life cycle analyses measure the environmental impacts of different industrial production stages and are not integrated with indicators of material use circularity. This concept paper proposes the further development of a new set of metrics that would illustrate not only the typical life cycle analysis (LCA), which shows the reduction in greenhouse gas (GHG) emissions, but also the zero-waste circularity measures of mass balance over the full value chain of the raw material and its energy content/caloric value. These new measures quantify key impacts in making hyper-efficient use of natural resources and eliminating waste to landfills. The project utilized traditional LCA using the GREET model, where the standalone biomass energy plant case was contrasted with the integration of a jet-fuel biorefinery. The methodology was then expanded to include combinations of co-hosts that optimize the life cycle of woody biomass from tree to energy, CO₂, heat and wood ash, both in terms of energy/caloric value and in terms of mass balance, so as to include the reuse of waste streams which are typically landfilled. The major findings of the formal LCA study resulted in the masterplan for the first Bio-Hub to be built in West Enfield, Maine. Bioenergy facilities are currently at a critical juncture where they have an opportunity to be repurposed into efficient, profitable and socially responsible investments, or be idled and scrapped. If proven as a model, the expedited roll-out of these innovative scenarios can set a new standard for circular zero-waste projects that advance the critical transition from the current 'take-make-dispose' paradigm inherent in the energy, forestry and food industries to a more sustainable bio-economy paradigm where waste streams become valuable inputs, supporting local and rural communities in simple, sustainable ways.

Keywords: bio-economy, biomass energy, financing, metrics

Procedia PDF Downloads 143
21543 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing

Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä

Abstract:

Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human error. In this paper, we present a system which automatically recognizes features from the CAD model of the sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product. This model is then used as an input for the sheet metal processing machine. Currently, the system is implemented and capable of recognizing more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. This paper presents the developed system architecture, the applied algorithms, and the system software implementation and testing.

Keywords: feature recognition, automation, sheet metal manufacturing, CAD, CAM

Procedia PDF Downloads 338
21542 The Impact of Task-Based Language Teaching on Iranian Female Intermediate EFL Learners’ Writing Performance

Authors: Gholam Reza Parvizi, Hossein Azad, Ali Reza Kargar

Abstract:

This article investigated the impact of task-based language teaching (TBLT) on the writing performance of Iranian intermediate EFL learners. There were two groups of forty students, intermediate female learners studying English at the Jahad-e-Daneshgahi language institute and ranging in age from thirteen to nineteen. They participated in their regular classes in the institute and were assigned to two groups: an experimental group taught through task-based language teaching and a control group. To ensure homogeneity, all students in the two groups took an achievement test before the treatment. As a pre-test, students were assigned to write a task at the beginning of the course. One of the classes was conducted by taking a TBLT approach to writing, while the other class followed regular patterns of teaching, namely a traditional approach, in contrast to the TBLT group. The tasks were chosen from the learners' textbook, and the task selection was in accordance with the learning standards for the ESL and TOEFL writing sections. At the end of the treatment, a post-test was administered to both the experimental group and the control group. Scoring was done on the basis of the "expository writing quality scale". The researcher used a paired-samples t-test to analyze the effect of the TBLT teaching approach on the writing performance of the learners. The data analysis revealed that the subjects in the TBLT group performed better on the writing performance post-test than the subjects in the control group. The findings of the study also demonstrated that TBLT would enhance writing performance in this group of learners. Moreover, it was indicated that TBLT has been effective in teaching writing performance to Iranian EFL learners.
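
The paired-samples t-test mentioned above compares each learner's pre-test and post-test writing scores within the same group. A minimal sketch of that comparison is shown below with invented scores; it simply illustrates the test, not the study's data.

```python
# Minimal sketch of the paired-samples t-test on pre/post writing scores (invented scores).
import numpy as np
from scipy.stats import ttest_rel

pre_tblt  = np.array([61, 58, 65, 70, 55, 63, 67, 60, 59, 66], dtype=float)
post_tblt = np.array([68, 66, 72, 78, 60, 70, 75, 66, 64, 74], dtype=float)

t_stat, p_value = ttest_rel(post_tblt, pre_tblt)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean gain = {np.mean(post_tblt - pre_tblt):.1f} points")
```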

Keywords: task-based language teaching, task, language teaching approach, writing proficiency, EFL learners

Procedia PDF Downloads 406
21541 Experimental and Computational Investigations of Baffle Position Effects on ‎the Performance of Oil and Water Separator Tanks

Authors: Haitham A. Hussein, Rozi Abdullah‏‎, Md Azlin Md Said ‎

Abstract:

Gravity separator tanks are used to separate oil from water in treatment units. Achieving the best flow uniformity in a separator tank will improve the maximum removal efficiency of oil globules from water. In this study, the effect of different baffle structure positions inside a tank on hydraulic performance was investigated. Experimental data and 2D computational fluid dynamics were used for the analysis. In the numerical model, two-phase flow (drift flux model) was used to validate the one-phase flow. For the laboratory measurements, the velocity fields were measured using an acoustic Doppler velocimeter (ADV). The measurements were compared with the results of the computational model. The results of the experimental and computational simulations indicate that the best location of a baffle structure is achieved when the standard deviation of the velocity profile and the volume of the circulation zone inside the tank are minimized.
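
The "standard deviation of the velocity profile" criterion mentioned above is straightforward to evaluate from either the ADV measurements or the CFD output: for each candidate baffle position, take the velocity samples across a cross-section and compare their spread. The sketch below uses invented velocity profiles for three hypothetical baffle positions; it is not the study's data.

```python
# Sketch of the flow-uniformity criterion: standard deviation of the cross-sectional velocity
# profile for several candidate baffle positions (invented sample values, m/s).
import numpy as np

profiles = {
    "baffle at 0.25 L": np.array([0.021, 0.034, 0.045, 0.050, 0.047, 0.036, 0.024]),
    "baffle at 0.50 L": np.array([0.033, 0.036, 0.038, 0.039, 0.038, 0.037, 0.034]),
    "baffle at 0.75 L": np.array([0.018, 0.029, 0.048, 0.055, 0.049, 0.031, 0.020]),
}

best = min(profiles, key=lambda k: np.std(profiles[k]))
for name, u in profiles.items():
    print(f"{name}: mean = {u.mean():.3f} m/s, std = {u.std():.4f} m/s")
print(f"most uniform profile (lowest std): {best}")
```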

Keywords: gravity separator tanks, CFD, baffle position, two phase flow, ADV, oil droplet

Procedia PDF Downloads 308
21540 Numerical Simulation of Three-Dimensional Cavitating Turbulent Flow in Francis Turbines with ANSYS

Authors: Raza Abdulla Saeed

Abstract:

In this study, the three-dimensional cavitating turbulent flow in a complete Francis turbine is simulated using a mixture model for cavity/liquid two-phase flows. Numerical analysis is carried out using ANSYS CFX software release 12, and the standard k-ε turbulence model is adopted for this analysis. The computational fluid domain consists of the spiral casing, stay vanes, guide vanes, runner and draft tube. The computational domain is discretized with a three-dimensional unstructured tetrahedral mesh system. The finite volume method (FVM) is used to solve the governing equations of the mixture model. Results for cavitation on the runner's blades under three different boundary conditions are presented and discussed. From the numerical results, it has been found that the numerical method was successfully applied to simulate the cavitating two-phase turbulent flow through a Francis turbine, and cavitation is clearly predicted in the form of water vapor formation inside the turbine. Comparing the numerical prediction results with a real runner shows that the region of higher volume fraction obtained by the simulation is consistent with the region of runner cavitation damage.

Keywords: computational fluid dynamics, hydraulic francis turbine, numerical simulation, two-phase mixture cavitation model

Procedia PDF Downloads 541
21539 The Prevalence of Coronary Artery Disease and Its Risk Factors in Rural and Urban Areas of Pakistan

Authors: Muhammad Kamran Hanif Khan, Fahad Mushtaq

Abstract:

Background: In both developed and underdeveloped countries, coronary artery disease (CAD) is a serious cause of death and disability. Cardiovascular disease (CVD) is becoming more prevalent in emerging countries like Pakistan due to the spread and acceptance of Western lifestyles. Material and Methods: An observational cross-sectional investigation was conducted, and data collection relied on a random cluster sampling method. The sample size for this cross-sectional study was calculated using the following factors: an estimated true proportion of 17.5%, a desired precision of 2%, and a confidence interval of 95%. The data for this study were collected from a sample of 1,387 adults. Results: The average age of those living in rural areas is 55.24 years, compared to 52.60 years for those living in urban areas. The mean fasting blood glucose of the urban participants is 105.28 mg/dL, which is higher than that of the rural participants (102.06 mg/dL). The mean total cholesterol of the urban participants is 192.20 mg/dL, slightly higher than that of the rural participants (191.97 mg/dL). CAD prevalence is greater in urban areas than in rural areas. The prevalence of ECG abnormalities is 16.1% in females compared to 12.5% in males. Conclusion: CAD is more common in urban areas than in rural ones for all of the measures of CAD used in the study.
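
The sample size quoted above follows from the usual single-proportion formula n = Z²·p·(1−p)/d², with p = 0.175, d = 0.02 and Z = 1.96 for a 95% confidence interval, which reproduces the 1,387 participants; a one-line check is shown below.

```python
# Check of the single-proportion sample-size formula used for this cross-sectional survey.
import math

z, p, d = 1.96, 0.175, 0.02          # 95% CI, expected prevalence 17.5%, precision ±2%
n = (z ** 2) * p * (1 - p) / (d ** 2)
print(math.ceil(n))                   # -> 1387
```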

Keywords: CVD prevalence, CVD risk factors, rural area, urban area

Procedia PDF Downloads 60
21538 Moroccan Human Ecological Behavior: Grounded Theory Approach

Authors: Dalal Tarfaoui, Salah Zkim

Abstract:

Today, environmental sustainability is everyone's concern, as it contributes in many aspects to a country's development. Morocco is also aware of the increasing threats to its natural resources. Accordingly, many projects and studies have been discussed, pointing mainly to water security, pollution, desertification, and land degradation, but few studies have bothered to dig into human conduct to disclose its ecological behavior. Human behavior is accountable for environmental deterioration in the first place, yet we keep fighting the symptoms instead of limiting the root causes. In the conceptual framework highlighted in the present article, semi-structured interviews were conducted using a grounded theory approach. Initially, this study will serve as a pilot study and a cornerstone for a bigger project now in progress. Beyond the existing general ecological measures (GEM), this study has chosen the grounded theory approach to bring out firsthand insights and to probe to what extent an ecological dimension exists in Morocco as a developing country. The discourse on ecological behavior within the Moroccan context is seen through a more realist, social, and community-oriented philosophy. The study has revealed an appreciative ecological behavior that is unfortunately repressed by variables beyond people's control, which prevent people's good environmental intentions from being translated into real ecological actions.

Keywords: ecological behavior, ecological dimension, variables beyond people’s control, Morocco

Procedia PDF Downloads 472
21537 Factors Influencing Consumer Adoption of Digital Banking Apps in the UK

Authors: Sevelina Ndlovu

Abstract:

Financial technology (fintech) advancement is recognised as one of the most transformational innovations in the financial industry. Fintech has given rise to internet-only digital banking, a novel financial technology advancement and innovation that allows banking services through internet applications with no need for physical branches. This technology is becoming a new banking normal among consumers because of its ubiquitous and real-time access advantages. There is evident switching and migration from traditional banking towards these fintech facilities, which could pose a systemic risk if not properly understood and monitored. Fintech advancement has also brought about the emergence and escalation of financial technology consumption themes such as trust, security, perceived risk, and sustainability within the banking industry, themes scarcely covered in the existing theoretical literature. To that end, the objective of this research is to investigate the factors that determine fintech adoption and to propose an integrated adoption model. This study aims to establish what the significant drivers of adoption are and to develop a conceptual model that integrates technological, behavioral, and environmental constructs by extending the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2). It proposes integrating constructs that influence financial consumption themes such as trust, perceived risk, security, financial incentives, micro-investing opportunities, and environmental consciousness to determine the impact of these factors on the adoption of, and intention to use, digital banking apps. The main advantage of this conceptual model is the consolidation of a greater number of predictor variables, which can provide a fuller explanation of consumers' adoption of digital banking apps. Moderating variables of age, gender, and income are incorporated. To the best of the author's knowledge, this study is the first to extend the UTAUT2 model with this combination of constructs to investigate users' intention to adopt internet-only digital banking apps in the UK context. By investigating factors that are not included in the existing theories but are highly pertinent to the adoption of internet-only banking services, this research adds to existing knowledge and extends the generalisability of UTAUT2 in a financial services adoption context. This fills a gap in knowledge, as the need for further research on UTAUT2 was highlighted when the theory was reviewed in 2016, following its original 2003 version. To achieve the objectives of this study, this research assumes a quantitative approach to empirically test the hypotheses derived from the existing literature and from pilot studies, giving statistical support to generalise the research findings for further possible applications in theory and practice. This research is explanatory or causal in nature and uses cross-sectional primary data collected through a survey method. Convenience and purposive sampling with structured, self-administered online questionnaires are used for data collection. The proposed model is tested using Structural Equation Modelling (SEM), and the analysis of the primary data collected through the online survey is processed using SmartPLS software with a sample size of 386 digital bank users. The results are expected to establish whether there are significant relationships between the dependent and independent variables and what the most influential factors are.

Keywords: banking applications, digital banking, financial technology, technology adoption, UTAUT2

Procedia PDF Downloads 50
21536 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data

Authors: Necati Içer

Abstract:

Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage on crop production. Predicting insurance premium rates from short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach for establishing equitable premium rates in crop-hail insurance for nations with short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high or zero loss costs and to enhance the credibility of those rates. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the final premium rates for insurance contracts.

Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters

Procedia PDF Downloads 30
21535 Using Confirmatory Factor Analysis to Test the Dimensional Structure of Tourism Service Quality

Authors: Ibrahim A. Elshaer, Alaa M. Shaker

Abstract:

Several previous empirical studies have operationalized service quality as either a multidimensional or a unidimensional construct. While a few earlier studies investigated some aspects of the assumed dimensional structure of service quality, no study has been found to have tested the construct's dimensionality using confirmatory factor analysis (CFA). To gain better insight into the dimensional structure of the service quality construct, this paper tests its dimensionality using three CFA models (a higher-order factor model, an oblique factor model, and a one-factor model) on a set of data collected from 390 British tourists who visited Egypt. The results of the three tested models indicate that the service quality construct is multidimensional. This result helps resolve the problems that might arise from the lack of clarity concerning the dimensional structure of service quality, as without testing the dimensional structure of a measure, researchers cannot assume that a significant correlation is a result of factors measuring the same construct.

Keywords: service quality, dimensionality, confirmatory factor analysis, Egypt

Procedia PDF Downloads 573
21534 Microseismicity of the Tehran Region Based on Three Seismic Networks

Authors: Jamileh Vasheghani Farahani

Abstract:

The main purpose of this research is to show the currently active faults and active tectonics of the area using three seismic networks in the Tehran region: 1) the Tehran Disaster Mitigation and Management Organization (TDMMO), 2) the Broadband Iranian National Seismic Network Center (BIN), and 3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings using the Tehran networks from 1996 to 2015. We found some active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the SE of Tehran. An empirical relationship is used to assess Mmax based on the rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region, based on the assessed capability of major faults such as the Parchin and Eyvanekey faults and on historical earthquakes.
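
The Mmax assessment from rupture length referred to above is usually done with an empirical magnitude-rupture-length regression. The sketch below uses the Wells and Coppersmith (1994) all-slip-type relation for surface rupture length as an assumed stand-in, since the abstract does not name the specific relationship applied; the rupture lengths are placeholders, not values taken from the study.

```python
# Hedged sketch of an Mmax estimate from fault rupture length.
# Assumes the Wells & Coppersmith (1994) all-slip-type relation M = 5.08 + 1.16 * log10(SRL);
# the abstract does not state which empirical relationship was actually used.
import math

def mmax_wells_coppersmith(surface_rupture_length_km):
    return 5.08 + 1.16 * math.log10(surface_rupture_length_km)

# Illustrative rupture lengths (km) for faults named in the abstract; values are placeholders.
for fault, length_km in [("Mosha (segment)", 50), ("North Tehran (segment)", 40), ("Eyvanakey", 70)]:
    print(f"{fault}: Mmax ~ {mmax_wells_coppersmith(length_km):.1f}")
```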

Keywords: Iran, major faults, microseismicity, Tehran

Procedia PDF Downloads 353