Search results for: ontology engineering methodology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8281

7411 Plural Perspectives in Conservation Conflicts: The Role of Iconic Species

Authors: Jean Hugé, Francisco Benitez-Capistros, Giorgia Camperio-Ciani

Abstract:

Addressing conservation conflicts requires the consideration of multiple stakeholders' perspectives and knowledge claims, in order to inform complex and possibly contentious decision-making dilemmas. Hence, a better understanding is needed of why people in particular contexts act in a particular way in a conservation conflict. First, this contribution aims at providing and applying an approach to map and interpret the diversity of subjective viewpoints with regard to iconic species in conservation conflicts. Secondly, this contribution aims to feed the reflection on the possible consequences of this diversity of perspectives for the future management of wildlife (in particular iconic species), based on case studies in Galapagos and Malaysia. The use of the semi-quantitative Q methodology allowed us to identify various perspectives on conservation in different social-ecological contexts. While the presence of iconic species may lead to a more passionate and emotional debate, it may also provide more opportunities for finding common ground and for jointly developing acceptable management solutions that will depolarize emergent, long-lasting or latent conservation conflicts. Based on the research team's experience in the field, and on the integration of ecological and social knowledge, methodological and management recommendations are made with regard to conservation conflicts involving iconic wildlife. The mere presence of iconic wildlife does not guarantee its centrality in conservation conflicts, and comparisons are drawn between the cases of the giant tortoises (Chelonoidis spp.) in Galapagos, Ecuador and the Milky Stork (Mycteria cinerea) in western peninsular Malaysia. Acknowledging the diversity of viewpoints, which reflects how different stakeholders see, act, and talk about wildlife management, highlights the need to develop pro-active and resilient strategies to deal with these issues.

Keywords: conservation conflicts, Q methodology, Galapagos, Malaysia, giant tortoise, milky stork

Procedia PDF Downloads 284
7410 Holistic Urban Development: Incorporating Both Global and Local Optimization

Authors: Christoph Opperer

Abstract:

The rapid urbanization of modern societies and the need for sustainable urban development demand innovative solutions that meet both individual and collective needs while addressing environmental concerns. To address these challenges, this paper presents a study that explores the potential of spatial and energetic/ecological optimization to enhance the performance of urban settlements, focusing on both architectural and urban scales. The study focuses on the application of biological principles and self-organization processes in urban planning and design, aiming to achieve a balance between ecological performance, architectural quality, and individual living conditions. The research adopts a case study approach, focusing on a 10-hectare brownfield site in the south of Vienna. The site is surrounded by a small-scale built environment as an appropriate starting point for the research and design process. However, the selected urban form is not a prerequisite for the proposed design methodology, as the findings can be applied to various urban forms and densities. The methodology used in this research involves dividing the overall building mass and program into individual small housing units. A computational model has been developed to optimize the distribution of these units, considering factors such as solar exposure/radiation, views, privacy, proximity to sources of disturbance (such as noise), and minimal internal circulation areas. The model also ensures that existing vegetation and buildings on the site are preserved and incorporated into the optimization and design process. The model allows for simultaneous optimization at two scales, architectural and urban design, which have traditionally been addressed sequentially. This holistic design approach leads to individual and collective benefits, resulting in urban environments that foster a balance between ecology and architectural quality. 
The results of the optimization process demonstrate a seemingly random distribution of housing units that, in fact, is a densified hybrid between traditional garden settlements and allotment settlements. This urban typology is selected due to its compatibility with the surrounding urban context, although the presented methodology can be extended to other forms of urban development and density levels. The benefits of this approach are threefold. First, it allows for the determination of an ideal housing distribution that optimizes solar radiation for each building density level, essentially extending the concept of sustainable building to the urban scale. Second, the method enhances living quality by considering the orientation and positioning of individual functions within each housing unit, achieving optimal views and privacy. Third, the algorithm's flexibility and robustness facilitate the efficient implementation of urban development with various stakeholders, architects, and construction companies without compromising its performance. The core of the research is the application of global and local optimization strategies to create efficient design solutions. By considering both the performance of individual units and the collective performance of the urban aggregation, we ensure an optimal balance between private and communal benefits. By promoting a holistic understanding of urban ecology and integrating advanced optimization strategies, our methodology offers a sustainable and efficient solution to the challenges of modern urbanization.
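A toy sketch can illustrate the kind of unit-distribution optimization described above. The grid, the scores, and the noise weight below are hypothetical, and a simple greedy ranking stands in for the paper's computational model:

```python
# Illustrative sketch only: greedy placement of housing units on a toy grid,
# scoring candidate cells by solar exposure minus a noise-proximity penalty.
# All values and the noise_weight parameter are invented for this example.

def place_units(solar, noise_dist, n_units, noise_weight=0.5):
    """Pick the n_units cells with the best combined score.

    solar      -- dict cell -> solar exposure score (higher is better)
    noise_dist -- dict cell -> distance to nearest noise source (higher is better)
    """
    score = {c: solar[c] + noise_weight * noise_dist[c] for c in solar}
    return sorted(score, key=score.get, reverse=True)[:n_units]

# Toy 2x2 grid: cell (0, 0) is sunny and far from noise, (1, 1) is shaded and noisy.
solar = {(0, 0): 10, (0, 1): 8, (1, 0): 6, (1, 1): 2}
noise_dist = {(0, 0): 4, (0, 1): 1, (1, 0): 3, (1, 1): 0}
print(place_units(solar, noise_dist, 2))  # the two best-scoring cells
```

A real model would of course evaluate many more criteria (views, privacy, circulation) and iterate globally rather than greedily.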

Keywords: sustainable development, self-organization, ecological performance, solar radiation and exposure, daylight, visibility, accessibility, spatial distribution, local and global optimization

Procedia PDF Downloads 66
7409 Designing and Enacting an Adjunct Faculty Self-Study of Teaching Community

Authors: Anastasia P. Samaras, Allison Ward-Parsons, Beth Dalbec, Paula Cristina Azevedo, Anya Evmenova, Arvinder Johri, Lynne Scott Constantine, Lesley Smith

Abstract:

Two cycles of qualitative data were collected. Cycle One sources included participant survey results, participant postings on Blackboard forums, facilitator memos, and meeting notes as well as reflections and notes from whole-group meetings.

Keywords: adjunct faculty, professional development, self-study methodology, teaching

Procedia PDF Downloads 162
7408 Site Investigations and Mitigation Measures of Landslides in Sainj and Tirthan Valley of Kullu District, Himachal Pradesh, India

Authors: Laxmi Versain, R. S. Banshtu

Abstract:

Landslides are among the most commonly occurring geological hazards in the mountainous regions of the Himalaya. This mountainous zone is exposed to frequent seismic disturbances, climatic changes, and topographic changes driven by increasing urbanization, which has led several researchers to search for the most suitable methodologies to infer reliable results. Landslide Hazard Zonation (LHZ) has become a widely used method for identifying the factors that trigger landslide phenomena on higher reaches. The most vulnerable zones, or zones of weakness, are identified, and safe mitigation measures are suggested for the affected area. The application of the LHZ methodology to relative zones of weakness depends upon the data available for the particular site. The causative factors are identified, and data are compiled to infer the results. Seismicity in the mountainous region is closely associated with making the zones along thrusts, faults, or lineaments more vulnerable. Data related to soil, terrain, rainfall, geology, slope, and the nature of the terrain vary across landforms and areas. Thus, the relative causes are identified and classified by assigning a specific weightage to each parameter. The factors that cause slope instability are numerous and can be grouped to infer the potential modes of failure. The triggering factors of landslides in the mountains are not uniform. Urbanization has advanced step by step, and concrete settlements are emerging at a very fast pace in the hilly regions of the Himalayas. The local terrain has been extensively modified, and hence instability is being triggered in several zones at a fast pace. More strategic and pronounced methods are required to reduce the effects of landslides.
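The weightage-based zonation described above is commonly computed as a weighted overlay of causative-factor ratings. A hedged sketch follows; the weights, class ratings, and zone thresholds are invented for illustration, whereas real studies derive them from field and map data:

```python
# Sketch of a weighted-overlay Landslide Hazard Zonation (LHZ) score.
# Factor weightages and the site's class ratings below are hypothetical.

def lhz_score(ratings, weights):
    """Total estimated hazard = sum of (factor rating x factor weightage)."""
    return sum(ratings[f] * weights[f] for f in weights)

weights = {"slope": 2.0, "lithology": 2.0, "rainfall": 1.5, "land_use": 1.0}
site = {"slope": 3, "lithology": 2, "rainfall": 3, "land_use": 1}  # ratings 0-3
score = lhz_score(site, weights)
zone = "high" if score >= 12 else "moderate" if score >= 8 else "low"
print(score, zone)
```

The zoned map is then produced by evaluating this score cell by cell over the study area.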

Keywords: zonation, LHZ, susceptible, weightages, methodology

Procedia PDF Downloads 196
7407 Validation of the Linear Trend Estimation Technique for Prediction of Average Water and Sewerage Charge Rate Prices in the Czech Republic

Authors: Aneta Oblouková, Eva Vítková

Abstract:

The article deals with the issue of water and sewerage charge rate prices in the Czech Republic. The research focuses specifically on the analysis of the development of average water and sewerage charge rate prices in the Czech Republic in the years 1994-2021 and on the validation of the methodology chosen for predicting their development. The research is based on data collection; the data were obtained from the Czech Statistical Office. The aim of the paper is to validate the relevance of the mathematical linear trend estimation technique for calculating predicted average water and sewerage charge rate prices. The real values of the average prices in the years 1994-2018 were obtained from the Czech Statistical Office and converted into a mathematical equation. The same type of real data was obtained for the years 2019-2021. Predictions of the average prices for the years 2019-2021 were then calculated using the chosen method, a linear trend estimation technique. The values obtained from the Czech Statistical Office and the values calculated using the chosen methodology were subsequently compared. The result of the research is a validation of the chosen mathematical technique as suitable for this purpose.
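The linear trend estimation technique can be sketched as an ordinary least-squares fit of price against year on the 1994-2018 window, followed by extrapolation to 2019-2021. The price series below is synthetic, standing in for the Czech Statistical Office data:

```python
# Sketch of linear trend estimation: fit price = a + b*year by ordinary
# least squares, then predict later years. Prices here are synthetic.

def linear_trend(xs, ys):
    """Return (intercept, slope) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

years = list(range(1994, 2019))                     # fitting window 1994-2018
prices = [10 + 2.5 * (y - 1994) for y in years]     # synthetic price series
a, b = linear_trend(years, prices)
predicted_2021 = a + b * 2021                       # extrapolate beyond the fit
print(round(predicted_2021, 2))
```

Validation then amounts to comparing such extrapolations for 2019-2021 against the observed values.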

Keywords: Czech Republic, linear trend estimation, price prediction, water and sewerage charge rate

Procedia PDF Downloads 120
7406 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution in the subnanometer scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing - Total Variation Minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not yet been properly addressed, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level, or the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 88
7405 Elements of Successful Commercial Streets: A Socio-Spatial Analysis of Commercial Streets in Cairo

Authors: Toka Aly

Abstract:

Historically, marketplaces were the most important nodes and focal points of cities, where different activities took place. Commercial streets offer more than just spaces for shopping; they also offer opportunities for social activities and cultural exchange, and they are considered the backbone of a city's vibrancy and vitality. Despite this, public life in Cairo's commercial streets has deteriorated, with shopping activities relying mainly on 'planned formal places', chiefly privatized or indoor spaces such as shopping malls. The main aim of this paper is to explore the key elements and tools for assessing the success of commercial streets in Cairo. The methodology followed in this paper is a case study methodology (multiple cases), based on assessing and analyzing the physical and social elements of a historical and a contemporary commercial street in Cairo: El Muiz Street and Baghdad Street. The data collection is based on personal observations, photographs, maps, and street sections. Findings indicate that the key factors for analyzing commercial streets are factors affecting the sensory experience, factors affecting social behavior, and general aspects that attract people. Findings also indicate that urban features have a clear influence on pedestrian shopping activities in both streets. Moreover, for a commercial street to be successful, shopping patterns must provide people with a quality public space that offers easy navigation and accessibility, good visual continuity, well-designed urban features, and social gathering. The outcomes of this study provide a useful background for urban designers on analyzing and assessing the success of commercial streets. The study will also help in understanding the different physical and social patterns of vending activities taking place in Cairo.

Keywords: activities, commercial street, marketplace, successful, vending

Procedia PDF Downloads 302
7404 Linguistic Politeness in Higher Education Teaching Chinese as an Additional Language

Authors: Leei Wong

Abstract:

Changes in globalized contexts precipitate changing perceptions of linguistic politeness practices. Within these changing contexts, the misunderstanding or stereotypification of politeness norms may lead to negative consequences such as hostility or even communication breakdown. With China's rising influence, the country offers a vast potential market for global economic development and diplomatic relations, as well as opportunities for intercultural interaction, and many people outside China are consequently learning Chinese. These trends bring both opportunities and pitfalls for intercultural communication, including within the important field of politeness awareness. One internationally recognized benchmark for the study and classification of languages, the updated 2018 CEFR (Common European Framework of Reference for Languages) Companion Volume New Descriptors (CEFR/CV), classifies politeness as a B1 (intermediate) level descriptor on the scale of Politeness Conventions. This gives some indication of the relevance of politeness awareness within new globalized contexts for fostering better intercultural communication. This study specifically examines the bald-on-record politeness strategies presented in current beginner textbooks for Teaching Chinese as an Additional Language (TCAL) used in Australian tertiary education, through content analysis. The investigation involves the purposive sampling of commercial textbooks published in America and China, followed by interpretive content analysis. The philosophical position of this study is therefore located within an interpretivist ontology, with a subjectivist epistemological perspective. It sets out to illuminate the characteristics of Chinese bald-on-record strategies that are deemed significant in the present-world context by Chinese textbook writers and curriculum designers.
The data reveal significant findings concerning politeness strategies in the beginner-stage curriculum, and also open the way for further research on politeness strategies in intermediate- and advanced-level textbooks for additional language learners. This study will be useful for language teachers, and language teachers in training, by generating awareness and providing insights into, and advice on, the teaching and learning of bald-on-record politeness strategies. Authors of textbooks may also benefit from the findings, as awareness is raised of the need to include reference to understanding politeness in language, and of how this might be approached.

Keywords: linguistic politeness, higher education, Chinese language, additional language

Procedia PDF Downloads 104
7403 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning

Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim

Abstract:

The development of information and communication technology also affects human cognition and thinking; in the field of design in particular, new techniques are being tried. In architecture, new design methodologies such as machine learning and data-driven design are being applied. These methodologies are used, for instance, in analyzing the factors related to the value of real estate or in feasibility analysis at the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors such as location and traffic conditions, rather than by the interior elements of the buildings, data is rarely used in the design process, and even where the technical conditions are provided, it is difficult to apply data-driven design to the internal elements of an apartment. As a result, the designers of apartment housing have been forced to rely on designer experience or modular design alternatives rather than data-driven design at the design stage, resulting in a uniform arrangement of space in apartment housing. The purpose of this study is to propose a methodology that supports designers in producing apartment unit plans with high consumer preference, by deriving through machine learning the correlation and importance of the floor-plan elements of apartments preferred by consumers, and by reflecting this information in the early design process. Data on the pre-sale competition rate and on the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review the apartment unit plan produced by the designer and to assist the designer. It is therefore possible to produce apartment unit plans with high preference, because the trained model can provide feedback on a unit plan when used in the floor-plan design of apartment housing.
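As a minimal illustration of the correlation step, the sketch below ranks floor-plan features by the strength of their Pearson correlation with the pre-sale competition rate. The feature names and all numbers are hypothetical; this is not the authors' trained model:

```python
# Sketch: rank hypothetical floor-plan features by |Pearson correlation|
# with the pre-sale competition rate.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

competition = [1.2, 3.4, 2.1, 5.0, 4.2]            # invented competition rates
features = {
    "balcony_area": [4, 9, 6, 12, 10],             # tracks competition closely
    "corridor_len": [7, 6, 8, 6, 7],               # more weakly related
}
ranked = sorted(features,
                key=lambda f: abs(pearson(features[f], competition)),
                reverse=True)
print(ranked[0])
```

In the study itself, this kind of ranking would feed back into the designer's review of candidate unit plans.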

Keywords: apartment unit plan, data-driven design, design methodology, machine learning

Procedia PDF Downloads 268
7402 Effect of Cooking Time, Seed-To-Water Ratio and Soaking Time on the Proximate Composition and Functional Properties of Tetracarpidium conophorum (Nigerian Walnut) Seeds

Authors: J. O. Idoko, C. N. Michael, T. O. Fasuan

Abstract:

This study investigated the effects of cooking time, seed-to-water ratio, and soaking time on the proximate and functional properties of African walnut seed using a Box-Behnken design with Response Surface Methodology (BBD-RSM), with a view to increasing its utilization in the food industry. African walnut seeds were sorted, washed, soaked, cooked, dehulled, sliced, dried, and milled. Proximate analysis and functional properties of the samples were evaluated using standard procedures. Data obtained were analyzed using descriptive and inferential statistics. Quadratic models were obtained to predict the proximate and functional qualities as a function of cooking time, seed-to-water ratio, and soaking time. The results showed that crude protein ranged between 11.80% and 23.50%, moisture content between 1.00% and 4.66%, ash content between 3.35% and 5.25%, crude fibre from 0.10% to 7.25%, and carbohydrate from 1.22% to 29.35%. The functional properties showed that soluble protein ranged from 16.26% to 42.96%, viscosity from 23.43 mPas to 57 mPas, emulsifying capacity from 17.14% to 39.43%, and water absorption capacity from 232% to 297%. An increase in the volume of water used during cooking resulted in the loss of water-soluble protein through leaching; the length of soaking time and the moisture content of the dried product are inversely related; ash content is inversely related to the cooking time and the amount of water used; extraction of fat is enhanced by an increase in soaking time; and increases in cooking and soaking times result in a decrease in fibre content. The results obtained indicated that African walnut could be used in several food formulations as a protein supplement and binder.
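The quadratic models referred to above take the standard second-order RSM form, y = b0 + sum(bi xi) + sum(bii xi^2) + sum(bij xi xj), in coded factor levels. The sketch below evaluates such a model with invented coefficients, not those fitted in the study:

```python
# Illustrative second-order (quadratic) RSM model evaluation.
# All coefficients below are hypothetical placeholders.

def rsm_quadratic(x, b0, lin, quad, inter):
    """Evaluate y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)."""
    y = b0
    y += sum(lin[i] * x[i] for i in range(len(x)))
    y += sum(quad[i] * x[i] ** 2 for i in range(len(x)))
    y += sum(inter[(i, j)] * x[i] * x[j] for (i, j) in inter)
    return y

# Coded factors: x1 cooking time, x2 seed-to-water ratio, x3 soaking time (-1..+1).
b0, lin, quad = 18.0, [1.2, -0.8, 0.5], [-0.6, 0.3, -0.2]
inter = {(0, 1): 0.4, (0, 2): -0.1, (1, 2): 0.2}
print(round(rsm_quadratic([1, -1, 0], b0, lin, quad, inter), 2))
```

In a BBD-RSM study, the coefficients would be fitted by least squares to the design-point responses before the model is used for prediction or optimization.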

Keywords: African walnut, functional properties, proximate analysis, response surface methodology

Procedia PDF Downloads 396
7401 Capturing Public Voices: The Role of Social Media in Heritage Management

Authors: Mahda Foroughi, Bruno de Anderade, Ana Pereira Roders

Abstract:

Social media platforms have been increasingly used by locals and tourists to express their opinions about buildings, cities, and built heritage in particular. Most recently, scholars have been using social media to conduct innovative research on built heritage and heritage management. Still, the application of artificial intelligence (AI) methods to analyze social media data for heritage management is seldom explored. This paper investigates the potential of short texts (sentences and hashtags) shared through social media as a data source, and of artificial intelligence methods for data analysis, for revealing the cultural significance (values and attributes) of built heritage. The city of Yazd, Iran, was taken as a case study, with a particular focus on windcatchers, key attributes conveying outstanding universal value, as inscribed on the UNESCO World Heritage List. This paper has three subsequent phases: 1) a state of the art on the intersection of public participation in heritage management and social media research; 2) the methodology of data collection and data analysis, coding people's voices from Instagram and Twitter into values of windcatchers over the last ten years; 3) preliminary findings on the comparison between the opinions of locals and tourists, sentiment analysis, and its association with the values and attributes of windcatchers. Results indicate that the age value is recognized as the most important value by all interest groups, while the political value is the least acknowledged. Moreover, negative sentiments (e.g., critiques) are scarcely reflected in social media. The results confirm the potential of social media for heritage management in terms of (de)coding and measuring the cultural significance of built heritage, in this case the windcatchers of Yazd. The methodology developed in this paper can be applied to other attributes in Yazd and to other case studies.
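The coding of short texts into heritage values and sentiments can be sketched with a simple keyword lexicon. The lexicon terms and the example post below are hypothetical, and this is a toy stand-in rather than the authors' AI pipeline:

```python
# Sketch: code a short social-media text into heritage values via a keyword
# lexicon and assign a coarse sentiment label. Lexicon entries are invented.

LEXICON = {
    "age": {"ancient", "historic", "old"},
    "ecological": {"cooling", "wind", "climate"},
}
POSITIVE = {"beautiful", "love", "amazing"}
NEGATIVE = {"ugly", "ruined"}

def code_post(text):
    """Return (set of matched values, sentiment label) for one post."""
    words = set(text.lower().split())
    values = {v for v, terms in LEXICON.items() if words & terms}
    sentiment = ("positive" if words & POSITIVE
                 else "negative" if words & NEGATIVE else "neutral")
    return values, sentiment

values, sentiment = code_post("Love the ancient windcatchers and their cooling wind")
print(sorted(values), sentiment)
```

A production pipeline would replace the lexicon lookup with trained classifiers, but the coding output (values plus sentiment per post) has the same shape.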

Keywords: social media, artificial intelligence, public participation, cultural significance, heritage, sentiment analysis

Procedia PDF Downloads 115
7400 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and the demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may attain higher efficiency. However, very few PI options are generally considered, because processes are typically analysed at the unit operation level, which limits the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. PI can be pursued at the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is therefore the generation of numerous phenomena-based process alternatives and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is decomposed into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena which can overcome the difficulties or drawbacks of the current process, or enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless.
For example, phase change phenomena require the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to a superstructure of process options. These process options, each defined by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical or biochemical process because of its generic nature.
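The combination and screening steps can be sketched as follows. The phenomena names are illustrative placeholders, and the feasibility rule is the one given in the text (phase change requires a co-present energy transfer phenomenon):

```python
# Sketch: enumerate phenomena combinations and screen out infeasible ones.
# Phenomena names are illustrative placeholders, not the authors' full set.

from itertools import combinations

PHENOMENA = ["mixing", "reaction", "phase_change", "energy_transfer"]

def feasible(combo):
    # Screening rule from the text: phase change needs energy transfer.
    return "phase_change" not in combo or "energy_transfer" in combo

options = [set(c)
           for r in range(1, len(PHENOMENA) + 1)
           for c in combinations(PHENOMENA, r)
           if feasible(c)]
print(len(options))  # feasible non-empty combinations
```

Each surviving combination would then be assigned to the function(s) it can execute, and the resulting per-function options combined into the superstructure.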

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 232
7399 Carbon Footprint of Educational Establishments: The Case of the University of Alicante

Authors: Maria R. Mula-Molina, Juan A. Ferriz-Papi

Abstract:

Environmental concerns are increasingly obtaining higher priority on the sustainability agenda of educational establishments. This is important not only for an institution's environmental performance in its own right as an organization, but also to present a model to its students. On the other hand, universities play an important role in research and innovative solutions for measuring, analyzing, and reducing the environmental impacts of different activities. The assessment and decision-making process during the activity of educational establishments is linked to the application of robust indicators. In this respect, the carbon footprint is a developing sustainability indicator that helps understand the direct impact on climate change. But it is not easy to implement: a large number of factors are involved, which increases its complexity, such as different simultaneous uses (research, lecturing, administration), different users (students, staff), and different levels of activity (lecturing, exam, or holiday periods). The aim of this research is to develop a simplified methodology for calculating and comparing carbon emissions per user at a university campus, considering the two main aspects of carbon accounting: building operations and transport. Different methodologies applied at other Spanish university campuses are analyzed and compared to obtain a final proposal to be developed in this type of establishment. First, the building operation calculation considers the different uses and energy sources consumed. Second, for the transport calculation, the different users and their working hours are considered separately, as well as their origins and traveling preferences. For each transport mode, a different conversion factor is used depending on the carbon emissions produced. The final result is obtained as an average of the carbon emissions produced per user. A case study is applied to the University of Alicante campus in San Vicente del Raspeig (Spain), where the carbon footprint is calculated.
While building operation consumption is known per building and month, this is not the case for transport: only one survey of users' transport habits was carried out, in 2009/2010, so no evolution of results can be shown in this case. Moreover, building operations are not split by use, as building services are not monitored separately. These results are analyzed in depth considering all factors and limitations, and they are compared with estimates from other campuses. Finally, the application of the presented methodology is also studied. The recommendations concluded from this study aim to enhance carbon emission monitoring and control; a Carbon Action Plan is a primary solution to be developed. On the other hand, the application developed at the University of Alicante campus can not only further enhance the methodology itself, but also make its adoption by other educational establishments more readily possible, with a considerable degree of flexibility to cater for their specific requirements.
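The per-user accounting described above can be sketched as building-operation emissions plus transport emissions, divided by the number of campus users. All emission factors and quantities below are invented for illustration, not the study's values:

```python
# Simplified per-user carbon footprint sketch: building operations + transport,
# averaged over users. Emission factors and inputs are hypothetical.

def footprint_per_user(electricity_kwh, gas_kwh, trips, users):
    """trips: list of (distance_km, passengers, kgCO2_per_km) per commute."""
    buildings = electricity_kwh * 0.25 + gas_kwh * 0.18   # assumed kgCO2e factors
    transport = sum(d * f / p for d, p, f in trips)       # shared trips split emissions
    return (buildings + transport) / users                # kgCO2e per user

trips = [(10, 1, 0.19),    # one single-occupant car commute
         (25, 40, 1.0)]    # one bus trip shared among 40 passengers
print(round(footprint_per_user(1000, 500, trips, 100), 3))
```

A full methodology would sum such terms per building, per month, and per transport mode before averaging.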

Keywords: building operations, built environment, carbon footprint, climate change, transport

Procedia PDF Downloads 295
7398 A Critical Reflection of Ableist Methodologies: Approaching Interviews and Go-Along Interviews

Authors: Hana Porkertová, Pavel Doboš

Abstract:

Based on a research project studying the experience of visually disabled people with urban space in the Czech Republic, this conference contribution discusses the limits of the social-science methodologies used in sociology and human geography. It draws on actor-network theory, assuming that science does not describe reality but produces it. Methodology connects theory, research questions, the ways to answer them (methods), and results. A research design utilizing ableist methodologies can produce ableist realities. Therefore, it was necessary to adjust the methods so that they could mediate blind experience to the scientific community without reproducing ableism. The researchers faced multiple challenges, ranging from questionable validity to the question of how to research an experience that differs from that of able-bodied researchers. Finding a suitable theory that could serve as an analytical tool, demonstrating space and blind experience as multiple, dynamic, and mutually constructed, was the first step; it could offer a range of potentially productive methods and research questions, as well as bring critically reflected results. Poststructural theory, mainly Deleuze-Guattarian philosophy, was chosen, and two methods were used: interviews and go-along interviews, adjusted to be able to explore blind experience. Despite thorough preparation of these methods, new difficulties kept emerging, which exposed the ableist character of scientific knowledge. From the beginning of data collection, there was an agreement to work in teams, with slightly different roles for each researcher, which was especially significant during the go-along interviews. In some cases, the anticipations of the researchers and participants differed, which led to unexpected and potentially dangerous situations. These were caused not only by the differences between scientific and lay communities but also by those between able-bodied and disabled people.
Researchers were sometimes cast in the role of assistants, and this new position, doing research together, required further negotiation, which also opened up various ethical questions.

Keywords: ableist methodology, blind experience, go-along interviews, research ethics, scientific knowledge

Procedia PDF Downloads 165
7397 CdS Quantum Dots as Fluorescent Probes for Detection of Naphthalene

Authors: Zhengyu Yan, Yan Yu, Jianqiu Chen

Abstract:

A novel sensing system has been designed for naphthalene detection based on the quenched fluorescence signal of CdS quantum dots. The fluorescence intensity of the system decreased significantly after the CdS quantum dots were added to the water pollution model, owing to a static fluorescence quenching mechanism. Herein, we demonstrate that this facile methodology offers convenient operation and low analysis cost, with recoveries of 97.43%-103.2%, and has promising application prospects.
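Static quenching of this kind is typically quantified through the Stern-Volmer relation F0/F = 1 + Ksv[Q]. The abstract does not report the authors' calibration data, so the sketch below (Python with NumPy) fits Ksv to purely illustrative numbers; the concentrations, intensities, and units are assumptions.

```python
import numpy as np

def stern_volmer_constant(conc, f0, f):
    """Least-squares slope Ksv of F0/F - 1 = Ksv * [Q] (line through the origin)."""
    x = np.asarray(conc, dtype=float)
    y = f0 / np.asarray(f, dtype=float) - 1.0
    return float(np.dot(x, y) / np.dot(x, x))

# Illustrative (synthetic) calibration: quencher concentration in uM vs. intensity (a.u.)
conc = [2.0, 4.0, 6.0, 8.0]
intensity = [909.0, 833.0, 769.0, 714.0]
ksv = stern_volmer_constant(conc, 1000.0, intensity)
print(f"Ksv ~ {ksv:.4f} per uM")
```

A larger fitted Ksv simply means stronger quenching per unit naphthalene concentration.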

Keywords: CdS quantum dots, modification, detection, naphthalene

Procedia PDF Downloads 493
7396 Pleated Surfaces: Experimentation and Examples

Authors: Maritza Granados Manjarrés

Abstract:

This paper is part of a research project that experiments with flat surfaces in order to pleat them using tessellations and flat-origami conditions. The aim of the investigation is to eventually propose not only a methodology for pleating those surfaces but also a structural system that makes them work as building skins. This stage of the investigation emphasizes experimentation with flat surfaces and different kinds of folding patterns, and shows the many examples that can be produced from this experimentation.

Keywords: flat origami, fold, space, surface

Procedia PDF Downloads 291
7395 Software Quality Promotion and Improvement through Usage of a PSP Oriented Information System

Authors: Gaoussou Doukoure Abdel Kader, Mnkandla Ernest

Abstract:

This research aims to investigate the usage of a personal software process (PSP) oriented information system in order to facilitate the promotion of software quality and its improvement in organizations. In this light, following a literature review on software quality and related concepts, the personal software process is discussed, particularly in terms of software quality. Semi-structured interviews will first be conducted with a team of software engineers to establish a baseline on their understanding of what quality entails for them. The PSP methodology will then be presented to the engineers in its most basic aspects. The research will then proceed to a practical case study where a PSP oriented information system is submitted to engineers for usage throughout their development process. Reports from the PSP information system, as well as feedback from the engineers, will be used in conjunction with the theoretical foundation to establish a PSP inspired framework for software quality promotion and improvement.

Keywords: information communication technology, personal software process, software quality, process quality, software engineering

Procedia PDF Downloads 494
7394 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
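As a rough sketch of the first two ingredients only (the paper's implementation is not shown), the fragment below builds a transaction-level graph whose edges encode temporal adjacency and draws a uniform probabilistic sample. The dictionary field names, timestamps, and the 10% sampling fraction are illustrative assumptions.

```python
import random

def temporal_tx_graph(transactions):
    """Each transaction is a node; edges link temporally consecutive transactions."""
    ordered = sorted(transactions, key=lambda t: t["timestamp"])
    nodes = [t["hash"] for t in ordered]
    edges = list(zip(nodes, nodes[1:]))
    return nodes, edges

def sample_transactions(transactions, fraction, seed=0):
    """Uniform probabilistic sample retaining a fixed fraction of the transactions."""
    rng = random.Random(seed)
    k = max(1, int(len(transactions) * fraction))
    return rng.sample(transactions, k)

# Synthetic transaction stream: one transaction every 12 s (assumed cadence)
txs = [{"hash": f"0x{i:03x}", "timestamp": 1_700_000_000 + 12 * i} for i in range(100)]
nodes, edges = temporal_tx_graph(sample_transactions(txs, 0.1))
print(len(nodes), "sampled nodes,", len(edges), "temporal edges")  # 10 sampled nodes, 9 temporal edges
```

In a full pipeline, the sampled subgraph would then feed the GCN stage, with the per-partition work distributed across Spark or Hadoop workers.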

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 76
7393 Fully Eulerian Finite Element Methodology for the Numerical Modeling of the Dynamics of Heart Valves

Authors: Aymen Laadhari

Abstract:

During the last decade, an increasing number of contributions have been made in the fields of scientific computing and numerical methodologies applied to the study of hemodynamics in the heart. In contrast, the numerical aspects concerning the interaction of pulsatile blood flow with highly deformable thin leaflets have been much less explored. This coupled problem remains extremely challenging, and numerical difficulties include, e.g., the resolution of the full fluid-structure interaction problem with large deformations of extremely thin leaflets, substantial mesh deformations, high transvalvular pressure discontinuities, and contact between leaflets. Although the Lagrangian description of the structural motion and strain measures is naturally used, many numerical complexities can arise when studying large deformations of thin structures. Eulerian approaches represent a promising alternative to readily model large deformations and handle contact issues. We present a fully Eulerian finite element methodology tailored for the simulation of pulsatile blood flow in the aorta and sinus of Valsalva interacting with highly deformable thin leaflets. Our method enables the use of a fluid solver on a fixed mesh, whilst being able to easily model the mechanical properties of the valve. We introduce a semi-implicit time integration scheme based on a consistent Newton-Raphson linearization. A variant of the classical Newton method is introduced and guarantees third-order convergence. High-fidelity computational geometries are built and simulations are performed under physiological conditions. We address in detail the main features of the proposed method, and we report several experiments with the aim of illustrating its accuracy and efficiency.
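The abstract does not spell out the specific third-order Newton variant used; as a stand-in, Halley's method is the classical example of a Newton-type iteration with third-order (cubic) convergence, sketched here on a scalar root-finding problem.

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's method: a Newton-type iteration with cubic convergence near a simple root."""
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        dx = 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
        x -= dx
        if abs(dx) < tol:
            return x
    return x

# Example: root of f(x) = x**3 - 2, i.e. the cube root of 2
root = halley(lambda x: x**3 - 2, lambda x: 3 * x**2, lambda x: 6 * x, x0=1.0)
print(root)  # ~ 1.259921
```

Each Halley step roughly triples the number of correct digits near the root, versus doubling for plain Newton-Raphson.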

Keywords: eulerian, level set, newton, valve

Procedia PDF Downloads 278
7392 Enriched Education: The Classroom as a Learning Network through Video Game Narrative Development

Authors: Wayne DeFehr

Abstract:

This study is rooted in a pedagogical approach that emphasizes student engagement as fundamental to meaningful learning in the classroom. This approach creates a paradigmatic shift, from a teaching practice that reinforces the teacher’s central authority to a practice that disperses that authority among the students in the classroom through networks that they themselves develop. The methodology of this study about creating optimal conditions for learning in the classroom includes providing a conceptual framework within which the students work, as well as providing clearly stated expectations for work standards, content quality, group methodology, and learning outcomes. These learning conditions are nurtured in a variety of ways. First, nearly every class includes a lecture from the professor with key concepts that students need in order to complete their work successfully. Secondly, students build on this scholarly material by forming their own networks, where students face each other and engage with each other in order to collaborate their way to solving a particular problem relating to the course content. Thirdly, students are given short, medium, and long-term goals. Short term goals relate to the week’s topic and involve workshopping particular issues relating to that stage of the course. The medium-term goals involve students submitting term assignments that are evaluated according to a well-defined rubric. And finally, long-term goals are achieved by creating a capstone project, which is celebrated and shared with classmates and interested friends on the final day of the course. The essential conclusions of the study are drawn from courses that focus on video game narrative. Enthusiastic student engagement is created not only with the dynamic energy and expertise of the instructor, but also with the inter-dependence of the students on each other to build knowledge, acquire skills, and achieve successful results.

Keywords: collaboration, education, learning networks, video games

Procedia PDF Downloads 115
7391 Valorisation of Mango Seed: Response Surface Methodology Based Optimization of Starch Extraction from Mango Seeds

Authors: Tamrat Tesfaye, Bruce Sithole

Abstract:

Box-Behnken response surface methodology was used to determine the optimum processing conditions that give maximum extraction yield and whiteness index from mango seed. The steeping time ranged from 2 to 12 hours, and slurrying of the steeped seed in sodium metabisulphite solution (0.1 to 0.5 w/v) was carried out. Experiments were designed according to a Box-Behnken design with these three factors, and a total of 15 experimental runs were analyzed. At the linear level, the concentration of sodium metabisulphite had a significant positive influence on percentage yield and whiteness index at p < 0.05. At the quadratic level, sodium metabisulphite concentration and its squared term had a significant negative influence on starch yield; sodium metabisulphite concentration and the steeping time*temperature interaction had a significant (p < 0.05) positive influence on whiteness index. The adjusted R² above 0.8 for starch yield (0.906465) and whiteness index (0.909268) showed a good fit of the model to the experimental data. The optimum sodium metabisulphite concentration, steeping time, and temperature for starch isolation with maximum starch yield (66.428%) and whiteness index (85%) as set goals for optimization, with a desirability of 0.91939, were 0.255 w/v, 2 h, and 50 °C, respectively. The determined experimental value of each response based on the optimal condition was statistically in accordance with predicted levels at p < 0.05. Mango seeds are by-products obtained during mango processing and pose a disposal problem if not handled properly. The substitution of food-based sizing agents with mango seed starch can contribute to pertinent resource deployment for value-added product manufacturing and waste utilization, which might play a significant role in food security in Ethiopia.
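For readers unfamiliar with response surface fitting, the sketch below fits the full quadratic model underlying such an analysis by least squares. It uses two coded factors and synthetic responses for brevity (the study itself used three factors and 15 Box-Behnken runs); all numbers are illustrative assumptions.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def fit_quadratic_response(X, y):
    """Least-squares coefficients of the full quadratic response surface."""
    coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    return coef

# Synthetic example on coded levels (-1, 0, +1); 'true' plays the role of the fitted model
X = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
true = np.array([60.0, 5.0, -3.0, 1.5, -4.0, -2.0])  # b0, b1, b2, b12, b11, b22
y = quadratic_design_matrix(X) @ true
print(np.round(fit_quadratic_response(X, y), 3))
```

With noiseless synthetic data the least-squares fit recovers the generating coefficients exactly, which is a convenient sanity check before fitting real runs.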

Keywords: mango, synthetic sizing agent, starch, extraction, textile, sizing

Procedia PDF Downloads 231
7390 Analytical Solutions for Tunnel Collapse Mechanisms in Circular Cross-Section Tunnels under Seepage and Seismic Forces

Authors: Zhenyu Yang, Qiunan Chen, Xiaocheng Huang

Abstract:

Reliable prediction of tunnel collapse remains a prominent challenge in the field of civil engineering. In this study, leveraging the nonlinear Hoek-Brown failure criterion and the upper-bound theorem, an analytical solution for the collapse surface of shallowly buried circular tunnels was derived, taking into account the coupled effects of surface loads and pore water pressures. Initially, surface loads and pore water pressures were introduced as external force factors, and equating the internal energy dissipation rate to the rate of work of the external forces yielded our objective function. Subsequently, the variational method was employed for optimization, and the outcomes were compared with previous research findings. Furthermore, we utilized the derived equation set to systematically analyze the influence of various rock mass parameters on collapse shape and extent. The comparison with prior studies corroborated the efficacy of the proposed methodology, offering invaluable insights for collapse risk assessment in practical engineering applications.
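For reference, the generalized Hoek-Brown criterion that underpins the derivation can be evaluated directly from its closed form; the rock-mass parameters below are illustrative assumptions, not values from the paper.

```python
def hoek_brown_sigma1(sigma3, sigma_ci, mb, s, a):
    """Major principal stress at failure: sigma1 = sigma3 + sigma_ci*(mb*sigma3/sigma_ci + s)**a."""
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Assumed parameters (MPa): confining stress 1.0, intact strength 30, mb=2, s=0.01, a=0.5
print(round(hoek_brown_sigma1(1.0, 30.0, 2.0, 0.01, 0.5), 2))  # 9.31
```

In a limit-analysis setting, this failure envelope supplies the dissipation term that is balanced against the external work rate when optimizing the collapse surface.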

Keywords: tunnel roof stability, analytical solution, hoek–brown failure criterion, limit analysis

Procedia PDF Downloads 84
7389 Fabrication Methodologies for Anti-Microbial Polypropylene Surfaces with Leachable and Non-leachable Anti-Microbial Agents

Authors: Saleh Alkarri, Dimple Sharma, Teresa M. Bergholz, Muhammad Rabnawaz

Abstract:

Aims: Develop a methodology for the fabrication of anti-microbial polypropylene (PP) surfaces with (i) leachable copper (II) chloride dihydrate (CuCl₂·2H₂O) and (ii) non-leachable magnesium hydroxide (Mg(OH)₂) biocides. Methods and Results: Two methodologies were used to develop anti-microbial PP surfaces. One method involves melt-blending and subsequent injection molding, where the biocide additives were compounded with PP and then injection-molded. The other method involves thermal embossing of anti-microbial agents on the surface of a PP substrate. The obtained biocide-bearing PP surfaces were evaluated against E. coli K-12 MG1655 for 0, 4, and 24 h to assess their anti-microbial properties. The injection-molded PP bearing 5% CuCl₂·2H₂O showed a 6-log reduction of E. coli K-12 MG1655 after 24 h, while only a 1-log reduction was observed for PP bearing 5% Mg(OH)₂. The thermally embossed PP surfaces bearing CuCl₂·2H₂O and Mg(OH)₂ particles (at a concentration of 10 mg/mL) showed 3-log and 4-log reductions, respectively, against E. coli K-12 MG1655 after 24 h. Conclusion: The results clearly demonstrate that CuCl₂·2H₂O conferred anti-microbial properties to PP surfaces prepared by both injection molding and thermal embossing, owing to the presence of leachable copper ions. In contrast, the non-leachable Mg(OH)₂ imparted anti-microbial properties only to the surface prepared via the thermal embossing technique. Significance and Impact of the Study: Plastics with leachable biocides are effective anti-microbial surfaces, but their toxicity is a major concern. This study provides a fabrication methodology for non-leachable PP-based anti-microbial surfaces that are potentially safer. In addition, this strategy can be extended to many other plastic substrates.
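The log reductions quoted above follow the standard definition log10(N0/N). A minimal helper, with assumed CFU counts purely for illustration:

```python
import math

def log_reduction(n0, n):
    """Log10 reduction in viable count: a 6-log reduction means n0/n = 10**6."""
    return math.log10(n0 / n)

# Assumed counts: 1e8 CFU before treatment, 1e2 CFU surviving after 24 h
print(log_reduction(1e8, 1e2))  # 6.0
```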

Keywords: anti-microbial activity, E. coli K-12 MG1655, copper (II) chloride dihydrate, magnesium hydroxide, leachable, non-leachable, compounding, thermal embossing

Procedia PDF Downloads 78
7387 Application and Evaluation of Teaching-Learning Guides Based on Swebok for the Requirements Engineering Area

Authors: Mauro Callejas-Cuervo, Andrea Catherine Alarcon-Aldana, Lorena Paola Castillo-Guerra

Abstract:

The software industry requires highly trained professionals capable of performing the roles integrated in the software development cycle. A large part of that training is the responsibility of higher education institutions, often through a curriculum established to orient the academic development of the students. Nowadays there are different models that support proposals for the improvement of curricula in the area of software engineering, such as ACM, IEEE, ABET, and Swebok, of which the last stands out, given that it manages and organises the knowledge of software engineering and offers a vision of theoretical and practical aspects. Moreover, it has been applied by different universities in the pursuit of achieving coverage of the different topics and increasing the professional quality of future graduates. This research presents the structure of teaching and learning guides built from training objectives and methodological strategies embedded in the levels of learning of Bloom's taxonomy, with which it is intended to improve the delivery of the topics in the area of requirements engineering. Said guides were implemented and validated in a course on requirements engineering of the Systems and Computer Engineering programme at the Universidad Pedagógica y Tecnológica de Colombia (Pedagogical and Technological University of Colombia) using a four-stage methodology: definition of the evaluation model, implementation of the guides, guide evaluation, and analysis of the results. After the collection and analysis of the data, the results show that in six out of the seven topics proposed in the Swebok guide, the percentage of students who obtained total marks within the 'High grade' level, that is, between 4.0 and 4.6 (on a scale of 0.0 to 5.0), was higher than the percentage of students who obtained marks within the 'Acceptable' range of 3.0 to 3.9.
In 86% of the topics and strategies proposed, the teaching and learning guides facilitated the students' comprehension, analysis, and articulation of concepts and processes. In addition, the results mainly indicate that the guides strengthened the argumentative and interpretative competencies, while the remaining 14% denotes the need to reinforce the strategies regarding the propositive competence, given that it presented the lowest average.

Keywords: pedagogic guide, pedagogic strategies, requirements engineering, Swebok, teaching-learning process

Procedia PDF Downloads 286
7386 Developing Research Involving Different Species: Opportunities and Empirical Foundations

Authors: A. V. Varfolomeeva, N. S. Tkachenko, A. G. Tishchenko

Abstract:

The problem of violations of internal validity in studies of psychological structures is considered. The role of researchers' epistemological attitudes in the planning of research within the methodology of the system-evolutionary approach is assessed. Alternative programs of psychological research involving representatives of different biological species are presented. Using the results of two research series as examples, possible solutions to the problem are discussed.

Keywords: epistemological attitudes, experimental design, validity, psychological structure, learning

Procedia PDF Downloads 115
7385 Changes in Textural Properties of Zucchini Slices Under Effects of Partial Predrying and Deep-Fat-Frying

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

Changes in the textural properties of any food material during processing are significant for consumers' evaluation and directly affect their decisions. Thus, any food material should be assessed in terms of textural properties after any process. In the present study, zucchini slices were partially predried to control and reduce the product's final oil content. A conventional oven was used for partial dehydration of the zucchini slices. Subsequent frying was carried out in an industrial fryer equipped with a temperature controller. This study focused on the effect of this predrying process on the textural properties of fried zucchini slices. Texture profile analysis was performed. Hardness, elasticity, chewiness, and cohesiveness were the studied texture parameters of fried zucchini slices. Temperature and weight loss were the monitored parameters of the predrying process, whereas, in frying, oil temperature and process time were controlled. Optimization of the two successive processes was done by response surface methodology, one of the commonly used statistical process optimization tools. The models developed for each texture parameter displayed high success in predicting their values as a function of the studied process conditions. Process optimization was performed according to target values for each property, determined for directly fried zucchini slices that took the highest score in sensory evaluation. Results indicated that the textural properties of predried and then fried zucchini slices could be controlled by well-established equations. This is thought to be significant for the fried-food industry, where controlling sensory properties is crucial to guiding consumer perception, and textural properties are chief among them. This project (113R015) has been supported by TUBITAK.
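Multi-response optimization of this kind typically scores each response with a desirability function and combines the scores by geometric mean; a sketch with hypothetical target windows (not the study's actual values):

```python
def desirability_target(y, low, target, high):
    """Target-is-best desirability: 1 at the target, falling linearly to 0 at the bounds."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def composite_desirability(ds):
    """Geometric mean of individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical texture responses: hardness and chewiness vs. assumed target windows
d1 = desirability_target(12.0, low=8.0, target=11.0, high=15.0)  # hardness
d2 = desirability_target(5.5, low=4.0, target=6.0, high=8.0)     # chewiness
print(round(composite_desirability([d1, d2]), 3))  # 0.75
```

The optimizer then searches the process conditions (predrying time, oil temperature, frying time) for the highest composite desirability.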

Keywords: optimization, response surface methodology, texture profile analysis, conventional oven, modelling

Procedia PDF Downloads 433
7384 Unknown Groundwater Pollution Source Characterization in Contaminated Mine Sites Using Optimal Monitoring Network Design

Authors: H. K. Esfahani, B. Datta

Abstract:

Groundwater is one of the most important natural resources in many parts of the world; however, it is widely polluted due to human activities. Currently, effective and reliable groundwater management and remediation strategies are obtained using characterization of groundwater pollution sources, where the data measured at monitoring locations are utilized to estimate the unknown pollutant source location and magnitude. However, accurately identifying the characteristics of contaminant sources is a challenging task due to uncertainties in terms of predicting source flux injection, hydro-geological and geo-chemical parameters, and the concentration field measurement. Reactive transport of chemical species in contaminated groundwater systems, especially with multiple species, is a complex and highly non-linear geochemical process. Although sufficient concentration measurement data are essential to accurately identify source characteristics, available data are often sparse and limited in quantity. Therefore, this inverse problem of characterizing unknown groundwater pollution sources is often considered ill-posed, complex, and non-unique. Different methods have been utilized to identify pollution sources; however, the linked simulation-optimization approach is one effective method to obtain acceptable results under uncertainties in complex real-life scenarios. With this approach, the numerical flow and contaminant transport simulation models are externally linked to an optimization algorithm, with the objective of minimizing the difference between the measured concentrations and the estimated pollutant concentrations at observation locations. Concentration measurement data are very important for accurately estimating pollution source properties; therefore, optimal design of the monitoring network is essential to gather adequate measured data at desired times and locations.
Due to budget and physical restrictions, an efficient and effective approach for groundwater pollutant source characterization is to design an optimal monitoring network, especially when only inadequate and arbitrary concentration measurement data are initially available. In this approach, preliminary concentration observation data are utilized for preliminary identification of source location, magnitude, and duration of source activity, and these results are utilized for monitoring network design. Further, feedback information from the monitoring network is used as input for sequential monitoring network design, to improve the identification of unknown source characteristics. To design an effective monitoring network of observation wells, optimization and interpolation techniques are used. A simulation model should be utilized to accurately describe the aquifer properties in terms of hydro-geochemical parameters and boundary conditions. However, the simulation of the transport processes becomes complex when the pollutants are chemically reactive. A three-dimensional transient flow and reactive contaminant transport process is considered. The proposed methodology uses HYDROGEOCHEM 5.0 (HGCH) as the simulation model for flow and transport processes with multiple chemically reactive species. Adaptive Simulated Annealing (ASA) is used as the optimization algorithm in the linked simulation-optimization methodology to identify the unknown source characteristics. Therefore, the aim of the present study is to develop a methodology to optimally design an effective monitoring network for pollution source characterization with reactive species in polluted aquifers. The performance of the developed methodology will be evaluated for an illustrative polluted aquifer site, for example, an abandoned mine site in Queensland, Australia.
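The linked simulation-optimization loop can be sketched with a toy stand-in: a plain simulated-annealing minimizer (greatly simplified from ASA) driving a one-parameter linear "forward model" toward synthetic observations. The forward model, weights, and observed values are illustrative assumptions, not output from HYDROGEOCHEM.

```python
import math
import random

def simulated_annealing(misfit, x0, step, t0=1.0, cooling=0.95, iters=2000, seed=1):
    """Minimal simulated-annealing loop minimizing a misfit function."""
    rng = random.Random(seed)
    x, fx = list(x0), misfit(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = misfit(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp((fx - fc) / max(t, 1e-12)):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
        t *= cooling
    return best, fbest

# Toy inverse problem: recover source magnitude 3.2 from noiseless 'observations'
forward = lambda m: [m[0] * w for w in (0.5, 1.0, 1.5)]  # stand-in for the transport model
obs = [1.6, 3.2, 4.8]
misfit = lambda m: sum((s - o) ** 2 for s, o in zip(forward(m), obs))
m_hat, err = simulated_annealing(misfit, [0.0], step=0.5)
print(round(m_hat[0], 2), round(err, 6))
```

In the real methodology, each misfit evaluation would invoke the flow and reactive-transport simulator, which is why reducing the number of required observations through monitoring network design matters so much.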

Keywords: monitoring network design, source characterization, chemical reactive transport process, contaminated mine site

Procedia PDF Downloads 231
7383 Study on Seismic Response Feature of Multi-Span Bridges Crossing Fault

Authors: Yingxin Hui

Abstract:

Understanding the seismic response features of bridges crossing faults is the basis of seismic fortification. Taking a multi-span bridge crossing an active fault under construction as an example, the seismic ground motions at the bridge site were generated following a hybrid simulation methodology. Multi-support excitation displacement input models and nonlinear time-history analysis were used to calculate the seismic response of the structures, and the results were compared with those of a bridge in the near-fault region. The results showed that the seismic response features of bridges crossing faults differ from those of bridges in the near-fault region. Designing according to near-fault-region practice while ignoring the fault-crossing effect would yield unsafe and unreasonable results. The design of seismic fortification should be based on the actual seismic response features, which could reduce the adverse effects caused by structural damage.
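One common simplification of multi-support excitation is the wave-passage effect: the same ground-motion record arrives at each support with delay d/v. The sketch below time-lags a toy accelerogram per support; the time step, support spacings, and apparent wave speed are assumed values, and a true fault-crossing input would additionally carry the permanent fling-step offset across the fault, which is not modeled here.

```python
import numpy as np

def wave_passage_inputs(accel, dt, distances, velocity):
    """Delay a single ground-motion record at each support by its travel time d/v."""
    n = len(accel)
    inputs = []
    for d in distances:
        lag = int(round(d / (velocity * dt)))
        padded = np.concatenate([np.zeros(lag), accel])[:n]
        inputs.append(padded)
    return np.array(inputs)

# Assumed: 0.01 s time step, supports at 0, 50, 100 m, apparent wave speed 500 m/s
t = np.arange(0, 1, 0.01)
accel = np.sin(2 * np.pi * 2 * t)  # toy 2 Hz accelerogram
motions = wave_passage_inputs(accel, 0.01, [0.0, 50.0, 100.0], 500.0)
print(motions.shape)  # (3, 100)
```

Each row then serves as the displacement or acceleration input at one support in a multi-support time-history analysis.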

Keywords: bridge engineering, seismic response feature, across faults, rupture directivity effect, fling step

Procedia PDF Downloads 433
7382 Cost Overrun in Construction Projects

Authors: Hailu Kebede Bekele

Abstract:

Construction delays arise when project events fail to occur at the expected time due to causes related to the client, consultant, and contractor. Delay is the major cause of cost overrun, which leads to poor project efficiency. The difference between the cost at completion and the originally estimated cost is known as cost overrun. Cost overruns are not simple issues that can be neglected; more attention should be given to preventing organizations from failing and financial expenses from escalating. Reasons raised in different studies show that the problem may arise in construction projects due to errors in budgeting, unfavorable weather conditions, inefficient machinery, and extravagance. The study focuses on mega projects, whose pace can significantly change the cost overrun calculation. Fifteen mega projects were identified to study the problem of cost overrun on site. The contractor, consultant, and client are the principal stakeholders in the mega projects. Twenty people from each sector were selected to participate in the investigation of the current mega construction projects. The main objective of the study on construction cost overrun is to prioritize the major causes of the cost overrun problem. The methodology employed is qualitative, mainly rating the causes of construction project cost overrun. Interviews, open-ended and closed-ended questionnaires, group discussions, and qualitative rating methods are well suited to studying construction project overruns.
The results show that design mistakes, labor shortages, payment delays, old equipment, poor scheduling, weather conditions, lack of skilled labor, transportation, inflation, order variations, market price fluctuations, and people's attitudes and philosophies are the principal causes of cost overrun that degrade project performance. Organizations should adhere to scheduled activities to keep the project moving forward throughout its life.

Keywords: cost overrun, delay, mega projects, design

Procedia PDF Downloads 62