Search results for: static analysis tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30966

29016 Modeling Football Penalty Shootouts: How Improving Individual Performance Affects Team Performance and the Fairness of the ABAB Sequence

Authors: Pablo Enrique Sartor Del Giudice

Abstract:

Penalty shootouts often decide the outcome of important soccer matches. Although usually referred to as "lotteries", there is evidence that some national teams and clubs consistently perform better than others. The outcomes are therefore not explained by mere luck alone, and there are ways to improve the average performance of players, naturally at the expense of some sort of effort. In this article we study the payoff of player performance improvements in terms of the performance of the team as a whole. To do so, we develop an analytical model with static individual performances, as well as Monte Carlo models that take into account the known influence of partial score and round number on individual performances. We find that, within a range of usual values, team performance improves over 70% faster than individual performances do. Using these models, we also estimate that the new ABBA penalty shootout ordering under test removes almost all of the known bias in favor of the first-shooting team under the current ABAB system.
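
A score-dependent Monte Carlo model of this kind can be sketched in a few lines. The baseline scoring probability and the size of the "trailing" pressure effect below are illustrative assumptions, not the paper's fitted values; under a purely static model the ABAB order carries no bias at all, so the pressure term is what makes the comparison meaningful:

```python
import random

ABAB = [0, 1] * 5                  # team 0 kicks first in every round
ABBA = [0, 1, 1, 0] * 2 + [0, 1]   # kicking order alternates between rounds

def first_shooter_win_rate(sequence, p=0.75, pressure=0.10,
                           n=100_000, seed=1):
    """Fraction of simulated shootouts won by team 0 (the first shooter).
    p: baseline scoring probability (assumed); pressure: drop in scoring
    probability when the kicker's team is trailing (assumed).
    All 10 regulation kicks are simulated even after the shootout is
    clinched; a clinched outcome cannot be overturned, so the winner
    distribution is unaffected."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        goals = [0, 0]
        for kicker in sequence:
            prob = p - (pressure if goals[kicker] < goals[1 - kicker] else 0)
            goals[kicker] += rng.random() < prob
        while goals[0] == goals[1]:          # sudden death, pressure-free
            goals[0] += rng.random() < p
            goals[1] += rng.random() < p
        wins += goals[0] > goals[1]
    return wins / n
```

With these assumptions, the ABAB sequence gives the first shooter a measurable edge, while ABBA pulls the first shooter's win rate back toward one half.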

Keywords: football, penalty shootouts, Monte Carlo simulation, ABBA

Procedia PDF Downloads 162
29015 The Consumer Responses toward the Offensive Product Advertising

Authors: Chin Tangtarntana

Abstract:

The main purpose of this study was to investigate the effects of animation in offensive product advertising. An experiment was conducted to collect consumer responses toward animated and static ads of offensive and non-offensive products, by distributing questionnaires to the target respondents. According to statistics from the Innovative Internet Research Center, Thailand, the majority of internet users are 18–44 years old. The results revealed an interaction between ad design and product offensiveness. Specifically, when used in offensive product advertisements, animated ads were not effective in capturing consumer attention, but yielded a positive response in terms of attitude toward the product. The findings support that the information processing model is accurate in predicting consumer cognitive response toward cartoon ads, whereas uses and gratifications (U&G), arousal, and distinctiveness theories are more accurate in predicting consumer affective response. In practice, these findings can also guide ad designers and marketers toward designs suitable for offensive products.

Keywords: animation, banner ad design, consumer responses, offensive product advertising, stock exchange of Thailand

Procedia PDF Downloads 269
29014 Investigation of Distortion and Impact Strength of 304 L Butt Joint Using Different Weld Groove

Authors: A. Sharma, S. S. Sandhu, A. Shahi, A. Kumar

Abstract:

In this study, the effects of the geometric configurations of butt joints, i.e., double V groove, double U groove, and UV groove, in 12 mm thick AISI 304L welded by Gas Tungsten Arc Welding (GTAW) are investigated. The main objective of the study is the magnitude of transverse shrinkage stress and distortion generated during welding under unrestrained conditions of the butt joints. The effect of groove design on impact strength and metallurgical properties is also studied. Finite element analysis of the groove designs is carried out and compared with the actual experiments. The experimental and FEM results reveal a very good correlation between distortion and weld groove design for the multipass joints, with an agreement of about 80%. The VV (double V) groove design was found to have the lowest transverse stress and cumulative deflection, the UV groove design the maximum ultimate and yield tensile strengths, and the VV groove the highest impact strength. Vickers hardness was measured for all the groove designs. Microstructural studies were carried out using conventional microscopic tools, which revealed a lot of useful information for correlating the microstructure with mechanical properties.

Keywords: weld groove design, distortion, AISI 304 L, butt joint, FEM, GTAW

Procedia PDF Downloads 366
29013 Theory and Practice of Wavelets in Signal Processing

Authors: Jalal Karam

Abstract:

The methods of the Fourier, Laplace, and Wavelet Transforms provide transfer functions and relationships between the input and the output signals of linear time-invariant systems. This paper shows the equivalence among these three methods, in each case presenting an application of the appropriate transform (Fourier, Laplace, or Wavelet) to the convolution theorem, and shows that the same holds for a direct integration method. The biorthogonal wavelets Bior3.5 and Bior3.9 are examined, and the zero distributions of the polynomials of their associated filters are located. The paper also presents the significance of utilizing wavelets as effective tools in processing speech signals for common multimedia applications in general, and for recognition and compression in particular; theoretically and practically, wavelets have proved to be effective and competitive. The practical use of the Continuous Wavelet Transform (CWT) in the processing and analysis of speech is then presented, along with an explanation of how the human ear can be thought of as a natural wavelet transformer of speech. This observation motivates a variety of approaches for applying the CWT to many paradigms for analysing speech, sound, and music. For perception, the flexibility of implementation of this transform allows the construction of numerous scales, and we include two of them. Results for speech recognition and speech compression are then included.
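
As a minimal, self-contained illustration of the CWT (a naive direct-convolution sketch, not the implementation used in the paper), a Morlet-based transform can be written in a few lines; the centre-frequency parameter `w0` is an assumption:

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Complex Morlet wavelet sampled at times t, dilated by `scale`
    (1/sqrt(scale) keeps the energy comparable across scales)."""
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-x**2 / 2) / np.sqrt(scale)

def cwt(signal, scales, dt=1.0):
    """Naive continuous wavelet transform: correlate the signal with a
    conjugated, time-reversed wavelet at each scale."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        kernel = np.conj(morlet(t, s))[::-1]
        out[i] = np.convolve(signal, kernel, mode="same") * dt
    return out
```

For a pure tone, the scale with the largest mean magnitude matches the tone's frequency (the Morlet centre frequency in Hz is roughly w0 / (2·pi·scale)), which is the mechanism behind the scale-based analysis of speech described above.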

Keywords: continuous wavelet transform, biorthogonal wavelets, speech perception, recognition and compression

Procedia PDF Downloads 416
29012 Size Effects on Structural Performance of Concrete Gravity Dams

Authors: Mehmet Akköse

Abstract:

Concern about the seismic safety of concrete dams has been growing around the world, partly because the population at risk downstream of major dams continues to expand, and partly because it is increasingly evident that the seismic design concepts in use when most existing dams were built were inadequate. Most investigations in the past have been conducted on large dams, typically above 100 m high, yet a large number of concrete dams in our country and in other parts of the world are less than 50 m high. Most of these dams were designed using pseudo-static methods, ignoring the dynamic characteristics of the structure as well as the characteristics of the ground motion. It is therefore important to investigate the seismic behavior of this category of dam in order to assess and evaluate the safety of existing dams and to improve the knowledge base for dams of different heights to be constructed in the future. In this study, size effects on the structural performance of concrete gravity dams subjected to near- and far-fault ground motions are investigated, including dam-water-foundation interaction. For this purpose, a benchmark problem proposed by ICOLD (International Commission on Large Dams) is chosen as a numerical application. The structural performance of the dam at five different heights is evaluated according to the damage criteria of USACE (U.S. Army Corps of Engineers), and on that basis it is decided whether non-linear analysis of the dams is required. The linear elastic dynamic analyses of the dams under near- and far-fault ground motions are performed using the step-by-step integration technique with an integration time step of 0.0025 s. The Rayleigh damping constants are calculated assuming a 5% damping ratio. The program NONSAP, modified for fluid-structure systems with the Lagrangian fluid finite element, is employed in the response calculations.
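
The abstract fixes the Rayleigh damping at a 5% ratio; the standard recipe anchors that ratio at two modal frequencies, giving the mass-proportional constant alpha and the stiffness-proportional constant beta from zeta(w) = alpha/(2w) + beta*w/2. The anchor frequencies below are illustrative, not taken from the benchmark:

```python
import math

def rayleigh_coefficients(f1, f2, zeta=0.05):
    """Rayleigh damping constants alpha (mass-proportional) and beta
    (stiffness-proportional) that produce damping ratio `zeta` exactly
    at the two anchor frequencies f1 and f2 (in Hz)."""
    w1, w2 = 2 * math.pi * f1, 2 * math.pi * f2
    alpha = 2 * zeta * w1 * w2 / (w1 + w2)
    beta = 2 * zeta / (w1 + w2)
    return alpha, beta

def damping_ratio(f, alpha, beta):
    """Effective damping ratio at frequency f for C = alpha*M + beta*K."""
    w = 2 * math.pi * f
    return alpha / (2 * w) + beta * w / 2
```

Between the two anchors the effective ratio dips slightly below the target, and outside them it grows, which is why the anchors are usually chosen to bracket the modes that dominate the response.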

Keywords: concrete gravity dams, Lagrangian approach, near and far-fault ground motion, USACE damage criteria

Procedia PDF Downloads 267
29011 Improving Contributions to the Strengthening of the Legislation Regarding Road Infrastructure Safety Management in Romania, Case Study: Comparison Between the Initial Regulations and the Clarity of the Current Regulations - Trends Regarding the Efficiency

Authors: Corneliu-Ioan Dimitriu, Gheorghe Frățilă

Abstract:

Romania and Bulgaria have high rates of road deaths per million inhabitants. Directive (EU) 2019/1936, known as the RISM Directive, has been transposed into national law by each Member State. The research focuses on the amendments made to Romanian legislation through Government Ordinance no. 3/2022, which aims to improve road infrastructure safety management. The aim of the research is two-fold: to sensitize the Romanian Government and decision-making entities to develop an integrated and competitive management system, and to establish a safe and proactive mobility system that ensures efficient and safe roads. The research includes a critical analysis of European and Romanian legislation and of subsequent normative acts related to road infrastructure safety management. Public data from European Union and national authorities are used, together with data from the Romanian Road Authority (ARR) and the Traffic Police database. The methodology involves comparative analysis of the transpositions, criterion analysis, SWOT analysis, and GANTT and WBS diagrams; Excel is employed to process the road accident databases of Romania and Bulgaria. Collaboration with Bulgarian specialists was established to identify common road infrastructure safety issues. The research concludes that the legislative changes have resulted in a relaxation of road safety management in Romania, leading to decreased control over certain management procedures, and that the amendments to primary and secondary legislation do not meet the current safety requirements for road infrastructure. It therefore highlights the need for immediate action, legislative amendments, and strengthened administrative capacity, and recommends regional cooperation and the exchange of best practices for effective road infrastructure safety management. The research contributes to the theoretical understanding of road infrastructure safety management by analyzing legislative changes and their impact on safety measures, underlines the importance of an integrated and proactive approach in reducing road accidents and pursuing the "zero deaths" objective set by the European Union, and provides valuable insights for policymakers and decision-makers in Romania.

Keywords: management, road infrastructure safety, legislation, amendments, collaboration

Procedia PDF Downloads 84
29010 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline Maria Ribeiro Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and few data, it becomes difficult and prone to errors when big databases of images must be treated. Moreover, the patterns may differ across the image area, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after a period in which users manually mark the parts of borehole images that correspond to tension regions and breakout areas, to indicate and suggest new candidate regions automatically, with higher accuracy. We suggest the use of different classifier methods, in order to achieve different knowledge data set configurations.
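
The incremental labelling-and-suggestion loop described here can be sketched with a simple nearest-neighbour classifier; the two-dimensional feature vectors below are hypothetical placeholders for whatever descriptors the segmentation actually produces:

```python
import numpy as np

class KnowledgeBase:
    """Grows as analysts label regions; suggests labels for new regions
    by majority vote among the k nearest labelled examples."""

    def __init__(self, k=3):
        self.k = k
        self.X, self.y = [], []

    def add_label(self, features, label):
        """Record one manually labelled region (e.g. 'tension', 'breakout')."""
        self.X.append(np.asarray(features, dtype=float))
        self.y.append(label)

    def suggest(self, features):
        """Suggest a label for an unlabelled region."""
        X = np.stack(self.X)
        d = np.linalg.norm(X - np.asarray(features, dtype=float), axis=1)
        nearest = np.argsort(d)[: self.k]
        labels = [self.y[i] for i in nearest]
        return max(set(labels), key=labels.count)
```

Any supervised classifier (decision trees, SVMs, ensembles) can be swapped in behind the same interface, which matches the suggestion above of trying different classifier methods against the same knowledge database.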

Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computing

Procedia PDF Downloads 303
29009 Informal Self-Governance: The Formation of an Alternative Urban Framework in a Cairo Region

Authors: Noor Abdelhamid

Abstract:

Almost half of Cairo’s growing population is housed in self-built, self-governed informal settlements serving as an alternative in the absence of government-provided public housing. These settlements emerged as the spatial expression of informal practices or activities operating outside regulated, formal frameworks. A comprehensive narrative of political events, administrative decisions, and urban policies set the stage for the growth of informal expression in Egypt. The purpose of this qualitative inquiry is to portray informal self-governance as practiced by residents in the Cairo region. This research argues that informal spatial practices offer an alternative urban framework for bottom-up development in the absence of government provisions. In the context of this study, informal self-governance is defined as the residents’ autonomous control and use of public urban space in informal settlements. The case study for this research is Ard al-Liwa, a semi-formal settlement representing the majority of informal settlement typologies in Egypt, which consist of the formal occupation of land through uncontrolled land subdivision, zoning, and construction. An inductive methodological approach is adopted to first study informal practices as singular activities and then as components of a larger environment. The collected set of empirical data consists of audiovisual material and observations obtained during regular site visits and interviews with residents native to the settlement. Methods of analysis are synthesized to identify themes in the data: the static and dynamic use of sidewalks, the urban traces of informal building allocation and construction, the de facto right to urban space, and the resultant spatial patterns. The paper concludes by positioning the research in the context of current architectural practice, questioning the role and responsibility of designers in these self-governed urban regions.

Keywords: Egypt, informal settlements, self-governance, urban framework

Procedia PDF Downloads 160
29008 Accomplishing Mathematical Tasks in Bilingual Primary Classrooms

Authors: Gabriela Steffen

Abstract:

Learning in a bilingual classroom not only implies learning in two languages or in an L2; it also means learning content subjects through bilingual or plurilingual resources, which is of a qualitatively different nature from ‘monolingual’ learning. These resources form elements of a didactics of plurilingualism, aiming not only at the development of a plurilingual competence, but also at drawing on plurilingual resources for non-linguistic subject learning. Applying a didactics of plurilingualism allows for taking account of the specificities of bilingual content subject learning in bilingual education classrooms. Bilingual education is used here as an umbrella term for different programs, such as bilingual education, immersion, CLIL, and bilingual modules, in which one or several non-linguistic subjects are taught partly or completely in an L2. This paper discusses first results of a study on pupil group work in bilingual classrooms in several Swiss primary schools. For instance, it analyses two bilingual classes in two primary schools in a French-speaking region of Switzerland that follow part of their school program in German in addition to French, the language of instruction in this region. More precisely, it analyses videotaped classroom interaction and in situ classroom practices of pupil group work in mathematics lessons. The ethnographic observation of pupils’ group work and the analysis of their interaction (using analytical tools of conversation analysis, discourse analysis and plurilingual interaction) complement the description of whole-class interaction done in the same (and several other) classes. While the latter are teacher-student interactions, the former are student-student interactions, giving more space to, and insight into, pupils’ talk. This study aims at describing the linguistic and multimodal resources (in German L2 and/or French L1) that pupils mobilize while carrying out a mathematical task.
The analysis shows that the accomplishment of the mathematical task takes place in a bilingual mode, whether the whole-class interactions are conducted in a rather bilingual (German L2-French L1) or a monolingual mode in L2 (German). The pupils make plenty of use of German L2 in a setting that lends itself to the use of French L1 (peer groups with French as the dominant language, in the absence of the teacher, and a task with a mathematical aim). They switch from French to German and back ‘naturally’, as is regular for bilingual speakers. Their linguistic resources in German L2 are not sufficient to allow them to (inter)act well enough to accomplish the task entirely in German L2, despite their efforts to do so. However, this does not stop them from carrying out the mathematical task adequately, which is the main objective, by drawing on the bilingual resources at hand.

Keywords: bilingual content subject learning, bilingual primary education, bilingual pupil group work, bilingual teaching/learning resources, didactics of plurilingualism

Procedia PDF Downloads 162
29007 Community Radio Broadcasting in Phutthamonthon District, Nakhon Pathom, Thailand

Authors: Anchana Sooksomchitra

Abstract:

This study aims to explore and compare the current condition of community radio stations in Phutthamonthon district, Nakhon Pathom province, Thailand, as well as the challenges they are facing. Qualitative research tools, including in-depth interviews, documentary analysis, focus group interviews, and observation, are used to examine the content, programming, and management structure of three community radio stations currently in operation within the district. Research findings indicate that the management and operational approaches adopted by the two non-profit stations included in the study, Salaya Pattana and Voice of Dhamma, are more structured and effective than those of the for-profit Tune Radio. Salaya Pattana, backed by the Faculty of Engineering, Mahidol University, and the charity-funded Voice of Dhamma are comparatively free from political and commercial influence, and are able to provide more relevant and consistent community-oriented content that meets the real demand of the audience. Tune Radio, on the other hand, has to rely solely on financial support from political factions and business groups, which heavily influence its content.

Keywords: radio broadcasting, programming, management, community radio, Thailand

Procedia PDF Downloads 343
29006 Geographic Information System for District Level Energy Performance Simulations

Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck

Abstract:

The utilization of semantic, cadastral, and topological data from geographic information systems (GIS) has increased exponentially for building- and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analyses, algorithms, and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigation, CityGML data models are considered for the simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (EnergyADE) is given, along with its significance. Finally, addressing specific simulation input data, a workflow using Modelica is presented, underlining the usage of GIS information and quantifying its significance for the annual heating energy demand.
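
Since CityGML models are plain XML, the semantic attributes that feed an energy simulation can be extracted with standard tooling. The fragment below is a minimal sketch using the real CityGML 2.0 building namespace; an actual city model carries far more structure (geometry, levels of detail, appearance):

```python
import xml.etree.ElementTree as ET

# CityGML 2.0 Building module namespace
BLDG = "{http://www.opengis.net/citygml/building/2.0}"

SAMPLE = """<core:CityModel xmlns:core="http://www.opengis.net/citygml/2.0"
    xmlns:bldg="http://www.opengis.net/citygml/building/2.0">
  <core:cityObjectMember>
    <bldg:Building>
      <bldg:measuredHeight uom="m">12.5</bldg:measuredHeight>
    </bldg:Building>
  </core:cityObjectMember>
  <core:cityObjectMember>
    <bldg:Building>
      <bldg:measuredHeight uom="m">9.0</bldg:measuredHeight>
    </bldg:Building>
  </core:cityObjectMember>
</core:CityModel>"""

def building_heights(xml_text):
    """Collect bldg:measuredHeight values (in metres) in document order."""
    root = ET.fromstring(xml_text)
    return [float(e.text) for e in root.iter(BLDG + "measuredHeight")]
```

Attributes pulled out this way (heights, storey counts, usage types, and the EnergyADE's thermal properties) become the inputs of the district-level simulation workflow described above.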

Keywords: CityGML, EnergyADE, energy performance simulation, GIS

Procedia PDF Downloads 169
29005 The Development of a Conceptual Framework for Assessing Neighborhood Sustainability in South Africa

Authors: Benedict Okundaye, Patricia Tzortzopoulos, Yun Gao

Abstract:

Scholars and international organisations have contended that developing nations lack the technical expertise, infrastructure, and capacity to cope with or prepare for sustainable development at the neighbourhood level, as reflected in the unimpressive progress toward the Sustainable Development Goals, particularly Goal 11. Both wealthy and impoverished communities are facing increasing issues due to rapid urbanisation and pandemics, particularly in Africa. Neighbourhood challenges around the world, and especially in developing countries such as South Africa, include pollution, poverty, energy poverty, digital poverty, environmental degradation, social exclusion, and socioeconomic inequalities. While international sustainability assessment tools remain problematic, few researchers have produced frameworks that engage local contexts, and improvements are still required. This research develops a people-centred, flexible, and adaptable neighbourhood sustainability assessment framework as a tool to assess the characteristics of neighbourhood sustainability in South Africa. The conceptual framework employs a variety of approaches, including broader dimensional factors, a closed-ended questionnaire, and statistical analysis, to improve on and complement other existing frameworks.

Keywords: participation, development, inclusion, urbanism, cities, resilience

Procedia PDF Downloads 91
29004 Mobile Platform’s Attitude Determination Based on Smoothed GPS Code Data and Carrier-Phase Measurements

Authors: Mohamed Ramdani, Hassen Abdellaoui, Abdenour Boudrassen

Abstract:

Mobile platforms’ attitude estimation approaches are mainly based on combined positioning techniques and developed algorithms that aim to reach a fast and accurate solution. In this work, we describe the design and implementation of an attitude determination (AD) process using only measurements from GPS sensors. The approach is based on GPS code data smoothed with a Hatch filter and raw carrier-phase measurements, integrated into an attitude algorithm based on vector measurements using the least-squares (LSQ) estimation method. A GPS dataset from a static experiment is used to investigate the effectiveness of the presented approach and, consequently, to check the accuracy of the attitude estimation algorithm. Attitude results from a GPS multi-antenna setup over short baselines are introduced and analyzed. The 3D accuracy of the attitude parameters estimated using smoothed measurements is on the order of 0.27°.
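
The Hatch filter mentioned here has a compact recursive form: the smoothed pseudorange blends the current code measurement with the previous smoothed value propagated by the carrier-phase delta. A sketch with synthetic numbers (the window length and noise levels are assumptions for illustration):

```python
def hatch_filter(code, phase, window=100):
    """Carrier-smoothed pseudoranges via the classic Hatch filter.
    code:  raw code pseudoranges (metres), noisy but unambiguous.
    phase: carrier-phase ranges (metres), precise but carrying an
           unknown constant ambiguity; only the epoch-to-epoch delta
           is used, so the ambiguity cancels."""
    smoothed = [code[0]]
    for k in range(1, len(code)):
        n = min(k + 1, window)              # effective averaging length
        predicted = smoothed[-1] + (phase[k] - phase[k - 1])
        smoothed.append(code[k] / n + predicted * (n - 1) / n)
    return smoothed
```

On a static dataset like the one in the abstract, the smoothed series converges toward the true range while the raw code noise is averaged down, which is why the smoothing step precedes the least-squares attitude estimation.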

Keywords: attitude determination, GPS code data smoothing, Hatch filter, carrier-phase measurements, least-squares attitude estimation

Procedia PDF Downloads 155
29003 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency, and business logic time to market. On the other hand, these architectures also introduce challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos, or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, which would affect running applications, specific expertise is required for ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers.
Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set in other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open-source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer entails more important disadvantages. Resource allocation is greatly improved by using linuxkit, which has a very small footprint (around 35 MB). The system is also more secure, since linuxkit installs only the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
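
The service-discovery wiring described above (load-balancer endpoints injected into dependent microservices as environment variables) can be modelled conceptually. The dictionary schema and endpoint naming below are hypothetical illustrations, not i2kit's actual input format:

```python
def wire_services(services):
    """Resolve deployment order and environment wiring for a set of
    microservices. Each service may list `depends_on` names; a dependency
    must be deployed (and have a load-balancer endpoint) before its
    dependents, and that endpoint is injected as an environment variable."""
    endpoints = {}           # service name -> load-balancer endpoint
    deployed = []            # (name, injected environment) in deploy order
    pending = dict(services)
    while pending:
        ready = [name for name, svc in pending.items()
                 if all(d in endpoints for d in svc.get("depends_on", []))]
        if not ready:
            raise ValueError("circular dependency between services")
        for name in ready:
            svc = pending.pop(name)
            env = {f"{d.upper()}_ENDPOINT": endpoints[d]
                   for d in svc.get("depends_on", [])}
            # placeholder endpoint; a real deployment would return the
            # cloud vendor load balancer's DNS name
            endpoints[name] = f"http://{name}.elb.example.com"
            deployed.append((name, env))
    return deployed
```

The point of the sketch is the design choice: discovery is resolved once, at deployment time, through stable load-balancer endpoints, instead of through a runtime control layer.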

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 179
29002 Development and Validation of Work Movement Task Analysis: Part 1

Authors: Mohd Zubairy Bin Shamsudin

Abstract:

Work-related musculoskeletal disorders (WMSDs) are among the occupational health problems encountered by workers all over the world. In Malaysia, the trend has been increasing over the years, particularly in the manufacturing sectors. Current methods to observe workplace WMSDs are self-report questionnaires, observation, and direct measurement. Observational methods are most frequently used by researchers and practitioners because they are simple, quick, and versatile when applied at the worksite. However, some limitations have been identified, e.g., some approaches do not cover a wide spectrum of biomechanical activity and are not sufficiently sensitive to assess the actual risks. This paper elucidates the development of the Work Movement Task Analysis (WMTA), an observational tool for industrial practitioners, especially untrained personnel, to assess WMSD risk factors and provide a basis for suitable interventions. The first stage of the development protocol involved literature reviews, a practitioner survey, tool validation, and reliability testing. A total of six themes/comments were received in the face validity stage. The new revision of the WMTA consists of four sections covering posture (neck, back, shoulder, arms, and legs) and associated risk factors: movement, load, coupling, and basic environmental factors (lighting, noise, odor, heat, and slippery floors). The inter-rater reliability study shows substantial agreement among raters, with K = 0.70. Meanwhile, WMTA validation shows a significant association between WMTA scores and self-reported pain or discomfort for the back, shoulders & arms, and knees & legs, with p < 0.05. This tool is expected to provide a new workplace ergonomic observational tool for assessing WMSDs in the next stage of the case study.
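
The reported K = 0.70 is an inter-rater agreement coefficient, conventionally Cohen's kappa, which corrects raw agreement for the agreement expected by chance. It can be computed directly from two raters' labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).
    rater_a, rater_b: equal-length sequences of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed proportion of agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal label frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)
```

Values around 0.61 to 0.80 are conventionally read as "substantial agreement" (Landis and Koch), which is how the abstract interprets K = 0.70.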

Keywords: assessment, biomechanics, musculoskeletal disorders, observational tools

Procedia PDF Downloads 469
29001 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices

Authors: Fatemeh Abbasi, Sahand Daneshvar

Abstract:

Productivity is one of the essential goals of companies seeking to improve performance; as a strategy-oriented measure, it forms the basis of a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers defined productivity as the relationship between output and the factors used in production. Productivity as the optimal use of available resources, i.e., "more output using less input", can increase companies' capacity for economic growth and prosperity. A quality of life based on economic progress also depends on productivity growth in society; productivity is therefore a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods do not require a function and are based on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. The method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, and airports. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically based on DEA models and reviews their strengths and weaknesses.
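
For the special case of one input and one output under constant returns to scale, the distance functions behind the Malmquist index reduce to simple productivity ratios against the best observed ratio, so the index can be illustrated without the linear-programming solver that general multi-input DEA requires:

```python
def crs_efficiency(dmu, frontier):
    """Output efficiency under constant returns to scale with a single
    input and a single output: the DMU's productivity y/x relative to
    the best ratio among the frontier observations (x, y)."""
    best = max(y / x for x, y in frontier)
    x, y = dmu
    return (y / x) / best

def malmquist(dmu_t, dmu_t1, frontier_t, frontier_t1):
    """Malmquist productivity index in its geometric-mean form
    (Caves, Christensen, and Diewert): the change in efficiency of one
    DMU between periods t and t+1, measured against each period's
    frontier in turn."""
    a = crs_efficiency(dmu_t1, frontier_t) / crs_efficiency(dmu_t, frontier_t)
    b = crs_efficiency(dmu_t1, frontier_t1) / crs_efficiency(dmu_t, frontier_t1)
    return (a * b) ** 0.5
```

An index above 1 indicates productivity growth; when the frontier itself is unchanged between periods, the index is exactly the DMU's own productivity ratio between the two periods.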

Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index

Procedia PDF Downloads 194
29000 Designing Urban Spaces Differently: A Case Study of the HerCity HerStreets Public Space Improvement Initiative in Nairobi, Kenya

Authors: Rehema Kabare

Abstract:

As urban development initiatives continue to emerge and are implemented amid rapid urbanization and climate change effects in the global south, the plight of women is only now being noticed. The pandemic exposed the atrocities, violence and lack of safety women and girls face daily, both in their homes and in public urban spaces. This is a result of poorly implemented and managed urban structures, from whose design and implementation women have been left out for centuries. The UN-Habitat HerCity toolkit provides a unique opportunity for both governments and civil society actors to change course by onboarding women and girls onto urban development initiatives, with their designs and ideas as the focal point. This toolkit proves that when women and girls design, they design for everyone. The HerCity HerStreets Public Space Improvement Initiative resulted in a design that focused on two aspects: streets are a shared resource, and streets are public spaces. These two concepts illustrate that for streets to be experienced effectively as cultural spaces, they need to be user-friendly, safe and inclusive. This report demonstrates how HerCity HerStreets, as a pilot project, can be a benchmark for designing urban spaces in African cities. The project focused on five dimensions to improve the air quality of the space, the allocation of space to street vending and bodaboda (passenger motorcycle) parking stops, and the green coverage. The process displays how digital tools such as Minecraft and KoboToolbox can be utilized to improve citizens' participation in the development of public spaces, with a special focus on including vulnerable groups such as women, girls and youth.

Keywords: urban space, sustainable development, gender and the city, digital tools and urban development

Procedia PDF Downloads 82
28999 Microeconomic Consequences of the Housing Market Deformation in the Selected Region of the Czech Republic

Authors: Hana Janáčková

Abstract:

Housing is a basic need of households, and the purchase of owner-occupied housing is an important investment for most of them. For rental housing, households must consider the share of rent expenditure in total household income. For this reason, households' financial considerations in this area depend on government intervention (public administration) in housing, that is, on housing policy. The market system of housing allocation, whether ownership or tenancy, is based on the fact that housing is a scarce good, and its allocation is based on demand and supply. The market system of housing can sometimes have a negative impact on some households: the market is unable to satisfy certain groups of the population that are not able or willing to accept the market price. For these reasons, there is more or less regulation of the market. Regulation occurs on both the demand and supply sides, and the state determines the rules of behaviour for all economic entities in the housing market. This article presents the results of an analysis of selected regulatory interventions of the state in the housing market and assesses their implications in deforming the market in the selected region of the Czech Republic. The first part describes the tools of support, and the second part discusses the deformations and analyses their consequences on the demand and supply sides of the housing market.

Keywords: housing, housing market, microeconomic consequences, deformation

Procedia PDF Downloads 399
28998 A Framework for Teaching Distributed Requirements Engineering in Latin American Universities

Authors: G. Sevilla, S. Zapata, F. Giraldo, E. Torres, C. Collazos

Abstract:

This work describes a framework for teaching global software engineering (GSE) in university undergraduate programs. The framework proposes a method of teaching that incorporates adequate techniques for software requirements elicitation and validated communication tools, critical aspects of global software development scenarios. The use of the proposed framework allows teachers to simulate small software development companies, formed by Latin American students, which build information systems. Students from three Latin American universities played the roles of engineers by applying iterative development to a requirements specification in a global software project. The proposed framework involves the use of a special-purpose wiki for asynchronous communication between the participants in the process. It also includes practices to improve the quality of the software requirements formulated by the students. The additional motivation of students to participate in these practices, in conjunction with peers from other countries, is a significant factor that contributes positively to the learning process. The framework promotes skills in communication, negotiation, and other complementary competencies that are useful for working in GSE scenarios.

Keywords: requirements analysis, distributed requirements engineering, practical experiences, collaborative support

Procedia PDF Downloads 204
28997 Analysis of Buddhist Rock Carvings in Diamer Basha Dam Reservoir Area, Gilgit-Baltistan, Pakistan

Authors: Abdul Ghani Khan

Abstract:

This paper focuses on the Buddhist rock carvings in the Diamer-Basha reservoir area, Gilgit-Baltistan, which is perhaps the largest rock art province in the world. The study region has thousands of rock carvings, particularly stupa carvings, engraved by artists, devotees, pilgrims, or merchants who left their marks on the landscape or worked for the propagation of Buddhism. The Pak-German Archaeological Mission prepared, documented, and published extensive catalogues of these carvings. To date, however, very little systematic or statistically driven analysis has been undertaken for an in-depth understanding of the Buddhist rock carving tradition of the study region. This paper attempts to examine stupa carvings and their constituent parts from five selected sites, namely Oshibat, Shing Nala, Gichi Nala, Dadam Das, and Chilas Bridge. The statistical analyses and classification of the stupa carvings and their chronological contexts were carried out with the help of modern scientific tools such as STATA, FileMaker Pro, and MapSource software. The study found that the tradition of stupa carving on rock surfaces at the five selected sites continued for around 900 years, from the 1st century BCE to the 8th century CE. There is variation within the chronological settings of each of the selected sites, possibly shaped by their use within particular landscapes, such as political landscapes (for example, changes in political administration or warfare) and geographical ones (for example, the shifting of routes). The long existence of the stupa carving tradition at these specific locations also indicates their central position on trade and communication routes, and they were possibly also linked with the religious ideologies of their particular times. The analyses of the different architectural elements of the stupa carvings in the study area show that this tradition had structural similarities and differences in temporal and spatial contexts.

Keywords: rock carvings, stupa, stupa carvings, Buddhism, Pak-German archaeological mission

Procedia PDF Downloads 224
28996 Dynamical Models for Environmental Effect Depuration for Structural Health Monitoring of Bridges

Authors: Francesco Morgan Bono, Simone Cinquemani

Abstract:

This research aims to enhance bridge monitoring by employing innovative techniques that incorporate exogenous factors into the modeling of sensor signals, thereby improving long-term predictability beyond traditional static methods. Using real datasets from two different bridges equipped with Linear Variable Displacement Transducer (LVDT) sensors, the study investigates the fundamental principles governing sensor behavior for more precise long-term forecasts. Additionally, the research evaluates performance on noisy and synthetically damaged data, proposing a residual-based alarm system to detect anomalies in the bridge. In summary, this novel approach combines advanced modeling, exogenous factors, and anomaly detection to extend prediction horizons and improve preemptive damage recognition, significantly advancing structural health monitoring practices.

Keywords: structural health monitoring, dynamic models, SINDy, railway bridges

Procedia PDF Downloads 38
28995 Empowering Learners: From Augmented Reality to Shared Leadership

Authors: Vilma Zydziunaite, Monika Kelpsiene

Abstract:

In early childhood and preschool education, play has an important role in learning and cognitive processes. In the context of a changing world, personal autonomy and the use of technology are becoming increasingly important for the development of a wide range of learner competencies. By integrating technology into learning environments, the educational reality is changed, promoting novel learning experiences for children through play-based activities. Alongside this, teachers are challenged to develop encouragement and motivation strategies that empower children to act independently. The aim of the study was to reveal the changes in the roles and experiences of teachers in the application of AR technology for the enrichment of the learning process. A quantitative research approach was used to conduct the study, and the data were collected through an electronic questionnaire. Participants: 319 teachers of 5-6-year-old children using AR technology tools in their educational process. Methods of data analysis: Cronbach's alpha, descriptive statistical analysis, normal distribution analysis, correlation analysis, and regression analysis (SPSS software). Results. The results of the study show a significant relationship between children's learning and the educational process modeled by the teacher. The strongest predictor of child learning was found to be the role of the educator. Other predictors, such as pedagogical strategies, the concept of AR technology, and areas of children's education, had no significant relationship with child learning. The role of the educator was found to be a strong determinant of the child's learning process. Conclusions. The greatest potential for integrating AR technology into the teaching-learning process is revealed in collaborative learning. Teachers identified that when integrating AR technology into the educational process, they encourage children to learn from each other, develop problem-solving skills, and create inclusive learning contexts. A significant relationship also emerged between the changing role of the teacher, the child's learning style, and the child's aspiration to personal leadership and responsibility for their own learning. Teachers identified the following key roles: observer of the learning process, proactive moderator, and creator of the educational context. All these roles enable the learner to become an autonomous and active participant in the learning process. This provides a better understanding and explanation of why it is crucial to empower the learner to experiment, explore, discover, actively create, and foster collaborative learning in the design and implementation of educational content, and for teachers to integrate AR technologies and apply the principles of shared leadership. No statistically significant relationship was found between the understanding of the definition of AR technology and the teacher's choice of role in the learning process. However, teachers reported that their understanding of the definition of AR technology influences their choice of role, which has an impact on children's learning.

Keywords: teacher, learner, augmented reality, collaboration, shared leadership, preschool education

Procedia PDF Downloads 41
28994 Molecular Modeling and Prediction of the Physicochemical Properties of Polyols in Aqueous Solution

Authors: Maria Fontenele, Claude-Gilles Dussap, Vincent Dumouilla, Baptiste Boit

Abstract:

Roquette Frères is a producer of plant-based ingredients that employs many processes to extract relevant molecules and often transforms them through chemical and physical processes to create desired ingredients with specific functionalities. In this context, Roquette encounters numerous multi-component complex systems in its processes, including fibers, proteins, and carbohydrates, in an aqueous environment. To develop, control, and optimize both new and old processes, Roquette aims to develop new in silico tools. Currently, Roquette uses process modelling tools, which include specific thermodynamic models, and intends to develop computational methodologies such as molecular dynamics simulations to gain insight into the interactions in such complex media, especially hydrogen bonding interactions. The issue at hand concerns aqueous mixtures of polyols with high dry matter content. The polyols mannitol and sorbitol are diastereoisomers that have nearly identical chemical structures but very different physicochemical properties: for example, the solubility of sorbitol in water is 2.5 kg/kg of water, while mannitol has a solubility of 0.25 kg/kg of water at 25°C. Therefore, predicting liquid-solid equilibrium properties in this case requires sophisticated solution models that cannot be based solely on chemical group contributions, given that mannitol and sorbitol have the same constituent chemical groups. Recognizing the significance of solvation phenomena in polyols, the GePEB (Chemical Engineering, Applied Thermodynamics, and Biosystems) team at Institut Pascal has developed the COSMO-UCA model, which has the structural advantage of using quantum mechanics tools to predict formation and phase equilibrium properties. In this work, we use molecular dynamics simulations to elucidate the behavior of polyols in aqueous solution.
Specifically, we employ simulations to compute essential metrics such as radial distribution functions and hydrogen bond autocorrelation functions. Our findings illuminate a fundamental contrast: sorbitol and mannitol exhibit disparate hydrogen bond lifetimes within aqueous environments. This observation serves as a cornerstone in elucidating the divergent physicochemical properties inherent to each compound, shedding light on the nuanced interplay between their molecular structures and water interactions. We also present a methodology to predict the physicochemical properties of complex solutions, taking as sole input the three-dimensional structure of the molecules in the medium. Finally, by developing knowledge models, we represent some physicochemical properties of aqueous solutions of sorbitol and mannitol.
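As an editorial aside, the hydrogen bond autocorrelation function mentioned above can be sketched from a boolean bond-existence trajectory. The sketch below uses the common intermittent convention C(tau) = <h(t)h(t+tau)>/<h>; the authors' exact definition and trajectory format are not given in the abstract, so the data here are synthetic placeholders.

```python
import numpy as np

def hbond_autocorrelation(h):
    """Intermittent hydrogen-bond autocorrelation C(tau) = <h(t) h(t+tau)> / <h>,
    where h[t, b] is 1 if bond b exists in frame t and 0 otherwise.
    C(0) = 1 by construction; slower decay means longer-lived bonds."""
    h = np.asarray(h, dtype=float)
    n_frames = h.shape[0]
    mean_h = h.mean()
    c = np.empty(n_frames)
    for tau in range(n_frames):
        # average the product over all valid time origins and all bonds
        c[tau] = (h[: n_frames - tau] * h[tau:]).mean() / mean_h
    return c
```

In practice h would be built from geometric donor-acceptor criteria applied to each MD frame; comparing the decay of C(tau) for sorbitol and mannitol solutions is one way to quantify the different hydrogen bond lifetimes reported above.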

Keywords: COSMO models, hydrogen bond, molecular dynamics, thermodynamics

Procedia PDF Downloads 43
28993 Prediction of Alzheimer's Disease Based on Blood Biomarkers and Machine Learning Algorithms

Authors: Man-Yun Liu, Emily Chia-Yu Su

Abstract:

Alzheimer's disease (AD) is the public health crisis of the 21st century. AD is a degenerative brain disease and the most common cause of dementia, a costly disease for the healthcare system. Unfortunately, the cause of AD is poorly understood; furthermore, the treatments for AD so far can only alleviate symptoms rather than cure the disease or stop its progress. Currently, there are several ways to diagnose AD: medical imaging can be used to distinguish between AD, other dementias, and early-onset AD, and cerebrospinal fluid (CSF) can be analyzed. Compared with other diagnostic tools, the blood (plasma) test has advantages as an approach to population-based disease screening because it is simpler, less invasive, and more cost-effective. In our study, we used the blood biomarkers dataset of the Alzheimer's Disease Neuroimaging Initiative (ADNI), which was funded by the National Institutes of Health (NIH), for data analysis and to develop a prediction model. We used independent analysis of datasets to identify plasma protein biomarkers predicting early-onset AD. First, to compare basic demographic statistics between the cohorts, we used SAS Enterprise Guide for data preprocessing and statistical analysis. Second, we used logistic regression, neural networks, and decision trees to validate biomarkers in SAS Enterprise Miner. This study used data from ADNI containing 146 blood biomarkers from 566 participants. Participants included cognitively normal (healthy) subjects, subjects with mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD). Participants' samples were separated into two groups, healthy versus MCI and healthy versus AD, respectively, and we used the two groups to compare important biomarkers of AD and MCI. In preprocessing, we used a t-test to filter 41/47 features between the two groups (healthy and AD, healthy and MCI) before applying machine learning algorithms. We then built models with four machine learning methods; the best AUC values for the two groups are 0.991 and 0.709, respectively.
We want to stress that the simple, less invasive, common blood (plasma) test may also enable early diagnosis of AD. In our opinion, the results provide evidence that blood-based biomarkers might be an alternative diagnostic tool before further examination with CSF and medical imaging. A comprehensive study on the differences in blood-based biomarkers between AD patients and healthy subjects is warranted. Early detection of AD progression will allow physicians the opportunity for early intervention and treatment.
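The authors' pipeline used SAS; as an editorial illustration only, the Python sketch below reproduces the shape of the preprocessing step (a per-biomarker t-test filter) and scores one retained marker by its AUC via the Mann-Whitney U statistic. All data here are synthetic, and the group sizes, marker count, and effect size are invented, not ADNI values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic stand-in for a plasma-biomarker matrix: 60 healthy vs 60 AD
# subjects, 20 biomarkers, of which only the first 3 actually differ.
n_per_group, n_markers = 60, 20
healthy = rng.normal(0.0, 1.0, size=(n_per_group, n_markers))
ad = rng.normal(0.0, 1.0, size=(n_per_group, n_markers))
ad[:, :3] += 1.2                      # shift the informative markers

# Step 1: t-test filter, keeping markers with p < 0.05 before modelling.
pvals = stats.ttest_ind(healthy, ad, axis=0).pvalue
selected = np.where(pvals < 0.05)[0]

# Step 2: score one selected marker by its AUC, obtained from the
# Mann-Whitney U statistic via AUC = U / (n1 * n2).
def single_marker_auc(pos, neg):
    u = stats.mannwhitneyu(pos, neg, alternative="two-sided").statistic
    return max(u, len(pos) * len(neg) - u) / (len(pos) * len(neg))

auc = single_marker_auc(ad[:, selected[0]], healthy[:, selected[0]])
```

A full analogue of the study would then feed the filtered matrix to logistic regression, a neural network, and a decision tree and compare AUCs on held-out data.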

Keywords: Alzheimer's disease, blood-based biomarkers, diagnostics, early detection, machine learning

Procedia PDF Downloads 322
28992 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets

Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor Sookia

Abstract:

In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six stock exchanges of frontier markets is calculated using the Peaks-over-Threshold method, and the performance of the model is evaluated index-wise using coverage tests and loss functions. Our results show that 'fat-tailedness' of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works well in markets which have experienced extremes in the past, making the model capable of coping with extremes to come (the Colombo, Tunisia and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to deal adequately with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces quite static VaR figures that do not reflect the actual dynamics of the data.
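For readers unfamiliar with the Peaks-over-Threshold method, the sketch below shows the standard estimator on synthetic heavy-tailed losses: fit a Generalized Pareto Distribution to exceedances over a high threshold u and plug the parameters into the closed-form tail quantile. The data, threshold level, and confidence level are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)

# Synthetic heavy-tailed daily losses standing in for frontier-market returns
losses = rng.standard_t(df=4, size=10_000)

# Peaks-over-Threshold: keep exceedances above a high threshold u
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
n, n_u = len(losses), len(exceedances)

# Fit the Generalized Pareto Distribution to the exceedances (loc fixed at 0)
xi, _, sigma = genpareto.fit(exceedances, floc=0)

# POT Value-at-Risk at level q:
#   VaR_q = u + (sigma/xi) * (((n/n_u) * (1 - q)) ** (-xi) - 1)
q = 0.99
var_q = u + (sigma / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)
```

The threshold choice (here the 95th percentile) is the usual bias-variance trade-off in POT modelling and is typically checked with mean-excess plots, which is consistent with the paper's point that fat tails alone do not guarantee a good VaR model.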

Keywords: extreme value theory, financial crisis 2008, value at risk, frontier markets

Procedia PDF Downloads 276
28991 Effect of 12 Weeks Pedometer-Based Workplace Program on Inflammation and Arterial Stiffness in Young Men with Cardiovascular Risks

Authors: Norsuhana Omar, Amilia Aminuddina Zaiton Zakaria, Raifana Rosa Mohamad Sattar, Kalaivani Chellappan, Mohd Alauddin Mohd Ali, Norizam Salamt, Zanariyah Asmawi, Norliza Saari, Aini Farzana Zulkefli, Nor Anita Megat Mohd. Nordin

Abstract:

Inflammation plays an important role in the pathogenesis of vascular dysfunction leading to arterial stiffness. Pulse wave velocity (PWV) and augmentation index (AI), as tools for the assessment of vascular damage, are widely used and have been shown to predict cardiovascular disease (CVD). C-reactive protein (CRP) is a marker of inflammation. Several studies have noted that regular exercise is associated with reduced arterial stiffness. The lack of exercise among Malaysians and the increasing CVD morbidity and mortality among young men are of concern, and in Malaysia data on workplace exercise interventions are scarce. A programme was designed to enable subjects to increase their level of walking as part of their daily work routine, self-monitored using pedometers. The aim of this study was to evaluate the reduction of inflammation, measured by CRP, and the improvement of arterial stiffness, measured by carotid-femoral PWV (PWVCF) and AI. A total of 70 young men (20-40 years) who were sedentary, achieving less than 5,000 steps/day in casual walking, and who had 2 or more cardiovascular risk factors were recruited at the Institute of Vocational Skills for Youth (IKBN Hulu Langat). Subjects were randomly assigned to a control group (CG) (n=34; no change in walking) and a pedometer group (PG) (n=36; minimum target: 8,000 steps/day). CRP was measured using an immunological method, while PWVCF and AI were measured using a Vicorder. All parameters were measured at baseline and after 12 weeks. Data analysis was conducted using the Statistical Package for the Social Sciences Version 22 (SPSS Inc., Chicago, IL, USA). At post-intervention, the CG step counts were similar (4983 ± 366 vs 5697 ± 407 steps/day). The PG increased their step count from 4996 ± 805 to 10,128 ± 511 steps/day (P<0.001). The PG showed significant improvement in anthropometric variables and lipids (time and group effect, p<0.001).
For the vascular assessment, the PG showed significant decreases (time and group effect, p<0.001) in PWV (7.21 ± 0.83 to 6.42 ± 0.89 m/s), AI (11.88 ± 6.25 to 8.83 ± 3.7%) and CRP (pre = 2.28 ± 3.09, post = 1.08 ± 1.37 mg/L). However, no changes were seen in the CG. In conclusion, a pedometer-based walking programme may be an effective strategy for promoting increased daily physical activity, which reduces cardiovascular risk markers and thus improves cardiovascular health in terms of inflammation and arterial stiffness. Community interventions for health maintenance have the potential to adopt walking as an exercise and vascular fitness indices as performance measuring tools.

Keywords: arterial stiffness, exercise, inflammation, pedometer

Procedia PDF Downloads 354
28990 Organizational Culture and Its Internalization of Change in the Manufacturing and Service Sector Industries in India

Authors: Rashmi Uchil, A. H. Sequeira

Abstract:

The post-liberalization era in India has seen an unprecedented growth of mergers, both domestic and cross-border deals. Indian organizations have slowly begun appreciating this inorganic method of growth. However, all is not well, as is evidenced by the lower value creation of organizations after mergers. Several studies have identified organizational culture as one of the key factors that affect the success of mergers, but very few studies have been attempted in this realm in India. The current study attempts to identify the factors in the organizational culture variable that may be unique to India. It also focuses on the difference in the impact of organizational culture on mergers of organizations in the manufacturing and service sectors in India. The study uses a mixed research approach. An exploratory research approach is adopted to identify the variables that constitute organizational culture specifically in the Indian scenario. A few hypotheses were developed from the identified variables and tested to arrive at the Grounded Theory. The Grounded Theory approach used in the study attempts to integrate the variables related to organizational culture. A descriptive approach is used to validate the developed grounded theory with a new empirical data set and thus test the relationship between the organizational culture variables and the success of mergers. Empirical data were captured from merged organizations situated in major cities of India. These organizations represent a significant proportion of the total number of organizations which have adopted mergers. The mix of industries included software, banking, manufacturing, pharmaceuticals and financial services. A mixed sampling approach was adopted for this study: the first phase of sampling was conducted using the probability method of stratified random sampling, and the study further used the non-probability method of judgmental sampling.
An adequate sample size was identified for the study, representing the top, middle and junior management levels of the organizations that had adopted mergers. The validity and reliability of the research instrument were ensured with appropriate tests. Statistical tools like regression analysis, correlation analysis and factor analysis were used for data analysis. The results of the study revealed a strong relationship between organizational culture and the success of mergers. The results were also unique in that they highlighted a marked difference in the manner in which organizations in the manufacturing sector internalized changes of organizational culture after a merger; organizations in the service sector internalized the changes at a slower rate. The study also portrays the manufacturing sector as more proactive, which can contribute to a change in the perception of the said organizations.

Keywords: manufacturing industries, mergers, organizational culture, service industries

Procedia PDF Downloads 297
28989 The Significance of Urban Space in Death Trilogy of Alejandro González Iñárritu

Authors: Marta Kaprzyk

Abstract:

The cinema of Alejandro González Iñárritu has not yet been subjected to much detailed analysis, which makes it exceptionally interesting research material. The purpose of this presentation is to discuss the significance of urban space in the three films of this Mexican director that form the Death Trilogy: 'Amores Perros' (2000), '21 Grams' (2003) and 'Babel' (2006). The fact that in the aforementioned movies the urban space itself becomes an additional protagonist, with its own identity, psychology and the ability to transform and affect other characters, in itself warrants independent research and analysis. This mode of presenting urban space also has another function: it enables the director to complement the rest of the characters. The basis for the methodology of this description of cinematographic space is to treat its visual layer as a point of departure for detailed analysis. At the same time, the analysis itself is supported by recognised academic theories concerning spatial issues, which are transformed here into essential tools for describing the world (mise-en-scène) created by González Iñárritu. In 'Amores Perros', Mexico City serves as the scenery, a place full of contradictions, depicted as a modern conglomerate and an urban jungle as well as a labyrinth of poverty and violence. In this work, stylistic tropes can be found in an intertextual dialogue of the director with the photographs of Nan Goldin and Mary Ellen Mark. The story recounted in '21 Grams', the most tragic piece of the trilogy, is characterised by almost hyperrealistic sadism. It takes place in Memphis, which on screen turns into an impersonal formation full of the heterotopias described by Michel Foucault and the non-places defined by Marc Augé in his essay.
By contrast, the main urban space in 'Babel' is Tokyo, which seems to correspond perfectly with the image of places discussed by Juhani Pallasmaa in his works concerning the reception of architecture by 'pathological senses' in the modern (or, more adequately, postmodern) world. It is portrayed as a city full of buildings that look so surreal that they seem completely unsuitable for humans to move between them. Ultimately, the aim of this paper is to demonstrate the coherence of the manner in which González Iñárritu designs urban spaces in his Death Trilogy. In particular, the author attempts to examine the imperative role of the cities that form the three specific microcosms in which the protagonists of the Mexican director live out their overwhelming tragedies.

Keywords: cinematographic space, Death Trilogy, film studies, González Iñárritu Alejandro, urban space

Procedia PDF Downloads 333
28988 Guidelines for Enhancing the Learning Environment by the Integration of Design Flexibility and Immersive Technology: The Case of the British University in Egypt’s Classrooms

Authors: Eman Ayman, Gehan Nagy

Abstract:

The learning environment has four main parameters that affect its efficiency: pedagogy, user, technology, and space. According to Morrone, enhancing these parameters so that they remain adaptable to future developments is essential, and educational organizations will need to develop their learning spaces. Design flexibility and immersive technology can be used as tools for this development. When flexible design concepts are used, learning spaces are created that can accommodate a variety of teaching and learning activities. To accommodate the various needs and interests of students, these learning spaces are easily reconfigurable and customizable. The immersive learning opportunities offered by technologies like virtual reality, augmented reality, and interactive displays, on the other hand, transcend the confines of the traditional classroom, and these technological advancements can improve learning. This thesis highlights the problem of the lack of innovative, flexible learning spaces in educational institutions. It aims to develop guidelines for enhancing the learning environment through the integration of flexible design and immersive technology. The research uses a mixed-method approach, both qualitative and quantitative: the qualitative section draws on literature review theories and case study analysis, while the quantitative section is based on the results of applied studies of the effectiveness of redesigning a learning space from its traditional current state into a flexible, technological, contemporary space that is adaptable to many changes and educational needs. The research findings establish the importance of flexibility in the interior design of learning spaces, as it enhances space optimization and the capability to accommodate change, and they record the significant contribution of immersive technology in assisting the design process.
The findings are summarized by the questionnaire results and a comparative analysis, which constitute the last step in finalizing the guidelines.

Keywords: flexibility, learning space, immersive technology, learning environment, interior design

Procedia PDF Downloads 94
28987 Bioecological Assessment of Cage Farming on the Soft Bottom Benthic Communities of the Vlora Gulf (Albania)

Authors: Ina Nasto, Denada Sota, Pudrila Haskoçelaj, Mariola Ismailaj, Hajdar Kicaj

Abstract:

Most of the fishing areas of the Mediterranean Sea are considered to be overfished; consequently, catches have decreased or are static. Considering the continuous increase in demand for fish, aquaculture production has seen growing development in recent decades. The environmental impact of aquaculture on the marine ecosystem has been a subject of study in the Mediterranean for several years. Albanian waters, and in particular the Gulf of Vlora, have seen progressively growing aquaculture activity in the last twenty years. Given its convenient location, secluded from tourist activities, the Bay of Ragusa was considered the most suitable area to install an aquaculture cage system for the breeding of sea bass and sea bream. The impact of aquaculture on the soft-bottom benthic communities was assessed at the biggest commercial fish farm (Alb-Adriatico Sh.P.K), established in the coastal waters of the Bay of Ragusa, 30-50 m deep, in the southern part of the Gulf of Vlora. In order to determine whether there is an impact of the aquaculture cages on benthic communities, a comparative analysis was undertaken between transects and samples at different distances from each other and along a gradient of distance from the fish cages. A total of 275 taxa were identified (1 Foraminifera, 1 Porifera, 3 Cnidaria, 2 Platyhelminthes, 2 Nemertea, 1 Bryozoa, 171 Mollusca, 39 Annelida, 35 Crustacea, 14 Echinodermata, 1 Hemichordata, and 5 Tunicata). The analysis showed three main habitats in the area: the biocoenosis of terrigenous mud, residual areas with Posidonia oceanica, and residual assemblages of coralligenous algae. Four benthic biotic indices were calculated (Shannon H', BENTIX, Simpson's diversity and Pielou's J'), along with benthic indicators such as total abundance, number of taxa and species frequency, to evaluate the possible ecological impact of the fish cages in the Bay of Ragusa.
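Three of the four biotic indices named above have simple closed forms; the sketch below implements Shannon H' (natural log), Pielou's evenness J' = H'/ln(S), and Simpson's diversity 1 - sum(p_i^2) for a single sample of taxon counts. BENTIX is omitted because it requires the sensitive/tolerant group assignments, which the abstract does not give.

```python
import numpy as np

def diversity_indices(counts):
    """Shannon H' (natural log), Pielou's evenness J' = H'/ln(S),
    and Simpson's diversity 1 - sum(p_i^2) for one sample of taxon counts.
    Zero counts are dropped before computing relative abundances p_i."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    shannon = -np.sum(p * np.log(p))
    # Evenness is undefined for a single taxon; report 0 in that case
    pielou = shannon / np.log(len(p)) if len(p) > 1 else 0.0
    simpson = 1.0 - np.sum(p ** 2)
    return shannon, pielou, simpson
```

Applied per station, such indices make the distance-gradient comparison in the study quantitative: a depressed H' or J' near the cages relative to distant transects would indicate organic enrichment effects.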

Keywords: Bentix index, benthic community, invertebrates, aquaculture, Ragusa bay

Procedia PDF Downloads 100