Search results for: coding complexity metric mccabe
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2523


873 The Inclusive Human Trafficking Checklist: A Dialectical Measurement Methodology

Authors: Maria C. Almario, Pam Remer, Jeff Resse, Kathy Moran, Linda Theander Adam

Abstract:

The identification of victims of human trafficking and the consequent provision of services are characterized by a significant disconnect between the estimated prevalence of this issue and the number of cases identified. This poses a tremendous problem for human rights advocates, as it prevents data collection, information sharing, allocation of resources, and opportunities for international dialogue. The current paper introduces the Inclusive Human Trafficking Checklist (IHTC) as a measurement methodology with theoretical underpinnings derived from dialectic theory. The presence of human trafficking in a person’s life is conceptualized as a dynamic and dialectic interaction between vulnerability and exploitation. The current paper explores the operationalization of exploitation and vulnerability, evaluates the metric qualities of the instrument, evaluates whether there are differences in assessment based on the participant’s profession, level of knowledge, and training, and assesses whether users of the instrument perceive it as useful. A total of 201 participants were asked to rate three vignettes predetermined by experts to qualify either as human trafficking cases or not. The participants were placed in three conditions: business as usual, and utilization of the IHTC with or without training. The results revealed a statistically significant level of agreement between the experts’ diagnoses and the application of the IHTC, with a 40% improvement in identification when compared with the business-as-usual condition. While there was an improvement in identification in the group with training, the difference was found to have a small effect size. Participants who utilized the IHTC showed an increased ability to identify elements of identity-based vulnerabilities as well as elements of fraud, which, according to the results, are distinctive variables in cases of human trafficking.
In terms of perceived utility, the results revealed higher mean scores for the groups utilizing the IHTC when compared to the business-as-usual condition. These findings suggest that the IHTC improves appropriate identification of cases and that it is perceived as a useful instrument. The application of the IHTC as a multidisciplinary instrument that can be utilized in legal and human-services settings is discussed as a pivotal piece in helping victims restore their sense of dignity and advocate for legal, physical, and psychological reparations. It is noteworthy that this study was conducted with a sample in the United States and later re-tested in Colombia. The implications of the instrument for treatment conceptualization and intervention in human trafficking cases are discussed as opportunities for enhancement of victim well-being, restoration, engagement, and activism. With the idea that what is personal is also political, we believe that careful observation and data collection in specific cases can inform new areas of human rights activism.

Keywords: exploitation, human trafficking, measurement, vulnerability, screening

Procedia PDF Downloads 331
872 Origamic Forms: A New Realm in Improving Acoustical Environment

Authors: Mostafa Refat Ismail, Hazem Eldaly

Abstract:

The adaptation of architectural design to building function is increasingly needed in contemporary designs, especially with the great progress in design methods and tools. This, in turn, requires great flexibility in design strategies, as well as a wider spectrum of space settings, to achieve the environment that special activities require. Acoustics is an essential factor influencing cognitive acts and behavior as well as, at the extreme end, physical well-being inside a space. The complexity of this constraint is compounded by the extended geometric dimensions of multipurpose halls, making acoustic adequacy a great concern that cannot easily be achieved for each purpose. To achieve a performance-oriented acoustic environment, various parametrically shaped false ceilings based on the origami folding notion are simulated. These parametric origami shapes are able to fold and unfold, forming an interactive structure that changes the mutual acoustic environment according to the geometric shapes' positions and their changing exposed surface areas. The mobility of the facets in the origami surface can stretch the range from a completely plain surface to an unfolded element where a considerable amount of absorption is added to the space. The behavior of the parametric origami shapes is modeled employing a ray-tracing computer simulation package for various shape topologies. The conclusion shows a great variation in acoustical performance due to the variation in the folding faces of the origami surfaces, which causes different reflections and, consequently, large variations in decay curves.

Keywords: parametric, origami, acoustics, architecture

Procedia PDF Downloads 285
871 Impact of Emotional Intelligence of Principals in High Schools on Teachers' Conflict Management: A Case Study on Secondary Schools, Tehran, Iran

Authors: Amir Ahmadi, Hossein Ahmadi, Alireza Ahmadi

Abstract:

Emotional Intelligence (EI) has been defined as the ability to empathize, persevere, control impulses, communicate clearly, make thoughtful decisions, solve problems, and work with others in a way that earns friends and success. These abilities allow an individual to recognize and regulate emotion, develop self-control, set goals, develop empathy, resolve conflicts, and develop the skills needed for leadership and effective group participation. Due to the increasing complexity of organizations and the differing ways of thinking, attitudes, and beliefs of individuals, conflict, as an important part of organizational life, has been examined frequently. The main point is that conflict in an organization is not necessarily harmful; it can foster creativity, promote innovation, and may prevent the waste of the organization's energy and resources. The purpose of this study was to investigate the relation between principals' emotional intelligence, as one of the factors affecting conflict management, and conflict management among teachers. This relation was analyzed through cluster sampling with a sample size of 120 individuals. The results of the study showed that, at the 95% level of confidence, two secondary hypotheses (i.e., the relation between the emotional intelligence of principals and the use of competition and cooperation strategies of conflict management among teachers) were confirmed, but the other three secondary hypotheses (i.e., the relation between the emotional intelligence of principals and the use of avoidance, adaptation, and adaptability strategies of conflict management among teachers) were rejected. The primary hypothesis (i.e., the relation between the emotional intelligence of principals and conflict management among teachers) was supported.

Keywords: emotional intelligence, conflict, conflict management, strategies of conflict management

Procedia PDF Downloads 358
870 Language in Court: Ideology, Power and Cognition

Authors: Mehdi Damaliamiri

Abstract:

Undoubtedly, the power of language is hardly a new topic; indeed, the persuasive power of language accompanied by ideology has long been recognized in different aspects of life. The two-and-a-half-thousand-year-old Bisitun inscriptions in Iran, proclaiming the victories of the Persian king Darius, are considered by some historians to have been an early example of the use of propaganda. Added to this, the modern age is the true cradle of fully-fledged ideologies and the ongoing process of centrifugal ideologization. The most visible work on ideology today within the field of linguistics is Critical Discourse Analysis (CDA). The focus of CDA is on “uncovering injustice, inequality, taking sides with the powerless and suppressed” and making “mechanisms of manipulation, discrimination, demagogy, and propaganda explicit and transparent.” A possible way of relating language to ideology is to propose that ideology and language are inextricably intertwined. From this perspective, language is always ideological, and ideology depends on language. All language use involves ideology, and so ideology is ubiquitous, in our everyday encounters as much as in the struggle for power within and between nation-states and social statuses. At the same time, ideology requires language. Its key characteristics, its power and pervasiveness, its mechanisms for continuity and for change, all come out of the inner organization of language. The two phenomena are homologous: they share the same evolutionary trajectory. To get a more robust portrait of power and ideology, we need to examine their potential place in linguistic structure and consider how such structures pattern in terms of the functional elements that organize meanings in the clause. This is based on the belief, which has become immensely popular, that all grammatical knowledge, including syntactic knowledge, is stored mentally as constructions.
When the structure of the clause is taken into account, power and ideology have a preference for Complement over Subject and Adjunct. The Subject is a central interpersonal element in discourse: it is one of the two elements that form the central interactive nub of a proposition. Conceptually, there are countless ways of construing a given event; linguistically, a variety of grammatical devices are usually available as alternate means of coding a given conception, such as political crime and corruption. In the theory of construal, then, which, like transitivity in Halliday, makes options available, Cognitive Linguistics can offer a cognitive account of ideology in language, where ideology is made possible by the choices a language allows for representing the same material situation in different ways. The possibility of promoting alternative construals of the same reality means that any particular choice in representation is always ideologically constrained or motivated and indicates the perspective and interests of the text-producer.

Keywords: power, ideology, court, discourse

Procedia PDF Downloads 164
869 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts

Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti

Abstract:

Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in industrial manufacturing. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-code file. Adaptive Mesh Refinement (AMR) based on the octree strategy is used to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity, cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
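The plane-stress form of the Tsai-Wu criterion mentioned in this abstract can be sketched as follows. This is an illustrative implementation, not the authors' code; the strength values are hypothetical, and the interaction coefficient F12 uses a common default estimate rather than measured biaxial data:

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index for an orthotropic ply/road.

    Strengths (Xt, Xc, Yt, Yc, S) are positive magnitudes; failure is
    predicted when the returned index reaches 1.0.
    """
    F1 = 1.0 / Xt - 1.0 / Xc
    F2 = 1.0 / Yt - 1.0 / Yc
    F11 = 1.0 / (Xt * Xc)
    F22 = 1.0 / (Yt * Yc)
    F66 = 1.0 / S ** 2
    # Common default for the interaction coefficient when no biaxial
    # test data are available:
    F12 = -0.5 * math.sqrt(F11 * F22)
    return (F1 * s1 + F2 * s2
            + F11 * s1 ** 2 + F22 * s2 ** 2 + F66 * t12 ** 2
            + 2.0 * F12 * s1 * s2)

# Hypothetical strengths (MPa) for a printed thermoplastic road:
idx = tsai_wu_index(s1=30.0, s2=5.0, t12=8.0,
                    Xt=50.0, Xc=40.0, Yt=30.0, Yc=35.0, S=20.0)
print(idx)  # below 1.0: the stress state lies inside the failure envelope
```

The linear terms are what let the criterion distinguish tensile from compressive strengths, which is why it suits FFF parts.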

Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization

Procedia PDF Downloads 64
868 Study and Simulation of the Thrust Vectoring in Supersonic Nozzles

Authors: Kbab H, Hamitouche T

Abstract:

In recent years, significant progress has been made in the field of aerospace propulsion and propulsion systems. These developments are associated with efforts to enhance the accuracy of the analysis of aerothermodynamic phenomena in the engine, in particular the flow in the nozzles used. One of the most remarkable processes in this field is thrust vectoring by means of devices able to orient the thrust vector and control the deflection of the exit jet in the engine nozzle. In the proposed study, we are interested in fluidic thrust vectoring using a secondary injection in the nozzle divergent. This fluid injection causes complex phenomena, such as boundary layer separation, which generates a shock wave in the primary jet upstream of the zone where the primary and secondary jets interact. This causes the deviation of the main flow, and therefore of the thrust vector, with respect to the nozzle axis. In the modeling of fluidic thrust vectoring, various parameters can be used: the Mach numbers of the primary jet and the injected fluid, the ratio of total pressures, the injection rate, the thickness of the upstream boundary layer, the injector position in the divergent part, and the nozzle geometry are decisive factors in this type of phenomenon. The complexity of these phenomena challenges researchers to understand the physics of the turbulent boundary layer encountered in supersonic nozzles, as well as the calculation of its thickness and the friction forces induced on the walls. The present study aims to numerically simulate thrust vectoring by secondary injection using ANSYS Fluent, and then to analyze and validate the results and performance obtained (deflection angle, efficiency, etc.), which are compared with those obtained by other authors.
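The deflection angle discussed here is typically recovered in post-processing from the integrated wall-pressure forces. A minimal sketch of that step, with hypothetical force values and a figure of merit whose exact definition varies between authors:

```python
import math

def deflection_angle_deg(f_side, f_axial):
    """Thrust-vector deflection angle from integrated force components."""
    return math.degrees(math.atan2(f_side, f_axial))

def vectoring_efficiency(delta_deg, spr):
    """Degrees of deflection per unit of secondary pressure ratio (SPR).

    One common figure of merit in fluidic-TVC studies; definitions
    differ from author to author.
    """
    return delta_deg / spr

# Hypothetical force integrals (N) from a CFD solution:
delta = deflection_angle_deg(f_side=350.0, f_axial=4200.0)
print(round(delta, 2), vectoring_efficiency(delta, spr=0.6))
```

Using `atan2` keeps the sign of the deflection consistent when the side force reverses direction.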

Keywords: CD nozzle, TVC, SVC, NPR, SPR, CFD

Procedia PDF Downloads 136
867 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti

Abstract:

This paper introduces the applicability of underwater photogrammetric survey under challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry was attempted on underwater archaeological sites at least as early as the 1970s, and today the production of traditional 3D models is becoming a common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. Therefore, it tends to be applied in bright environments when underwater visibility is > 1 m, reducing its implementation on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and improvement in optical technology, ideal for darker environments. Such developments, in tandem with powerful computing systems for processing, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It proves that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record.

Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey

Procedia PDF Downloads 238
866 A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions

Authors: Omar Boutkhoum, Mohamed Hanine, Abdessadek Bendarag

Abstract:

Sustainable economic growth is nowadays driving firms to extend toward the adoption of many green supply chain management (GSCM) solutions. However, the evaluation and selection of these solutions requires very careful decision-making, involving complexity owing to the presence of various associated factors. To resolve this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution), and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and development of new sustainable strategies for industrial organizations. Due to the varied importance of the selected criteria, FAHP is used to identify the evaluation criteria and assign the importance weights for each criterion, while the TOPSIS and PROMETHEE methods employ these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions related to the selection problem of GSCM solutions.
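The TOPSIS stage described above (weighted criteria in, ranked alternatives out) can be sketched in a few lines. This is a generic textbook implementation, not the authors' model; the alternatives, weights, and criteria below are hypothetical stand-ins for FAHP-derived inputs:

```python
import math

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS ranking sketch.

    matrix  : list of alternatives, each a list of criterion scores
    weights : criterion weights (e.g. from FAHP)
    benefit : per criterion, True if higher scores are better
    Returns closeness coefficients (higher = better alternative).
    """
    n_crit = len(weights)
    # Vector-normalize each column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_plus = math.sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_minus = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

# Three hypothetical GSCM solutions scored on cost (lower is better) and
# environmental benefit (higher is better), weighted 0.4 / 0.6:
scores = topsis([[200, 7], [150, 5], [300, 9]],
                weights=[0.4, 0.6], benefit=[False, True])
best = scores.index(max(scores))
```

PROMETHEE would consume the same weighted inputs but rank via pairwise preference flows instead of distances to ideal points, which is exactly the comparison the paper sets up.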

Keywords: GSCM solutions, multi-criteria analysis, decision support system, TOPSIS, FAHP, PROMETHEE

Procedia PDF Downloads 164
865 Framework for Enhancing Water Literacy and Sustainable Management in Southwest Nova Scotia

Authors: Etienne Mfoumou, Mo Shamma, Martin Tango, Michael Locke

Abstract:

Water literacy is essential for addressing emerging water management challenges in southwest Nova Scotia (SWNS), where growing concerns over water scarcity and sustainability have highlighted the need for improved educational frameworks. Current approaches often fail to fully represent the complexity of water systems, focusing narrowly on the water cycle while neglecting critical aspects such as groundwater infiltration and the interconnectedness of surface and subsurface water systems. To address these gaps, this paper proposes a comprehensive framework for water literacy that integrates the physical dimensions of water systems with key aspects of understanding, including processes, energy, scale, and human dependency. Moreover, a suggested tool to enhance this framework is a real-time hydrometric data map supported by a network of water level monitoring devices deployed across the province. These devices, particularly for monitoring dug wells, would provide critical data on groundwater levels and trends, offering stakeholders actionable insights into water availability and sustainability. This real-time data would facilitate deeper understanding and engagement with local water issues, complementing the educational framework and empowering stakeholders to make informed decisions. By integrating this tool, the proposed framework offers a practical, interdisciplinary approach to improving water literacy and promoting sustainable water management in SWNS.

Keywords: water education, water literacy, water management, water systems, Southwest Nova Scotia

Procedia PDF Downloads 33
864 Developing a Framework to Aid Sustainable Assessment in Indian Buildings

Authors: P. Amarnath, Albert Thomas

Abstract:

Buildings qualify as major consumers of energy and resources, urging designers, architects, and policy makers to place a great deal of effort into achieving and implementing sustainable building strategies in construction. Green building rating systems help a great deal in this by measuring the effectiveness of these strategies, along with the escalation of building performance from social, environmental, and economic perspectives, and by guiding the construction of new sustainable buildings. However, for a country like India, an enormous population and its rapid rate of growth impose an increasing burden on the country's limited and continuously degrading natural resource base, which also includes the land available for construction. In general, the number of sustainability-rated buildings in India is very small, primarily due to the complexity and rigid nature of the assessment systems/regulations, which restrict stakeholders and designers in the proper implementation and utilization of these rating systems. This paper aims to introduce a data-driven and user-friendly framework that cross-compares the present prominent green building rating systems, such as LEED, BREEAM, and GRIHA, and subsequently helps users rate their proposed building designs as per the regulations of these assessment frameworks. This framework is validated using input data collected from green buildings constructed globally. The proposed system has prospects to encourage users to test the efficiency of various sustainable construction practices and thereby promote more sustainable buildings in the country.

Keywords: BREEAM, GRIHA, green building rating systems, LEED, sustainable buildings

Procedia PDF Downloads 140
863 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. However, resource scheduling is usually an NP-hard problem, so we cannot find a general solution. Some optimization algorithms exist, such as genetic algorithms and ant colony optimization, but the large scale of distributed systems makes these traditional optimization algorithms challenging to apply. Heuristic and machine learning algorithms are usually applied in this situation to ease the computing load. As a result, we review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using machine learning methods, we try to find the important factors that influence the performance of distributed system computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling, and discusses the challenges and improvement directions for DRL-based resource scheduling algorithms.
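The RL formulation the survey describes (states, actions over resources, rewards tied to scheduling quality) can be illustrated with a deliberately simplified tabular stand-in for the deep version. Everything here is a toy of my own construction, not the paper's method: state is just the task index, actions pick a node, and the reward penalizes the chosen node's load so the learned policy balances work:

```python
import random

def q_learning_scheduler(task_loads, n_nodes, episodes=2000, seed=0):
    """Tabular Q-learning toy for assigning tasks to compute nodes."""
    rng = random.Random(seed)
    q = [[0.0] * n_nodes for _ in task_loads]
    alpha, gamma, eps = 0.1, 0.9, 0.2
    for _ in range(episodes):
        node_load = [0.0] * n_nodes
        for t, load in enumerate(task_loads):
            # Epsilon-greedy action selection over nodes.
            if rng.random() < eps:
                a = rng.randrange(n_nodes)
            else:
                a = max(range(n_nodes), key=lambda i: q[t][i])
            node_load[a] += load
            reward = -node_load[a]  # discourage piling work onto one node
            nxt = max(q[t + 1]) if t + 1 < len(task_loads) else 0.0
            q[t][a] += alpha * (reward + gamma * nxt - q[t][a])
    # Greedy rollout of the learned policy.
    assignment, node_load = [], [0.0] * n_nodes
    for t, load in enumerate(task_loads):
        a = max(range(n_nodes), key=lambda i: q[t][i])
        assignment.append(a)
        node_load[a] += load
    return assignment, max(node_load)

assignment, makespan = q_learning_scheduler([4.0, 3.0, 2.0, 1.0], n_nodes=2)
```

A deep variant replaces the Q-table with a network (recurrent, in the paper's proposal) so the state can encode queue histories and resource utilization rather than a bare task index.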

Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence

Procedia PDF Downloads 113
862 Towards Green(er) Cities: The Role of Spatial Planning in Realising the Green Agenda

Authors: Elizelle Juaneé Cilliers

Abstract:

The green hype is becoming stronger within various disciplines, modern practices, and academic thinking, reinforced by concepts such as eco-health, eco-tourism, eco-cities, and eco-engineering. There is currently also an expanded scientific understanding of the value and benefits of green infrastructure, for both communities and their host cities, linked to broader sustainability and resilience thinking. The integration and implementation of green infrastructure as part of spatial planning approaches and municipal planning are, however, more complex, especially in South Africa, compounded by limited budgets and human resources, development pressures, inequities in green space availability, and political legacies of the past. The prevailing approach to spatial planning further contributes to this complexity, linked to misguided perceptions of the function and value of green infrastructure. As such, green spaces are often considered a luxury, and green infrastructure a costly alternative, resulting in green networks being susceptible to land-use changes and under-prioritized in local authority decision-making. Spatial planning, in this sense, may well be a valuable tool to realise the green agenda, encapsulating various initiatives of sustainability as provided by a range of disciplines. This paper aims to clarify the importance and value of green infrastructure planning as a component of spatial planning approaches, in order to inform and encourage local authorities to embed sustainability thinking into city planning and decision-making. It reflects on the decisive role of land-use management in guiding the green agenda and refers to some recent planning initiatives. Lastly, it calls for transdisciplinary planning approaches to build a case for green(er) cities.

Keywords: green infrastructure, spatial planning, transdisciplinary, integrative

Procedia PDF Downloads 255
861 Sustainable Production of Tin Oxide Nanoparticles: Exploring Synthesis Techniques, Formation Mechanisms, and Versatile Applications

Authors: Yemane Tadesse Gebreslassie, Henok Gidey Gebretnsae

Abstract:

Nanotechnology has emerged as a highly promising field of research with wide-ranging applications across various scientific disciplines. In recent years, tin oxide has garnered significant attention due to its intriguing properties, particularly when synthesized in the nanoscale range. While numerous physical and chemical methods exist for producing tin oxide nanoparticles, these approaches tend to be costly, energy-intensive, and involve the use of toxic chemicals. Given the growing concerns regarding human health and environmental impact, there has been a shift towards developing cost-effective and environmentally friendly processes for tin oxide nanoparticle synthesis. Green synthesis methods utilizing biological entities such as plant extracts, bacteria, and natural biomolecules have shown promise in successfully producing tin oxide nanoparticles. However, scaling up the production to an industrial level using green synthesis approaches remains challenging due to the complexity of biological substrates, which hinders the elucidation of reaction mechanisms and formation processes. Thus, this review aims to provide an overview of the various sources of biological entities and methodologies employed in the green synthesis of tin oxide nanoparticles, as well as their impact on nanoparticle properties. Furthermore, this research delves into the strides made in comprehending the mechanisms behind the formation of nanoparticles as documented in existing literature. It also sheds light on the array of analytical techniques employed to investigate and elucidate the characteristics of these minuscule particles.

Keywords: nanotechnology, tin oxide, green synthesis, formation mechanisms

Procedia PDF Downloads 55
860 Foreign Languages and Employability in the European Union

Authors: Paulina Pietrzyk-Kowalec

Abstract:

This paper presents the phenomenon of multilingualism becoming the norm rather than the exception in the European Union. It also seeks to describe the correlation between the command of foreign languages and employability. It is evident that the challenges today's societies face regarding employability and the realities of the current labor market are increasingly diversified. Thus, it is one of the crucial tasks of higher education to prepare its students to face this kind of complexity, understand its nuances, and have the capacity to adapt effectively to situations that are common in corporations based in EU countries. From this point of view, assessing the impact that the command of foreign languages by European university students could have on numerous business sectors becomes vital. It also involves raising the awareness of future professionals to make them understand the importance of mastering communicative skills in foreign languages that will meet the requirements of their prospective employers. The direct connection between higher education institutions and the world of business also allows companies to realize that they should rethink their recruitment and human resources procedures in order to take into account the importance of foreign languages. This article focuses on the objective of the multilingualism policy developed by the European Commission, which is to enable young people to master at least two foreign languages, a skill crucial for their future careers. The article puts emphasis on the crucial connection between the research conducted in higher education institutions and the business sector in order to reduce current qualification gaps.

Keywords: cross-cultural communication, employability, human resources, language attitudes, multilingualism

Procedia PDF Downloads 135
859 Systems Engineering Management Using Transdisciplinary Quality System Development Lifecycle Model

Authors: Mohamed Asaad Abdelrazek, Amir Taher El-Sheikh, M. Zayan, A.M. Elhady

Abstract:

The successful realization of complex systems depends not only on technology issues and the process for implementing them, but on management issues as well. Managing the system development lifecycle requires technical management; systems engineering management is this technical management, and it is accomplished by incorporating many activities. The three major activities are development phasing, the systems engineering process, and lifecycle integration. Systems engineering management activities are performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking the development activities, new ways to achieve systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) Model has been modified and integrated with Quality Function Deployment (QFD). Hereinafter, the systematic approach is named the Transdisciplinary Quality System Development Lifecycle (TQSDL) Model. QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on Axiomatic Design, developed by Suh, which is applicable to all designs: products, processes, systems, and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that customer requirements are fulfilled and satisfies all the systems engineering manager's roles and activities.

Keywords: axiomatic design, quality function deployment, systems engineering management, system development lifecycle

Procedia PDF Downloads 364
858 Utilization of Cervical Cancer Screening Among HIV Infected Women in Nairobi, Kenya

Authors: E. Njuguna, S. Ilovi, P. Muiruri, K. Mutai, J. Kinuthia, P. Njoroge

Abstract:

Introduction: Cervical cancer is the commonest cause of cancer-related morbidity and mortality among women in developing countries in Sub-Saharan Africa. Screening for cervical cancer in all women, regardless of HIV status, is crucial for the early detection of cancer of the cervix, when treatment is most effective in curing the disease. It is particularly important to screen HIV-infected women, as they are more at risk of developing the disease and of faster progression once infected with HPV (Human Papillomavirus). We aimed to determine the factors affecting the utilization of cervical cancer screening among HIV-infected women above 18 years of age at the Kenyatta National Hospital (KNH) Comprehensive Care Center (CCC). Materials and Methods: A cross-sectional mixed quantitative and qualitative study involving randomly and purposefully selected HIV-positive females was conducted. Qualitative data collection involved 4 focus group discussions with eligible female participants, while quantitative data were acquired through one-to-one, interviewer-administered structured questionnaires. The outcome variable was the utilization of cervical cancer screening. Data were entered into an Access database and analyzed using Stata version 11.1. Qualitative data were analyzed after coding for significant clauses and transcribing to determine emerging themes. Results: We enrolled a total of 387 patients, mean age (IQR) 40 years (36-44). Cervical cancer screening utilization was 46% despite a health care provider recommendation rate of 85%. The screening results were reported as normal in 72 of 81 (88.9%) and abnormal in 7 of 81 (8.6%) of the cases. Those who did not know their result were 2 of 81 (2.5%).
Patients were less likely to utilize the service with an increasing number of years attending the clinic (OR 0.9, 95% CI 0.86-0.99, p-value 0.02), but more likely to utilize it if a staff member made a recommendation (OR 10, 95% CI 4.2-23.9, p < 0.001) and if cervical screening had been done before joining the KNH CCC (OR 2.9, 95% CI 1.7-4.9, p < 0.001). Similarly, they were more likely to rate the services on cervical cancer screening as good (OR 5.0, 95% CI 1.7-3.4, p < 0.001) and very good (OR 8.1, 95% CI 2.5-6.1, p < 0.001) if they had utilized the service. The main barrier themes emerging from the qualitative data included fear of screening due to excessive pain or bleeding, lack of proper communication on screening procedures, and increased waiting time. Conclusions: Utilization of cervical cancer screening services was low despite health care provider recommendations. Patients' socio-demographic characteristics did not influence whether or not they utilized the services, indicating the important role of the health care provider in the referral and provision of the service.
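The odds ratios reported above come from a standard epidemiological calculation. As a minimal sketch, the odds ratio and its Wald 95% confidence interval can be computed from a 2x2 table; the counts below are hypothetical illustrations, not the study's actual data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only: screened vs. not screened,
# by whether a staff recommendation was made.
or_, lo, hi = odds_ratio_ci(160, 170, 18, 39)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

An OR above 1 with a confidence interval excluding 1, as for the staff-recommendation variable in the study, indicates a statistically significant association.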

Keywords: cervical, cancer, HIV, women, comprehensive care center

Procedia PDF Downloads 277
857 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experience, especially in terms of communication reliability, high data rates, and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations and seems able to fulfill the demands on the future spectrum. CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunication-Advanced (IMT-Advanced) requirement of a 1 Gb/s peak data rate, 3GPP introduced the CA scheme, which sustains high data rates by using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity enhancement. It concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for a cell radius longer than 1800 m (at a PLR threshold of 2%).
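The arithmetic behind the 1 Gb/s target can be sketched directly: CA in LTE-Advanced combines up to five 20 MHz component carriers into 100 MHz of aggregated bandwidth. The spectral efficiency figure below is an assumption for illustration, not a value from the paper.

```python
# Sketch: peak data rate from aggregated component carriers.
def peak_rate_bps(carrier_bw_hz, spectral_eff_bps_per_hz):
    """Aggregate peak rate = sum over component carriers of
    bandwidth times spectral efficiency."""
    return sum(bw * eff for bw, eff in zip(carrier_bw_hz, spectral_eff_bps_per_hz))

carriers = [20e6] * 5   # five 20 MHz component carriers (100 MHz total)
eff = [10.0] * 5        # ~10 bit/s/Hz assumed (illustrative, e.g. high-order MIMO)
rate = peak_rate_bps(carriers, eff)
print(f"Aggregated bandwidth: {sum(carriers)/1e6:.0f} MHz, "
      f"peak rate = {rate/1e9:.1f} Gb/s")
```

With these assumed figures, the aggregated 100 MHz reaches exactly the 1 Gb/s IMT-Advanced peak rate requirement mentioned in the abstract.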

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 200
856 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to overcome uncertainty and the inability to meet customers' requests under such conditions, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid an increase in holding cost due to excess SSL or in shortage cost due to too low an SSL. This paper uses soft-computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, and uses dynamic fuzzy logic to obtain the best SSL as an output. In this model, demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best safety stock level. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in safety stock level.
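The structure described above, linguistic inputs mapped by inference rules to an SSL output, follows the general Mamdani fuzzy pattern. A minimal pure-Python sketch of that pattern is shown below; the membership shapes, rules, and numeric ranges are illustrative assumptions, not the paper's actual model.

```python
# Minimal Mamdani-style fuzzy sketch: three inputs on a 0-1 scale
# mapped by two rules to a safety stock level (0..1000 units).
def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def safety_stock(demand_stability, material_avail, on_hand):
    # Rule 1 (low risk): stable demand AND available material -> small SSL.
    low_risk = min(tri(demand_stability, 0.5, 1.0, 1.5),
                   tri(material_avail, 0.5, 1.0, 1.5))
    # Rule 2 (high risk): unstable demand OR low on-hand stock -> large SSL.
    high_risk = max(tri(demand_stability, -0.5, 0.0, 0.5),
                    tri(on_hand, -0.5, 0.0, 0.5))
    # Centroid defuzzification over output sets centred at 200 and 800.
    num = den = 0.0
    for x in range(0, 1001, 10):
        mu = max(min(low_risk, tri(x, 0, 200, 400)),
                 min(high_risk, tri(x, 600, 800, 1000)))
        num += mu * x
        den += mu
    return num / den if den else 500.0

print(round(safety_stock(0.2, 0.9, 0.1)))  # unstable demand, low stock -> large SSL
```

The paper's "dynamic" aspect would correspond to re-evaluating such rules as the three inputs change over time.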

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 126
855 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, as well as the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied over the past few years in healthcare. Co-payments lead to rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of those services and products for the end-user. This analysis of hospital practices in particular, and co-payment strategies in general, covered all European regions and identified four reference countries that repeatedly apply this tool, each with a different approach. The structure, content, and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) achieve more compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance and so were not taken as a standard for the future design of co-payment strategies. In this sense, we can see in co-payments a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: clinical pharmacy, co-payments, healthcare, medicines

Procedia PDF Downloads 251
854 INRAM-3DCNN: Multi-Scale Convolutional Neural Network Based on Residual and Attention Module Combined with Multilayer Perceptron for Hyperspectral Image Classification

Authors: Jianhong Xiang, Rui Sun, Linyu Wang

Abstract:

In recent years, thanks to the continuous improvement of deep learning theory, the Convolutional Neural Network (CNN) has shown superior performance in research on Hyperspectral Image (HSI) classification. Since HSI has rich spatial-spectral information, utilizing only a single-dimensional or single-size convolutional kernel limits the detailed feature information received by the CNN, which in turn limits the classification accuracy. In this paper, we design a multi-scale CNN with an MLP based on residual and attention modules (INRAM-3DCNN) for the HSI classification task. We propose using multiple 3D convolutional kernels to extract packet feature information and fully learn the spatial-spectral features of HSI, while designing residual 3D convolutional branches to avoid the decline in classification accuracy due to network degradation. Secondly, we design a 2D Inception module with a joint channel attention mechanism to quickly extract key spatial feature information at different scales of the HSI and reduce the complexity of the 3D model. Owing to the high parallel processing capability and nonlinear global action of the Multilayer Perceptron (MLP), we use it in combination with the preceding CNN structure for the final classification step. The experimental results on two HSI datasets show that the proposed INRAM-3DCNN method has superior classification performance and performs the classification task excellently.
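The core multi-scale idea, running kernels of several sizes over the same input and concatenating the results, can be illustrated independently of the full network. The sketch below shows it in 1D pure Python under invented data; the paper's actual model uses 3D convolutions, residual branches, and attention, none of which are reproduced here.

```python
# Illustrative sketch (not the paper's network): multi-scale convolution
# captures features at different scales by applying kernels of several
# sizes and concatenating their responses.
def conv1d(signal, kernel):
    """'Valid' 1D convolution (no padding)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def multi_scale_features(signal, kernels):
    """Concatenate the valid-convolution outputs of each kernel scale."""
    feats = []
    for kern in kernels:
        feats.extend(conv1d(signal, kern))
    return feats

spectrum = [0.1, 0.4, 0.9, 0.8, 0.3, 0.2, 0.6, 0.7]  # toy spectral vector
kernels = [[1.0], [0.5, 0.5], [1/3, 1/3, 1/3]]       # scales 1, 2, 3
print(len(multi_scale_features(spectrum, kernels)))   # 8 + 7 + 6 = 21
```

In the actual network this concatenation happens per layer over 3D feature maps, with the channel attention module then reweighting the concatenated channels.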

Keywords: INRAM-3DCNN, residual, channel attention, hyperspectral image classification

Procedia PDF Downloads 81
853 Characterization of Optical Systems for Intraocular Projection

Authors: Charles Q. Yu, Victoria H. Fan, Ahmed F. Al-Qahtani, Ibraim Viera

Abstract:

Introduction: Over 12 million people are blind due to opacity of the cornea, the clear tissue forming the front of the eye. Current methods use plastic implants to produce a clear optical pathway into the eye but are limited by a high rate of complications. New implants utilizing completely inside-the-eye projection technology can overcome blindness due to scarring of the eye by producing images on the retina without the need for a clear optical pathway into the eye, and may be free of the complications of traditional treatments. However, the interior of the eye is a challenging location for the design of optical focusing systems that can produce a sufficiently high-quality image. No optical focusing systems have previously been characterized for this purpose. Methods: Three optical focusing systems for intraocular (inside-the-eye) projection were designed and then modeled with ray tracing software: a pinhole system, a planoconvex system, and an achromatic system. These were then constructed using off-the-shelf components and tested in the laboratory. Weight, size, magnification, depth of focus, image quality, and brightness were characterized. Results: Image quality increased with the complexity of the system design, as did weight and size. A dual achromatic doublet optical system produced the highest image quality. The visual acuity equivalent achieved with this system was better than 20/200. Its weight was less than that of the natural human crystalline lens. Conclusions: We demonstrate for the first time that high-quality images can be produced by optical systems sufficiently small and light to be implanted within the eye.
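The magnification characterization above rests on first-order (thin-lens) optics. A hedged sketch of that relation is shown below; the focal length and object distance are illustrative assumptions, not the implant's actual specification.

```python
# Thin-lens sketch: 1/f = 1/d_o + 1/d_i, magnification m = -d_i / d_o.
# Numbers are illustrative only, not from the paper.
def image_distance(f_mm, d_o_mm):
    """Image distance from the thin-lens equation."""
    return 1.0 / (1.0 / f_mm - 1.0 / d_o_mm)

def magnification(f_mm, d_o_mm):
    return -image_distance(f_mm, d_o_mm) / d_o_mm

f, d_o = 10.0, 15.0  # assumed focal length and object distance (mm)
print(f"image distance = {image_distance(f, d_o):.1f} mm, "
      f"magnification = {magnification(f, d_o):.2f}")
```

Tighter packaging constraints inside the eye push designs toward shorter focal lengths, which is one reason the more complex (achromatic) systems traded size and weight for image quality.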

Keywords: focusing, projection, blindness, cornea, achromatic, pinhole

Procedia PDF Downloads 132
852 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering using a new method. Signal processing (Fast Fourier Transform (FFT), inverse Fast Fourier Transform (iFFT), and Bessel functions) is widely applied to obtain information with high accuracy, and is usually implemented on general-purpose processors. Our interest focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements, since general-purpose processors are not efficient for signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system showed that acoustic spectra are computed up to 20 times faster than with the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in terms of real-time operation and energy efficiency.
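The FFT/iFFT round trip at the heart of the processing chain can be sketched with NumPy as a software stand-in for the FPGA pipeline. The signal below is synthetic, not a measured backscattered echo.

```python
import numpy as np

fs = 1000.0                                # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)            # 1 s of samples
echo = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.fft(echo)                # forward FFT: time -> frequency
recovered = np.fft.ifft(spectrum).real     # inverse FFT back to time domain

# Round-trip error is near machine precision in floating point, far below
# the ~1e-3 absolute error reported for the fixed-point FPGA implementation.
print(np.max(np.abs(recovered - echo)))
```

The ~10⁻³ error quoted in the abstract reflects the FPGA's reduced-precision arithmetic, not a limitation of the FFT itself.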

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 79
851 The Influence of Human Factors Education on the Irish Registered Pre-Hospital Practitioner within the National Ambulance Service

Authors: Desmond Wade, Alfredo Ormazabal

Abstract:

Background: Ever since it commenced its registration process for pre-hospital practitioners in the year 2000 through the Irish Government Statutory Instrument process (SI 109 of 2000), the approach to educating its professionals has changed drastically. The progression from the traditional behaviourist to the current constructivist approach has been based on experiences from other sectors and industries, nationally and internationally. Today, the delivery of a safe and efficient ambulance service depends heavily on its practitioners' range of technical skills, academic knowledge, and overall competences. As these increase, so does the level of complexity of paramedics' everyday practice. This has made it inevitable to consider the 'Human Factor' as a source of potential risk, and has led formative institutions like the National Ambulance Service College to include it in their curriculum. Methods: This paper used a mixed-method approach in which both an online questionnaire and a set of semi-structured interviews were the sources of primary data. This data was analyzed using qualitative and quantitative methods. Conclusions: The evidence presented leads to the conclusion that there is a considerable lack of Human Factors education in the National Ambulance Service, and that levels of understanding of how to manage Human Factors in practice vary across its spectrum. Paramedic practitioners in Ireland seem to understand that the responsibility for patient care lies with the team, rather than with the most hierarchically senior practitioner present at the scene.

Keywords: human factors, ergonomics, stress, decision making, pre-hospital care, paramedic, education

Procedia PDF Downloads 150
850 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for The Extraction and Recovery of Metal from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ore over time and an increase in mineral complexity and matrix heterogeneity, in addition to rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that can economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas-phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced using an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas-phase extraction has been conducted for the recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all promises of a 'greener' process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 137
849 De Novo Assembly and Characterization of the Transcriptome from the Fluoroacetate Producing Plant, Dichapetalum Cymosum

Authors: Selisha A. Sooklal, Phelelani Mpangase, Shaun Aron, Karl Rumbold

Abstract:

Organically bound fluorine (the C-F bond) is extremely rare in nature. Despite this, the first fluorinated secondary metabolite, fluoroacetate, was isolated from the plant Dichapetalum cymosum (commonly known as Gifblaar). However, the enzyme responsible for fluorination (fluorinase) in Gifblaar was never isolated, and very little progress has been made in understanding this process in higher plants. Fluorinated compounds have vast applications in the pharmaceutical, agrochemical, and fine chemicals industries. Consequently, an enzyme capable of catalysing C-F bond formation has great potential as an industrial biocatalyst, considering that the field of fluorination is virtually entirely synthetic. As with any biocatalyst, a range of these enzymes is required; it is therefore imperative to expand the search for novel fluorinases. This study aimed to gain molecular insights into secondary metabolite biosynthesis in Gifblaar using a high-throughput sequencing-based approach. Mechanical wounding studies were performed on Gifblaar leaf tissue in order to induce expression of the fluorinase. The transcriptomes of the wounded and unwounded plants were then sequenced on the Illumina HiSeq platform. A total of 26.4 million short sequence reads were assembled into 77,845 transcripts using Trinity. Overall, 68.6% of transcripts were annotated with gene identities using public databases (SwissProt, TrEMBL, GO, COG, Pfam, EC) with an E-value threshold of 1E-05. Sequences exhibited the greatest homology to the model plant Arabidopsis thaliana (27%). A total of 244 annotated transcripts were found to be differentially expressed between the wounded and unwounded plants. In addition, the secondary metabolic pathways present in Gifblaar were successfully reconstructed using Pathway Tools. Due to the lack of genetic information on plant fluorinases, no transcript could be annotated as a fluorinating enzyme. Thus, a local database containing the 5 existing bacterial fluorinases was created.
Fifteen transcripts with homology to partial regions of existing fluorinases were found. In an effort to obtain the full coding sequence of the Gifblaar fluorinase, primers were designed targeting the regions of homology, and genome walking will be performed to amplify the unknown regions. This is the first genetic data available for Gifblaar. It provides novel insights into the mechanisms of metabolite biosynthesis and will allow for the discovery of the first eukaryotic fluorinase.
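The E-value annotation filter described above is, in essence, a thresholding step over homology-search hits. The toy sketch below illustrates it; the records are invented, and a real pipeline would read BLAST tabular output rather than hard-coded dictionaries.

```python
# Toy annotation filter: keep only homology hits whose E-value passes
# the 1e-5 threshold used in the study. Records are invented examples.
E_VALUE_CUTOFF = 1e-5

hits = [
    {"transcript": "TR1", "subject": "Q9XYZ1", "evalue": 3.2e-40},
    {"transcript": "TR2", "subject": "P12345", "evalue": 8.0e-6},
    {"transcript": "TR3", "subject": "O00000", "evalue": 2.0e-3},
]

annotated = [h for h in hits if h["evalue"] <= E_VALUE_CUTOFF]
print([h["transcript"] for h in annotated])  # TR3 fails the cutoff
```

The same thresholding, run against the custom database of the 5 bacterial fluorinases, is what yielded the fifteen partial-homology transcripts.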

Keywords: biocatalyst, fluorinase, gifblaar, transcriptome

Procedia PDF Downloads 277
848 A Key Parameter in Ocean Thermal Energy Conversion Plant Design and Operation

Authors: Yongjian Gu

Abstract:

Ocean thermal energy is one of the ocean energy sources; it is renewable, sustainable, and green. Ocean thermal energy conversion (OTEC) uses the ocean temperature gradient between the warmer surface seawater and the cooler deep seawater to run a heat engine and produce a useful power output. Unfortunately, the ocean temperature gradient is not large. Even in tropical and equatorial regions, the surface water temperature only reaches about 28 °C, while the deep water temperature can be as low as 4 °C. The thermal efficiency of OTEC plants is therefore low. In order to improve plant thermal efficiency within this limited ocean temperature gradient, some OTEC plants add equipment for better heat recovery, such as heat exchangers, pumps, etc. Obviously, this approach increases the plant's complexity and cost. More importantly, the additional equipment consumes power too, which may have an adverse effect on the plant's net power output and, in turn, on its thermal efficiency. In this paper, the author first describes various OTEC plants and the practice of adding equipment to improve plant thermal efficiency. The author then proposes a parameter, the plant back work ratio ϕ, for assessing whether the added equipment is appropriate for the intended thermal efficiency improvement. Finally, the author presents examples to illustrate the application of the back work ratio ϕ as a key parameter in OTEC plant design and operation.
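The scale of the problem can be sketched numerically. With the 28 °C / 4 °C temperatures from the abstract, the Carnot limit already caps efficiency near 8%, and the back work ratio (here taken as auxiliary power over gross power, an assumed reading of the paper's ϕ) directly erodes net output.

```python
# Sketch: Carnot ceiling for OTEC temperatures, and net power as a
# function of an assumed back work ratio phi = auxiliary / gross power.
def carnot_efficiency(t_warm_c, t_cold_c):
    t_warm, t_cold = t_warm_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_warm

def net_power(gross_power_kw, phi):
    """Net output after auxiliary (pump/equipment) consumption."""
    return gross_power_kw * (1.0 - phi)

eta_max = carnot_efficiency(28.0, 4.0)       # theoretical ceiling, ~8%
print(f"Carnot limit: {eta_max:.3f}")
print(f"Net power at phi = 0.3: {net_power(1000.0, 0.3):.0f} kW")
```

Because the Carnot margin is so thin, even modest auxiliary loads can cancel the gains from extra heat-recovery equipment, which is exactly the trade-off the proposed ϕ parameter is meant to expose.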

Keywords: ocean thermal energy, ocean thermal energy conversion (OTEC), OTEC plant, plant back work ratio ϕ

Procedia PDF Downloads 198
847 Secure and Privacy-Enhanced Blockchain-Based Authentication System for University User Management

Authors: Ali El Ksimi

Abstract:

In today's digital academic environment, secure authentication methods are essential for managing sensitive user data, including that of students and faculty. The rise in cyber threats and data breaches has exposed the vulnerabilities of traditional authentication systems used in universities. Passwords, often the first line of defense, are particularly susceptible to hacking, phishing, and brute-force attacks. While multi-factor authentication (MFA) provides an additional layer of security, it can still be compromised and often adds complexity and inconvenience for users. As universities seek more robust security measures, blockchain technology emerges as a promising solution. Renowned for its decentralization, immutability, and transparency, blockchain has the potential to transform how user management is conducted in academic institutions. In this article, we explore a system that leverages blockchain technology specifically for managing user accounts within a university setting. The system enables the secure creation and management of accounts for different roles, such as administrators, teachers, and students. Each user is authenticated through a decentralized application (DApp) that ensures their data is securely stored and managed on the blockchain. By eliminating single points of failure and utilizing cryptographic techniques, the system enhances the security and integrity of user management processes. We will delve into the technical architecture, security benefits, and implementation considerations of this approach. By integrating blockchain into user management, we aim to address the limitations of traditional systems and pave the way for the future of digital security in education.
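The tamper-evidence property the system relies on can be illustrated with a minimal hash chain using only the standard library. This is a sketch of the principle, not the article's DApp or its on-chain implementation.

```python
# Minimal hash-chain sketch: records chained by hashes are tamper-evident.
import hashlib
import json

def make_block(prev_hash, record):
    body = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain):
    for i, blk in enumerate(chain):
        body = json.dumps({"prev": blk["prev"], "record": blk["record"]},
                          sort_keys=True)
        if blk["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False           # block contents were altered
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False           # chain linkage was broken
    return True

chain = [make_block("0" * 64, {"user": "student42", "role": "student"})]
chain.append(make_block(chain[-1]["hash"], {"user": "prof7", "role": "teacher"}))
print(verify_chain(chain))                 # True
chain[0]["record"]["role"] = "admin"       # tampering with a stored role...
print(verify_chain(chain))                 # ...breaks verification: False
```

In the proposed university system, this immutability, combined with decentralized storage, is what removes the single point of failure that password databases represent.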

Keywords: blockchain, university, authentication, decentralization, cybersecurity, user management, privacy

Procedia PDF Downloads 29
846 Postmortem Genetic Testing to Sudden and Unexpected Deaths Using the Next Generation Sequencing

Authors: Eriko Ochiai, Fumiko Satoh, Keiko Miyashita, Yu Kakimoto, Motoki Osawa

Abstract:

Sudden and unexpected deaths from unknown causes occur in infants and young people. Recently, molecular links between some of these deaths and several genetic diseases have been examined postmortem. For instance, hereditary long QT syndrome and Brugada syndrome are occasionally fatal through critical ventricular tachyarrhythmia. Because there are a large number of target genes responsible for such diseases, conventional analysis using the Sanger method has been laborious. In this report, we attempted to analyze sudden deaths comprehensively using next generation sequencing (NGS). Multiplex PCR of each subject's DNA was performed using the Ion AmpliSeq Library Kit 2.0 and the Ion AmpliSeq Inherited Disease Panel (Life Technologies). After the library was constructed by emulsion PCR, the amplicons were sequenced for 500 flows on the Ion Personal Genome Machine system (Life Technologies) according to the manufacturer's instructions. SNPs and indels were analyzed from the sequence reads mapped onto the hg19 reference sequence. This project was approved by the ethics committee of Tokai University School of Medicine. As a representative case, molecular analysis of a 40-year-old male who had received a diagnosis of Brugada syndrome revealed a total of 584 SNPs or indels. Non-synonymous and frameshift nucleotide substitutions were selected in the coding regions of the heart-disease-related genes ANK2, AKAP9, CACNA1C, DSC2, KCNQ1, MYLK, SCN1B, and STARD3. In particular, a c.629T>C transition in exon 3 of the SCN1B gene, resulting in a leucine-to-proline substitution at codon 210 (L210P), is predicted to be 'damaging' by the SIFT program. Because this mutation has not been reported, it remains unclear whether the substitution is pathogenic. Sudden death in which the cause of death cannot be determined constitutes one of the most important unsolved subjects in forensic pathology. The Ion AmpliSeq Inherited Disease Panel can amplify the exons of 328 genes at one time.
We recognized the difficulty of selecting the true causal variant from a number of candidates, but postmortem genetic testing using NGS analysis remains a valuable diagnostic approach. We are now extending this analysis to suspected SIDS cases and young sudden death victims.
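The variant-selection step described above, keeping non-synonymous or frameshift changes in a gene panel, is a simple filter over called variants. The sketch below uses invented records; a real pipeline would consume VCF output from the Ion Torrent variant caller.

```python
# Toy post-NGS filter: keep damaging variant classes that fall in the
# heart-disease-related gene set named in the abstract. Records invented.
PANEL = {"ANK2", "AKAP9", "CACNA1C", "DSC2", "KCNQ1", "MYLK", "SCN1B", "STARD3"}
DAMAGING = {"non-synonymous", "frameshift"}

variants = [
    {"gene": "SCN1B", "effect": "non-synonymous", "change": "L210P"},
    {"gene": "SCN1B", "effect": "synonymous",     "change": "T100T"},
    {"gene": "BRCA1", "effect": "non-synonymous", "change": "C61G"},
]

candidates = [v for v in variants
              if v["gene"] in PANEL and v["effect"] in DAMAGING]
print([(v["gene"], v["change"]) for v in candidates])  # [('SCN1B', 'L210P')]
```

Filtering of this kind narrows 584 raw SNPs/indels down to a short candidate list, after which pathogenicity predictors such as SIFT are applied.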

Keywords: postmortem genetic testing, sudden death, SIDS, next generation sequencing

Procedia PDF Downloads 360
845 Numerical Simulation of Free Surface Water Wave for the Flow Around NACA 0012 Hydrofoil and Wigley Hull Using VOF Method

Authors: Omar Imine, Mohammed Aounallah, Mustapha Belkadi

Abstract:

Steady three-dimensional and two-dimensional free surface waves generated by moving bodies are presented. The flow problem to be simulated is rich in complexity and poses many modeling challenges because of the existence of breaking waves around the ship hull and because of the interaction of the two-phase flow with the turbulent boundary layer. The results of several simulations are reported. The first study was performed for the NACA0012 hydrofoil with different meshes; this section is analyzed at h/c = 1.0345 in 2D. In the second simulation, a mathematically defined Wigley hull form is used to investigate the application of a commercial CFD code to the prediction of the total resistance and its components from the tangential and normal forces on the hull's wetted surface. The computed resistance and wave profiles are used to estimate the total resistance coefficient for the Wigley hull advancing in calm water under steady conditions. The commercial CFD software FLUENT version 12 is used for the computations in the present study, and the computational grid is generated using GAMBIT 2.3.26. The k-ω SST shear stress transport model is used for turbulence modeling, and the volume of fluid (VOF) technique is employed to simulate the free-surface motion. The second-order upwind scheme is used to discretize the convection terms in the momentum transport equations, and the modified HRIC scheme is used for the VOF discretization. The results obtained compare well with the experimental data.

Keywords: free surface flows, breaking waves, boundary layer, Wigley hull, volume of fluid

Procedia PDF Downloads 377
844 Design of a CO₂-Reduced 3D Concrete Mixture Using Circular (Clay-Based) Building Materials

Authors: N. Z. van Hierden, Q. Yu, F. Gauvin

Abstract:

Cement manufacturing is, because of its production process, among the highest contributors to CO₂ emissions worldwide. As cement is one of the major components of 3D printed concrete, achieving sustainability and carbon neutrality can be particularly challenging. To improve the sustainability of 3D printed materials, different CO₂-reducing strategies can be used, each with a distinct level of impact and complexity. In this work, we focus on developing such sustainable mixtures and finding alternatives. Promising alternatives for cement and clinker replacement include recycled building materials, among them (calcined) bricks and roof tiles. To study the potential of recycled clay-based building materials, the application of calcined clay itself is studied as well. Compared to cement, the calcination temperature of clay-based materials is significantly lower, resulting in reduced CO₂ output. Reusing these materials is therefore a promising way to utilize waste streams while simultaneously reducing the cement content of 3D concrete mixtures. In addition, waste streams can be sourced locally, thereby reducing the CO₂ emitted during transportation. In this research, various alternative binders are examined, such as calcined clay blends (LC3) from recycled tiles and bricks, or locally obtained clay resources. In a range of experiments, mix designs including these resources showed high potential with respect to material strength while sustaining decent printability and buildability. The defined strategies are therefore promising and can lead to a more sustainable, low-CO₂ mixture suitable for 3D printing while using accessible materials.

Keywords: cement replacement, 3DPC, circular building materials, calcined clay, CO₂ reduction

Procedia PDF Downloads 86