Search results for: electronic measurement systems
4472 Efficient Feature Fusion for Noise Iris in Unconstrained Environment
Authors: Yao-Hong Tsai
Abstract:
This paper presents an efficient fusion algorithm for iris images that generates stable features for recognition in unconstrained environments. Recent iris recognition systems target real scenarios in daily life, without the subject's cooperation. Under large variation in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image that is more stable for subsequent iris recognition than each original noisy iris image. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. Iris detection is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied for texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of an iris recognition verification system.
Keywords: image fusion, iris recognition, local binary pattern, wavelet
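As a rough illustration of the texture-description step, the sketch below computes a basic 8-neighbour LBP histogram for a grayscale patch; the radius, neighbour ordering, bin count, and normalization are assumptions for illustration, not the authors' exact configuration.

```python
import numpy as np

def lbp_histogram(patch, bins=256):
    """Basic 8-neighbour local binary pattern histogram (radius 1).

    `patch` is a 2D array of grayscale values; the neighbour ordering
    and bin count are illustrative assumptions.
    """
    h, w = patch.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    center = patch[1:-1, 1:-1]
    # Offsets of the 8 neighbours, clockwise from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = patch[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= ((neighbour >= center).astype(np.uint8) << bit)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()  # normalized histogram used as the texture feature

# Example: describe a random 64x64 patch (stand-in for a detected iris region).
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, (64, 64))
feature = lbp_histogram(patch)
print(feature.shape)  # (256,)
```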
Procedia PDF Downloads 367
4471 3D Simulation for Design and Predicting Performance of a Thermal Heat Storage Facility Using Sand
Authors: Nadjiba Mahfoudi, Abdelhafid Moummi, Mohammed El Ganaoui
Abstract:
Thermal applications are drawing increasing attention in the solar energy research field due to their high energy storage density and energy conversion efficiency. In these applications, solar collectors and thermal energy storage systems are the two core components. This paper presents a thermal analysis of the transient behavior and storage capability of a sensible heat storage device in which sand is used as the storage medium. The TES unit with embedded charging tubes is connected to a solar air collector. To investigate its storage characteristics, a 3D model based on nonlinear coupled partial differential equations for the temperatures of both the storage medium and the heat transfer fluid (HTF) has been developed. The performance of a thermal storage bed with a capacity of 17 MJ (including bed temperature, charging time, energy storage rate, and charging energy efficiency) has been evaluated. The effect of the number of charging tubes (three configurations) is presented.
Keywords: design, thermal modeling, heat transfer enhancement, sand, sensible heat storage
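For readers unfamiliar with this class of model, the sketch below integrates a heavily simplified 1D two-temperature (air/sand) energy balance with an explicit upwind scheme; the geometry, material properties, heat transfer coefficient, and the omission of the charging-tube layout are all assumptions made purely for illustration and do not reproduce the 17 MJ bed studied here.

```python
import numpy as np

# Illustrative (assumed) properties for air as HTF and sand as storage medium.
rho_f, cp_f = 1.1, 1007.0      # air density [kg/m^3], specific heat [J/kg.K]
rho_s, cp_s = 1600.0, 830.0    # sand bulk density [kg/m^3], specific heat [J/kg.K]
eps = 0.4                      # bed porosity [-]
u = 0.5                        # interstitial air velocity [m/s]
h_v = 2000.0                   # volumetric heat transfer coefficient [W/m^3.K]
L, n = 1.0, 100                # bed length [m], grid points
dx = L / n
dt = 0.01                      # time step [s], chosen so that u*dt/dx < 1

T_f = np.full(n, 20.0)         # fluid temperature [C]
T_s = np.full(n, 20.0)         # sand temperature [C]
T_in = 80.0                    # charging inlet temperature [C]

for step in range(int(600 / dt)):            # ten minutes of charging
    # Upwind advection of the fluid plus fluid-solid heat exchange.
    dTf = np.empty_like(T_f)
    dTf[0] = -u * (T_f[0] - T_in) / dx
    dTf[1:] = -u * (T_f[1:] - T_f[:-1]) / dx
    dTf += h_v * (T_s - T_f) / (eps * rho_f * cp_f)
    # Sand heats up (or cools) through the same exchange term.
    dTs = h_v * (T_f - T_s) / ((1 - eps) * rho_s * cp_s)
    T_f += dt * dTf
    T_s += dt * dTs

stored = np.sum((1 - eps) * rho_s * cp_s * (T_s - 20.0) * dx)  # J per m^2 of bed cross-section
print(f"mean sand temperature: {T_s.mean():.1f} C, stored energy: {stored / 1e6:.2f} MJ/m^2")
```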
Procedia PDF Downloads 561
4470 Improving University Operations with Data Mining: Predicting Student Performance
Authors: Mladen Dragičević, Mirjana Pejić Bach, Vanja Šimičević
Abstract:
The purpose of this paper is to develop models for predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. Data were collected through an anonymous survey of final-year undergraduate students selected by random sampling. Decision trees were created, and the two that were most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have proven to be a good method for classifying student success, and they could be further improved by increasing the survey sample and developing specialized decision trees for each type of college. Such methods have great potential for use in decision support systems.
Keywords: data mining, knowledge discovery in databases, prediction models, student success
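A minimal sketch of the modeling step using scikit-learn is shown below; the feature names, thresholds, and synthetic data are invented placeholders, since the original survey variables are not listed in the abstract.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500  # synthetic stand-in for the survey sample

# Hypothetical survey features: entrance exam score, weekly study hours,
# employment during studies (0/1), and high-school GPA.
X = np.column_stack([
    rng.normal(60, 15, n),
    rng.normal(20, 8, n),
    rng.integers(0, 2, n),
    rng.normal(3.5, 0.6, n),
])
# Hypothetical target: 1 = finished on time with a good GPA, 0 = otherwise.
y = (0.03 * X[:, 0] + 0.05 * X[:, 1] - 0.5 * X[:, 2] + 0.8 * X[:, 3]
     + rng.normal(0, 1, n) > 5.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0)  # shallow tree to stay interpretable
tree.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```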
Procedia PDF Downloads 407
4469 Participatory Monitoring Strategy to Address Stakeholder Engagement Impact in Co-creation of NBS Related Project: The OPERANDUM Case
Authors: Teresa Carlone, Matteo Mannocchi
Abstract:
In the last decade, a growing number of international organizations have pushed toward green solutions for adaptation to climate change. This is particularly true in the field of Disaster Risk Reduction (DRR) and land planning, where Nature-Based Solutions (NBS) have been sponsored through funding programs and planning tools. Stakeholder engagement and co-creation of NBS are growing as a practice and research field in environmental projects, fostering the consolidation of a multidisciplinary socio-ecological approach to addressing hydro-meteorological risk. Even though research and financial interest are spreading steadily, the NBS mainstreaming process is still at an early stage, as innovative concepts and practices make it difficult for NBS to be fully accepted and adopted by a multitude of different actors so as to produce wide-scale societal change. The monitoring and impact evaluation of stakeholders' participation in these processes represent a crucial aspect and should be seen as a continuous and integral element of the co-creation approach. However, setting up a fit-for-purpose monitoring strategy for different contexts is not an easy task, and multiple challenges emerge. In this scenario, the Horizon 2020 OPERANDUM project, designed to address the major hydro-meteorological risks that negatively affect European rural and natural territories through the co-design, co-deployment, and assessment of Nature-Based Solutions, represents a valid case study from which to test a monitoring strategy and derive a broader, general, and scalable monitoring framework. Applying a participatory monitoring methodology based on a selected list of indicators that combines quantitative and qualitative data developed within the activities of the project, the paper proposes an experimental in-depth analysis of the impact of stakeholder engagement in the co-creation process of NBS. The main focus will be to identify and analyze which factors increase knowledge, social acceptance, and mainstreaming of NBS, and to propose an experience-based guideline that could be integrated with the stakeholder engagement strategy in current and future strongly collaborative environmental projects such as OPERANDUM. Measurement will be carried out through surveys submitted at different points in time to the same sample of stakeholders (policy makers, businesses, researchers, interest groups). Changes will be recorded and analyzed through focus groups in order to highlight causal explanations and to assess the proposed list of indicators, so as to steer the conduct of similar activities in other projects and/or contexts. The idea of the paper is to contribute to the construction of a more structured and shared corpus of indicators that can support the evaluation of the activities of involvement and participation of various levels of stakeholders in the co-production, planning, and implementation of NBS to address climate change challenges.
Keywords: co-creation and collaborative planning, monitoring, nature-based solution, participation & inclusion, stakeholder engagement
Procedia PDF Downloads 113
4468 Requirements for a Shared Management of State-Owned Building in the Archaeological Park of Pompeii
Authors: Maria Giovanna Pacifico
Abstract:
Maintenance, in Italy, is not yet a consolidated practice despite the benefits that could come from it. Among the main reasons are the lack of financial resources and personnel in the public administration and a general lack of knowledge about how to set up and manage preventive and planned maintenance. The experimentation suggests that users and tourists could be involved in the maintenance process, from the knowledge phase to the monitoring phase, by using mobile devices. The goal is to increase the quality of facility management for cultural heritage, prioritizing usage needs and limiting interference between the key stakeholders. The method simplifies the consolidated procedures for information systems, avoiding a loss in terms of quality and amount of information, by focusing on the users' requirements: management economy, user safety, accessibility, and the reception of feedback information to define a framework that will lead to predictive maintenance. This proposal was designed to be tested on the state property assets of the Archaeological Park of Pompeii.
Keywords: asset maintenance, key stakeholders, Pompeii, user requirement
Procedia PDF Downloads 125
4467 The Establishment of Primary Care Networks (England, UK) Throughout the COVID-19 Pandemic: A Qualitative Exploration of Workforce Perceptions
Authors: Jessica Raven Gates, Gemma Wilson-Menzfeld, Professor Alison Steven
Abstract:
In 2019, the primary care system in the UK National Health Service (NHS) was subject to reform and restructuring. Primary Care Networks (PCNs) were established, which aligned with a trend towards integrated care both within the NHS and internationally. The introduction of PCNs brought groups of GP practices in a locality together to operate as a network, build on existing services, and collaborate at a larger scale. PCNs were expected to bring a range of benefits to patients and to address some of the workforce pressures in the NHS through an expanded and collaborative workforce. The early establishment of PCNs was disrupted by the emerging COVID-19 pandemic. This study, set in the context of the pandemic, aimed to explore experiences of the PCN workforce and their perceptions of the establishment of PCNs. Specific objectives focussed on examining factors perceived as enabling or hindering the success of a PCN, the impact on day-to-day work, the approach to implementing change, and the influence of the COVID-19 pandemic upon PCN development. This study is part of a three-phase PhD project that utilized qualitative approaches and was underpinned by social constructionist philosophy. Phase 1: a systematic narrative review explored the provision of preventative healthcare services in UK primary care settings and examined facilitators and barriers to delivery as experienced by the workforce. Phase 2: informed by the findings of phase 1, semi-structured interviews were conducted with fifteen participants (PCN workforce). Phase 3: follow-up interviews were conducted with the original participants to examine any changes to their experiences and perceptions of PCNs. Three main themes span phases 2 and 3 and were generated through a framework analysis approach: 1) working together at scale, 2) network infrastructure, and 3) PCN leadership. Findings suggest that through efforts to work together at scale and collaborate as a network, participants have broadly accepted the concept of PCNs. However, the workforce has been hampered by system design and system complexity. Operating against such barriers has led to a negative psychological impact on some PCN leaders and others in the PCN workforce. While the pandemic undeniably increased pressure on healthcare systems around the world, it also acted as a disruptor, offering a glimpse into how collaboration in primary care can work well. Through the integration of findings from all phases, a new theoretical model has been developed, which conceptualises the findings from this PhD project and demonstrates how the workforce has experienced change associated with the establishment of PCNs. The model includes a contextual component of the COVID-19 pandemic and has been informed by concepts from complex adaptive systems theory. This model is the original contribution to knowledge of the PhD project, alongside recommendations for practice, policy, and future research. This study is significant in the realm of health services research, and while the setting for this study is the UK NHS, the findings will be of interest to an international audience, as the research provides insight into how the healthcare workforce may experience imposed policy and service changes.
Keywords: health services research, qualitative research, NHS workforce, primary care
Procedia PDF Downloads 58
4466 Frequency Decomposition Approach for Sub-Band Common Spatial Pattern Methods for Motor Imagery Based Brain-Computer Interface
Authors: Vitor M. Vilas Boas, Cleison D. Silva, Gustavo S. Mafra, Alexandre Trofino Neto
Abstract:
Motor imagery (MI) based brain-computer interfaces (BCI) use event-related (de)synchronization (ERS/ERD), typically recorded using electroencephalography (EEG), to translate brain electrical activity into control commands. To mitigate undesirable artifacts and measurement noise in EEG signals, methods based on band-pass filters defined by a specific frequency band (e.g., 8-30 Hz), such as Infinite Impulse Response (IIR) filters, are typically used. Spatial techniques, such as Common Spatial Patterns (CSP), are also used to estimate the variance of the filtered signal and extract features that characterize the imagined movement. The effectiveness of CSP depends on the subject's discriminative frequency, and approaches based on decomposing the band of interest into sub-bands with smaller frequency ranges (SBCSP) have been suggested for EEG signal classification. However, despite providing good results, the SBCSP approach generally increases the computational cost of the filtering step in MI-based BCI systems. This paper proposes the use of the Fast Fourier Transform (FFT) algorithm in the filtering stage of MI-based BCIs that implement SBCSP. The goal is to apply the FFT algorithm to reduce the computational cost of the processing step of these systems and to make them more efficient without compromising classification accuracy. The proposal is based on representing the EEG signals as a matrix of coefficients resulting from the frequency decomposition performed by the FFT, which is then submitted to the SBCSP process. The SBCSP structure divides the band of interest, initially defined between 0 and 40 Hz, into a set of 33 sub-bands spanning specific frequency ranges, which are processed in parallel, each by a CSP filter and an LDA classifier. A Bayesian meta-classifier is then used to represent the LDA outputs of each sub-band as scores and organize them into a single vector, which is then used as the training vector of a global SVM classifier. Initially, the public EEG data set IIa of BCI Competition IV is used to validate the approach. The first contribution of the proposed method is that, in addition to being more compact, since it has a 68% smaller dimension than the original signal, the resulting FFT matrix maintains the signal information relevant to class discrimination. In addition, the results showed an average reduction of 31.6% in computational cost relative to filtering methods based on IIR filters, suggesting the efficiency of the FFT when applied in the filtering step. Finally, the frequency decomposition approach significantly improves the overall system classification rate compared to the commonly used filtering, going from 73.7% using IIR to 84.2% using FFT. The accuracy improvement above 10% and the computational cost reduction denote the potential of the FFT in EEG signal filtering applied to the context of MI-based BCIs implementing SBCSP. Tests with other data sets are currently being performed to reinforce these conclusions.
Keywords: brain-computer interfaces, fast Fourier transform algorithm, motor imagery, sub-band common spatial patterns
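To make the pipeline more concrete, the sketch below shows one way to obtain FFT sub-band features and a CSP projection for a single sub-band on synthetic epochs; the epoch length, sampling rate, number of sub-bands, the reconstruction of band-limited signals via the inverse FFT, and the plain two-class CSP formulation are simplifying assumptions rather than the paper's exact pipeline.

```python
import numpy as np
from scipy.linalg import eigh

fs = 250                                  # assumed sampling rate [Hz]
n_trials, n_ch, n_samp = 20, 22, 2 * fs   # synthetic MI epochs (2 s each)
rng = np.random.default_rng(1)
X = rng.standard_normal((n_trials, n_ch, n_samp))
y = rng.integers(0, 2, n_trials)          # two motor-imagery classes

# Frequency decomposition: keep FFT coefficients between 0 and 40 Hz
# and split them into sub-bands (33 in the paper; 8 illustrative bands here).
coeffs = np.fft.rfft(X, axis=-1)
freqs = np.fft.rfftfreq(n_samp, d=1.0 / fs)
band_edges = np.linspace(0, 40, 9)

def csp_filters(trials, labels, n_filters=4):
    """Two-class CSP via the generalized eigenvalue problem (eigenvalues ascend)."""
    covs = []
    for cls in (0, 1):
        c = np.mean([t @ t.T / np.trace(t @ t.T) for t in trials[labels == cls]], axis=0)
        covs.append(c)
    _, vecs = eigh(covs[0], covs[0] + covs[1])
    half = n_filters // 2
    picks = np.r_[np.arange(half), np.arange(vecs.shape[1] - half, vecs.shape[1])]
    return vecs[:, picks].T               # spatial filters, shape (n_filters, n_ch)

# Features for the first sub-band: zero out coefficients outside the band,
# go back to the time domain, apply CSP, and use log-variance as the feature.
lo, hi = band_edges[0], band_edges[1]
mask = (freqs >= lo) & (freqs < hi)
band_signal = np.fft.irfft(np.where(mask, coeffs, 0), n=n_samp, axis=-1)
W = csp_filters(band_signal, y)
features = np.log(np.var(np.einsum('fc,tcs->tfs', W, band_signal), axis=-1))
print(features.shape)                     # (n_trials, n_filters), one block per sub-band
```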
Procedia PDF Downloads 128
4465 The Nature and Impact of Trojan Horses in Cybersecurity
Authors: Mehrab Faraghti
Abstract:
Trojan horses, a form of malware masquerading as legitimate software, pose significant cybersecurity threats. These malicious programs exploit user trust, infiltrate systems, and can lead to data breaches, financial loss, and compromised privacy. This paper explores the mechanisms through which Trojan horses operate, including delivery methods such as phishing and software vulnerabilities. It categorizes various types of Trojan horses and their specific impacts on individuals and organizations. Additionally, the research highlights the evolution of Trojan threats and the importance of user awareness and proactive security measures. By analyzing case studies of notable Trojan attacks, this study identifies common vulnerabilities that can be exploited and offers insights into effective countermeasures, including behavioral analysis, anomaly detection, and robust incident response strategies. The findings emphasize the need for comprehensive cybersecurity education and the implementation of advanced security protocols to mitigate the risks associated with Trojan horses.
Keywords: Trojan horses, cybersecurity, malware, data breach
Procedia PDF Downloads 9
4464 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform
Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo
Abstract:
The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.
Keywords: energy-efficient, fog computing, IoT, telehealth
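As a toy illustration of the adaptive fog-versus-cloud placement idea, the sketch below picks the lower-energy processing tier for each telehealth reading based on assumed per-byte transmission and per-operation computation costs; all numbers and the decision rule are invented for illustration and are not taken from the simulation in the paper.

```python
# Assumed energy parameters (illustrative only).
E_TX_FOG = 2e-7      # J per byte to reach the local fog node
E_TX_CLOUD = 9e-7    # J per byte to reach the remote cloud
E_CPU_FOG = 4e-9     # J per operation on the fog node
E_CPU_CLOUD = 1e-9   # J per operation in the cloud (more efficient hardware)

def place_task(payload_bytes: int, operations: int) -> str:
    """Choose the tier that minimizes the estimated energy for one reading."""
    fog = payload_bytes * E_TX_FOG + operations * E_CPU_FOG
    cloud = payload_bytes * E_TX_CLOUD + operations * E_CPU_CLOUD
    return "fog" if fog <= cloud else "cloud"

# A small vital-sign packet with light filtering stays at the fog layer;
# a large ECG batch needing heavy analysis is offloaded to the cloud.
print(place_task(payload_bytes=2_000, operations=50_000))        # -> fog
print(place_task(payload_bytes=500_000, operations=200_000_000)) # -> cloud
```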
Procedia PDF Downloads 86
4463 Analytical Solution for Thermo-Hydro-Mechanical Analysis of Unsaturated Porous Media Using AG Method
Authors: Davood Yazdani Cherati, Hussein Hashemi Senejani
Abstract:
In this paper, a convenient analytical solution is developed for the system of coupled differential equations derived from the thermo-hydro-mechanical analysis of three-phase porous media such as unsaturated soils. This kind of analysis can be used in various fields, such as geothermal energy systems and the seepage of leachate from buried municipal and domestic waste in geomaterials. Initially, a system of coupled differential equations, including the energy, mass, and momentum conservation equations, is considered, and an analytical method called AGM is employed to solve the problem. The method is straightforward and comprehensible and can be used to solve various nonlinear partial differential equations (PDEs). Results indicate the accuracy of the applied method for solving nonlinear partial differential equations.
Keywords: AGM, analytical solution, porous media, thermo-hydro-mechanical, unsaturated soils
Procedia PDF Downloads 229
4462 A Study of the Performance Parameter for Recommendation Algorithm Evaluation
Authors: C. Rana, S. K. Jain
Abstract:
The enormous amount of Web data has made its efficient use challenging in the past few years. A range of techniques has been applied to tackle this problem; prominent among them are personalization and recommender systems. In fact, these are the tools that assist users in finding relevant information on the web. Most e-commerce websites apply such tools in one way or another. In the past decade, a large number of recommendation algorithms have been proposed to tackle such problems. However, there has not been much research on the evaluation criteria for these algorithms. As such, the traditional accuracy and classification metrics are still used for evaluation, which provides only a static view. This paper studies how the evolution of user preference over a period of time can be captured in a recommender system using a new evaluation methodology that explicitly uses the time dimension. We also present the different types of experimental setups that are generally used for recommender system evaluation. Furthermore, an overview of the major accuracy metrics, and of metrics that go beyond the scope of accuracy as researched in the past few years, is also discussed in detail.
Keywords: collaborative filtering, data mining, evolutionary, clustering, algorithm, recommender systems
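A minimal sketch of the time-aware evaluation idea follows: interactions are split chronologically rather than at random, and accuracy is reported per time window so that drift in user preference becomes visible. The toy interaction log, the popularity-based recommender, and the hit-rate metric are placeholders, not the algorithms or metrics surveyed in the paper.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

# Toy interaction log: (user, item, timestamp); preferences drift over time.
n_events = 3000
t = np.sort(rng.uniform(0, 100, n_events))
users = rng.integers(0, 50, n_events)
items = np.where(t < 50,
                 rng.integers(0, 20, n_events),    # early catalogue
                 rng.integers(10, 30, n_events))   # later tastes shift

def top_n_from(history_items, n=5):
    """Recommend the n most popular items seen so far (placeholder model)."""
    return [item for item, _ in Counter(history_items).most_common(n)]

# Evaluate a hit rate per time window, always training only on the past.
windows = [(0, 25), (25, 50), (50, 75), (75, 100)]
for lo, hi in windows:
    train = items[t < lo] if lo > 0 else items[t < 1]   # bootstrap on a sliver of data
    test = items[(t >= lo) & (t < hi)]
    recs = set(top_n_from(train))
    hits = np.mean([it in recs for it in test]) if len(test) else float("nan")
    print(f"window {lo:>3}-{hi:<3}: top-5 hit rate = {hits:.2f}")
```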
Procedia PDF Downloads 413
4461 Metal Contamination in an E-Waste Recycling Community in Northeastern Thailand
Authors: Aubrey Langeland, Richard Neitzel, Kowit Nambunmee
Abstract:
Electronic waste, 'e-waste', refers generally to discarded electronics and electrical equipment, including products from cell phones and laptops to wires, batteries and appliances. While e-waste represents a transformative source of income in low- and middle-income countries, informal e-waste workers use rudimentary methods to recover materials, simultaneously releasing harmful chemicals into the environment and creating a health hazard for themselves and surrounding communities. Valuable materials such as precious metals, copper, aluminum, ferrous metals, plastic and components are recycled from e-waste. However, persistent organic pollutants such as polychlorinated biphenyls (PCBs) and some polybrominated diphenyl ethers (PBDEs), and heavy metals are toxicants contained within e-waste and are of great concern to human and environmental health. The current study seeks to evaluate the environmental contamination resulting from informal e-waste recycling in a predominantly agricultural community in northeastern Thailand. To accomplish this objective, five types of environmental samples were collected and analyzed for concentrations of eight metals commonly associated with e-waste recycling during the period of July 2016 through July 2017. Rice samples from the community were collected after harvest and analyzed using inductively coupled plasma mass spectrometry (ICP-MS) and graphite furnace atomic absorption spectroscopy (GF-AAS). Soil samples were collected and analyzed using methods similar to those used for the rice samples. Surface water samples were collected and analyzed using absorption colorimetry for three heavy metals. Environmental air samples were collected using a sampling pump and matched-weight PVC filters, then analyzed using inductively coupled argon plasma atomic emission spectroscopy (ICAP-AES). Finally, surface wipe samples were collected from surfaces in homes where e-waste recycling activities occur and were analyzed using ICAP-AES. Preliminary results (note 1) indicate that some rice samples have concentrations of lead and cadmium significantly higher than limits set by the United States Department of Agriculture (USDA) and the World Health Organization (WHO). Similarly, some soil samples show levels of copper, lead and cadmium more than twice the maximum permissible level set by the USDA and WHO, and significantly higher than other areas of Thailand. Surface water samples indicate that areas near e-waste recycling activities, particularly the burning of e-waste products, show increased levels of cadmium, lead and copper in surface waters. This is of particular concern given that many of the surface waters tested are used in the irrigation of crops. Surface wipe samples measured concentrations of metals commonly associated with e-waste, suggesting a danger of ingestion of metals during cooking and other activities. Of particular concern is the relevance of surface contamination by metals to child health. Finally, air sampling showed that the burning of e-waste presents a serious health hazard to workers and the environment through inhalation and deposition (note 2). Our research suggests a need for improved methods of e-waste recycling that allow workers to continue this valuable revenue stream in a sustainable fashion that protects both human and environmental health.
Note 1: Statistical analysis to be finished in October 2017 due to follow-up field studies occurring in July and August 2017.
Note 2: Still awaiting complete analytic results.
Keywords: e-waste, environmental contamination, informal recycling, metals
Procedia PDF Downloads 362
4460 Assessing Spatio-Temporal Growth of Kochi City Using Remote Sensing Data
Authors: Navya Saira George, Patroba Achola Odera
Abstract:
This study aims to determine the spatio-temporal expansion of Kochi City, situated on the west coast of Kerala State in India. Remote sensing and GIS techniques have been used to determine the land use/cover and urban expansion of the city. Classification of Landsat images from the years 1973, 1988, 2002 and 2018 has been used to reproduce a visual story of the growth of the city over a period of 45 years. An accuracy range of 0.79-0.86 is achieved, with kappa coefficients in the range of 0.69-0.80. Results show that the areas covered by vegetation and water bodies decreased progressively from 53.0% to 30.1% and from 34.1% to 26.2%, respectively, while built-up areas increased steadily from 12.5% to 42.2% over the entire study period (1973-2018). The shift in land use from agriculture to non-agriculture may be attributed to the land reforms since the 1980s.
Keywords: Geographical Information Systems, Kochi City, land use/cover, remote sensing, urban sprawl
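For reference, the kappa coefficient reported above can be computed from a classification confusion matrix as sketched below; the 3x3 matrix values are invented, standing in for hypothetical built-up, vegetation, and water classes.

```python
import numpy as np

def kappa(confusion: np.ndarray) -> float:
    """Cohen's kappa from a square confusion matrix (rows: reference, cols: mapped)."""
    total = confusion.sum()
    po = np.trace(confusion) / total                              # observed agreement
    pe = (confusion.sum(0) * confusion.sum(1)).sum() / total**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical accuracy assessment for three land-cover classes.
cm = np.array([[52,  4,  2],    # built-up
               [ 6, 60,  5],    # vegetation
               [ 1,  3, 47]])   # water
print(f"overall accuracy = {np.trace(cm) / cm.sum():.2f}, kappa = {kappa(cm):.2f}")
```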
Procedia PDF Downloads 129
4459 Petri Net Modeling and Simulation of a Call-Taxi System
Authors: T. Godwin
Abstract:
A call-taxi system is a type of taxi service in which a taxi can be requested through a phone call or mobile app. The schematic functioning of a call-taxi system is modeled using a Petri net, which captures the conditions for a taxi to be assigned by a dispatcher to pick up a customer as well as the conditions for the taxi to be released by the customer. A Petri net is a graphical modeling tool used to understand sequences, concurrences, and confluences of activities in the working of discrete event systems. It uses tokens on a directed bipartite multigraph to simulate the activities of a system. The Petri net model is translated into a simulation model, and a call-taxi system is simulated. The simulation model helps in evaluating the operation of a call-taxi system based on the fleet size as well as the operating policies for call-taxi assignment and empty call-taxi repositioning. The developed Petri net based simulation model can be used to decide the fleet size as well as the call-taxi assignment policies for a call-taxi system.
Keywords: call-taxi, discrete event system, Petri net, simulation modeling
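The sketch below implements the token-game mechanics behind such a model for a stripped-down net with places for waiting customers, idle taxis, and busy taxis; the place and transition names, the initial marking, and the random transition selection are illustrative, not the full net or dispatching policy of the paper.

```python
import random

# Places and an illustrative initial marking (tokens): 3 idle taxis, no requests yet.
marking = {"waiting_customer": 0, "idle_taxi": 3, "busy_taxi": 0, "served": 0}

# Transitions: name -> (input places with arc weights, output places with arc weights).
transitions = {
    "call_arrives": ({}, {"waiting_customer": 1}),
    "assign_taxi":  ({"waiting_customer": 1, "idle_taxi": 1}, {"busy_taxi": 1}),
    "release_taxi": ({"busy_taxi": 1}, {"idle_taxi": 1, "served": 1}),
}

def enabled(name):
    """A transition is enabled when every input place holds enough tokens."""
    inputs, _ = transitions[name]
    return all(marking[p] >= w for p, w in inputs.items())

def fire(name):
    """Firing consumes input tokens and produces output tokens."""
    inputs, outputs = transitions[name]
    for p, w in inputs.items():
        marking[p] -= w
    for p, w in outputs.items():
        marking[p] += w

random.seed(0)
for _ in range(30):                       # crude random token-game run
    candidates = [t for t in transitions if enabled(t)]
    fire(random.choice(candidates))

print(marking)   # e.g. how many customers ended up served with a 3-taxi fleet
```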
Procedia PDF Downloads 424
4458 Photovoltaic Maximum Power-Point Tracking Using Artificial Neural Network
Authors: Abdelazziz Aouiche, El Moundher Aouiche, Mouhamed Salah Soudani
Abstract:
Renewable energy sources now contribute significantly to the replacement of traditional fossil fuel energy sources. One of the most potent types of renewable energy, which has developed quickly in recent years, is photovoltaic energy. Solar energy, which is sustainable and non-depleting, is arguably the best-known form of energy at our disposal. Due to changing weather conditions, the primary drawback of conventional solar PV cells is their inability to track their maximum power point. In this study, we apply artificial neural networks (ANN) to automatically track and measure the maximum power point (MPP) of solar panels. The complete system is simulated in MATLAB, and the results are adjusted for the external environment. The results show better performance than traditional MPPT methods and demonstrate the advantages of using neural networks in solar PV systems.
Keywords: modeling, photovoltaic panel, artificial neural networks, maximum power point tracking
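A minimal sketch of the tracking idea follows: a small neural network is trained to map irradiance and temperature to the voltage at the maximum power point, here against a toy analytic PV model; the PV equations, parameters, network size, and training ranges are assumptions for illustration and do not reproduce the MATLAB simulation in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

def pv_power(v, irradiance, temp_c):
    """Toy PV model (illustrative parameters, not a validated panel model)."""
    i_ph = 8.0 * irradiance / 1000.0             # photocurrent scales with irradiance
    v_oc = 36.0 - 0.08 * (temp_c - 25.0)         # open-circuit voltage drops with temperature
    i = i_ph * (1.0 - np.exp((v - v_oc) / 2.0))  # crude exponential knee near v_oc
    return np.clip(i, 0.0, None) * v

# Build a training set: for random conditions, find V_mpp by brute-force search.
G = rng.uniform(200, 1000, 2000)                 # irradiance [W/m^2]
T = rng.uniform(10, 60, 2000)                    # cell temperature [degC]
v_grid = np.linspace(0.1, 40.0, 400)
v_mpp = np.array([v_grid[np.argmax(pv_power(v_grid, g, t))] for g, t in zip(G, T)])

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(np.column_stack([G, T]), v_mpp)

# The trained network predicts the operating voltage to command for new conditions.
print(net.predict([[800.0, 35.0]]))   # predicted V_mpp for 800 W/m^2 at 35 degC
```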
Procedia PDF Downloads 88
4457 Financing the Welfare State in the United States: The Recent American Economic and Ideological Challenges
Authors: Rafat Fazeli, Reza Fazeli
Abstract:
This paper focuses on the study of the welfare state and the social wage in the leading liberal economy, the United States. The welfare state gained broad acceptance as a major socioeconomic achievement of liberal democracy in the Western industrialized countries during the postwar boom period. The modern and modified vision of capitalist democracy offered, on the one hand, the possibility of a high growth rate and, on the other hand, the possibility of the continued progression of a comprehensive system of social support for a wider population. The economic crises of the 1970s provided the ground for a great shift in economic policy and ideology in several Western countries, most notably the United States and the United Kingdom (and, to a lesser extent, Canada under Prime Minister Brian Mulroney). In the 1980s, the free-market-oriented reforms undertaken under Reagan and Thatcher greatly affected the economic outlook not only of the United States and the United Kingdom but of the whole Western world. The movement behind this shift in policy is often called neo-conservatism. The neoconservatives blamed the transfer programs for the decline in economic performance during the 1970s and argued that cuts in spending were required to return to the golden age of full employment. The agendas of both the Reagan and Thatcher administrations were to roll back the welfare state, and their budgets included a wide range of cuts to social programs. The question is how successful Reagan's and Thatcher's retrenchment efforts were. The paper involves an empirical study concerning the distributive role of the welfare state in the two countries. Other studies have often concentrated on the redistributive effect of fiscal policy on different income brackets. This study examines the net benefit/burden position of the working population with respect to state expenditures and taxes in the postwar period. This measurement will enable us to find out whether the working population has received a net gain (or net social wage). The study will discuss how the expansion of social expenditures and the trend of the 'net social wage' can be linked to distinct forms of economic and social organization. It provides an empirical foundation for analyzing the growing significance of the 'social wage', or the collectivization of consumption, and the share of social or collective consumption in the total consumption of the working population in recent decades. The paper addresses three other major questions. The first question is whether the expansion of social expenditures has posed any drag on capital accumulation and economic growth. The findings of this study provide an analytical foundation for evaluating the neoconservative claim that the welfare state is itself the source of the economic stagnation that leads to the crisis of the welfare state. The second question is whether the increasing ideological challenges from the right and the competitive pressures of globalization have led to the retrenchment of the American welfare state in recent decades. The third question is how social policies have performed in the presence of rising inequalities in recent decades.
Keywords: the welfare state, social wage, the United States, limits to growth
Procedia PDF Downloads 209
4456 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution
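For orientation, the classical isotropic (Clarke) case gives the textbook Doppler-shift density and the corresponding Jakes power spectral density shown below, where f_d is the maximum Doppler frequency and sigma^2 the average received power; the non-isotropic and line-of-sight cases derived in the paper generalize this form, so the equations below are the standard baseline result, not the authors' generalized expressions.

```latex
% Clarke (isotropic scattering) Doppler-shift PDF and Doppler power spectral density:
p_F(f) = \frac{1}{\pi\sqrt{f_d^{2}-f^{2}}}, \qquad |f| < f_d,
\qquad
S(f) = \frac{\sigma^{2}}{\pi f_d \sqrt{1-\left(f/f_d\right)^{2}}}, \qquad |f| < f_d .
```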
Procedia PDF Downloads 372
4455 The Gender Criteria of Film Criticism: Creating the ‘Big’, Avoiding the Important
Authors: Eleni Karasavvidou
Abstract:
Social and anthropological research, parallel to gender studies, has highlighted the relationship between social structures and symbolic forms as an important field of interaction and a record of 'social trends', since the study of representations can contribute to the understanding of the social functions and power relations they encompass. This 'mirage', however, has to do not only with the representations themselves but also with the ways they are received and with the film or critical narratives that are established as dominant or alternative. Cinema and the criticism of its cultural products are no exception. Even in the rapidly changing media landscape of the 21st century, movies remain an integral and widespread part of popular culture, making films an extremely powerful means of 'legitimizing' or 'delegitimizing' visions of domination and commonsensical gender stereotypes throughout society. And yet it is film criticism, the 'language per se', that legitimizes, reinforces, rewards and reproduces (or at least ignores) the stereotypical depictions of female roles that remain common in the realm of film images. This creates the need for academic research that questions the gender criteria of film reviews as part of the effort toward an inclusive art and society. Qualitative content analysis is used to examine female roles in selected Oscar-nominated films against their reviews from leading websites and newspapers. This method was chosen because of the complex nature of the depictions in the films and the narratives they evoke. The films were divided into basic scenes depicting social functions, such as love and work relationships and positions of power and their exercise, which were analyzed by content analysis, with borrowings from structuralism (Genette) and the local/universal images of intercultural philology (Wierlacher). In addition to the measurement of overall 'representation time' by gender, other qualitative characteristics were also analyzed, such as speaking time, sayings or key actions, and the overall quality of the character's action in relation to the development of the scenario and social representations in general, as well as quantitative ones (the insufficient number of female lead roles, fewer key supporting roles, and the relatively few female directors and people in the production chain) and how they might affect screen representations. The quantitative analysis in this study was used to complement the qualitative content analysis. The focus then shifted to the criteria of film criticism and to the rhetorical narratives that exclude or highlight in relation to gender identities and functions. In the criteria and language of film criticism, stereotypes are often reproduced or allegedly overturned within the framework of an apolitical 'identity politics', which mainly addresses the surface of a self-referential cultural-consumer product without connecting it more deeply with material and cultural life. One of the prime examples of this failure is the Bechdel Test, which tracks whether female characters speak in a film regardless of whether women's stories are represented in the films analyzed. If supposedly unbiased male filmmakers still fail to tell truly feminist stories, the same is the case with the criteria of criticism and the related interventions.
Keywords: representations, content analysis, reviews, sexist stereotypes
Procedia PDF Downloads 83
4454 Solving the Quadratic Programming Problem Using a Recurrent Neural Network
Authors: A. A. Behroozpoor, M. M. Mazarei
Abstract:
In this paper, a fuzzy recurrent neural network is proposed for solving the classical quadratic programming problem subject to linear equality and bound constraints. The convergence of the state variables of the proposed neural network to the optimal solution is guaranteed.
Keywords: recurrent neural network, quadratic programming, convergence, optimization
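As a rough sketch of how a recurrent network of this kind can be simulated, the code below integrates a projection-type neural dynamic for a box-constrained convex QP with forward Euler; the specific QP, step size, and network form are illustrative assumptions and do not reproduce the fuzzy recurrent network proposed in the paper.

```python
import numpy as np

# Illustrative convex QP: minimize 0.5 x'Qx + c'x subject to l <= x <= u.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
c = np.array([-8.0, -6.0])
l = np.array([0.0, 0.0])
u = np.array([1.5, 2.0])

def project(x):
    """Projection onto the box [l, u]."""
    return np.clip(x, l, u)

# Projection neural network dynamics: dx/dt = -x + P(x - (Qx + c)).
# Simulated with forward Euler; the equilibrium is the QP solution.
x = np.zeros(2)
dt = 0.05
for _ in range(2000):
    x = x + dt * (-x + project(x - (Q @ x + c)))

print("approximate solution:", x)                              # upper bound on x1 is active here
print("fixed-point residual:", np.linalg.norm(x - project(x - (Q @ x + c))))  # ~ 0 at the optimum
```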
Procedia PDF Downloads 644
4453 Floating Building Potential for Adaptation to Rising Sea Levels: Development of a Performance Based Building Design Framework
Authors: Livia Calcagni
Abstract:
Most of the largest cities in the world are located in areas that are vulnerable to coastal erosion and flooding, both linked to climate change and rising sea levels (RSL). Nevertheless, more and more people are moving to these vulnerable areas as cities keep growing. Architects, engineers and policy makers are called to rethink the way we live and to provide timely and adequate responses, not only by investigating measures to improve the urban fabric, but also by developing strategies capable of planning change and exploring unusual and resilient frontiers of living, such as floating architecture. Since the beginning of the 21st century, we have seen a dynamic growth of water-based architecture. At the same time, the shortage of land available for urban development has also led to reclaiming the seabed or building floating structures. In light of these considerations, the time is ripe to consider floating architecture not only as a full-fledged building typology but especially as a full-fledged adaptation solution for RSL. Currently, there is no global international legal framework for urban development on water, and there is no structured performance based building design (PBBD) approach for floating architecture in most countries, let alone national regulatory systems. Thus, the research intends to identify the technological, morphological, functional, economic and managerial requirements that must be considered in the development of the PBBD framework, conceived as a meta-design tool. As floating urban development is most likely to take place as an extension of coastal areas, the needs and design criteria are definitely more similar to those of the urban environment than to those of the offshore industry. Therefore, the identification and categorization of parameters takes urban-architectural guidelines and regulations as the starting point, taking the missing aspects, such as hydrodynamics, from the offshore and shipping regulatory frameworks. This study is carried out through an evidence-based assessment of performance guidelines and regulatory systems that are effective in different countries around the world, addressing on-land and on-water architecture as well as the offshore and shipping industries. It involves evidence-based research and logical argumentation methods. Overall, this paper highlights how inhabiting water is not only a viable response to the problem of RSL, and thus a resilient frontier for urban development, but also a response to energy insecurity, clean water and food shortages, environmental concerns and urbanization, in line with Blue Economy principles and the Agenda 2030. Moreover, the discipline of architecture is presented as a fertile field for investigating solutions to cope with climate change and its effects on life safety and quality. Future research involves the development of a decision support system as an information tool to guide the user through the decision-making process, emphasizing the logical interaction between the different potential choices, based on the PBBD.
Keywords: adaptation measures, floating architecture, performance based building design, resilient architecture, rising sea levels
Procedia PDF Downloads 86
4452 Variation of Clinical Manifestations of COVID-19 Over Time of Pandemic
Authors: Mahdi Asghari Ozma, Fatemeh Aghamohammadzadeh, Mahin Ahangar Oskouee
Abstract:
In late 2019, the world was confronted with a new infection caused by a coronavirus named SARS-CoV-2 (COVID-19), which disseminated around the world quickly. This infection can affect various systems of the body, including the respiratory, gastrointestinal, urinary, and hematologic systems, and it can be transmitted via various body samples in different ways. To control this rapidly transmitted infection by preventing its transmission to other people, rapid diagnosis is vital; this can be done by examining the patient's clinical symptoms and also by using various serological, molecular, and radiological methods. Symptoms caused by COVID-19 include fever, cough, sore throat, headache, fatigue, shortness of breath, loss of taste or smell, skin rash, myalgia, and conjunctivitis. These clinical features appeared gradually over different time periods from the onset of the infection, and patients showed varied and new symptoms at different times, which demonstrates the variation of symptoms over time during the spread of the infection.
Keywords: COVID-19, diagnosis, symptom, variation, novel coronavirus
Procedia PDF Downloads 86
4451 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning
Authors: Ali Kazemi
Abstract:
The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated vulnerability to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data used in this study included the monetary value of each transaction, a crucial feature since fraudulent transactions may have amount distributions different from legitimate ones; timestamps indicating when transactions occurred, since analyzing transactions' temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant where the transaction occurred, such as retail, groceries, or online services, since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), since different payment methods have varying risk levels associated with fraud. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies. The autoencoder model leverages its reconstruction error mechanism to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies based on their susceptibility to isolation from the dataset's majority. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reduce the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis
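A compact sketch of the isolation-forest half of the approach on synthetic transaction amounts and hours of day is shown below; the contamination rate, the two features, and the data are illustrative stand-ins for the anonymized banking dataset described above.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(4)

# Synthetic "legitimate" transactions: daytime activity, modest amounts (log-normal).
n_normal = 5000
normal = np.column_stack([
    rng.lognormal(mean=3.5, sigma=0.8, size=n_normal),   # amount
    np.clip(rng.normal(14, 4, n_normal), 0, 23.99),      # hour of day
])
# Synthetic "fraud": large amounts in the middle of the night.
fraud = np.column_stack([
    rng.lognormal(mean=6.5, sigma=0.5, size=50),
    rng.uniform(1, 4, 50),
])
X = np.vstack([normal, fraud])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(X)

flags = model.predict(X)              # -1 = anomaly, +1 = normal
caught = np.sum(flags[-50:] == -1)    # how many injected frauds were flagged
print(f"flagged {np.sum(flags == -1)} transactions, including {caught}/50 injected frauds")
```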
Procedia PDF Downloads 57
4450 Some Probiotic Traits of Lactobacillus Strains Isolated from Pollen
Authors: Hani Belhadj, Daoud Harzallah, Seddik Khennouf, Saliha Dahamna, Mouloud Ghadbane
Abstract:
In this study, Lactobacillus strains isolated from pollen were identified by means of phenotypic and genotypic methods. At pH 2, most strains proved to be acid resistant, with losses in cell viability ranging from 0.77 to 4.04 log orders. In addition, at pH 3, all strains could grow and resist the acidic conditions, with losses in cell viability ranging from 0.40 to 3.61 log orders. Bile salts at 0.3% and 0.5% did not greatly affect the survival of most strains, excluding Lactobacillus sp. BH1398; survival ranged from 81.0±3.5 to 93.5±3.9%. In contrast, in the presence of 1.0% bile salts, the survival of five strains decreased by more than 50%. Lactobacillus fermentum BH1509 was considered the most tolerant strain (77.5% survival at 1% bile), followed by Lactobacillus plantarum BH1541 (59.9% at 1% bile). Furthermore, all strains were resistant to colistin, clindamycin, chloramphenicol, and ciprofloxacin, but most of the strains were susceptible to penicillin, oxacillin, oxytetracycline, and amoxicillin. Functionally interesting Lactobacillus isolates may be used in the future as probiotic cultures for manufacturing fermented foods and as bioactive delivery systems.
Keywords: probiotics, Lactobacillus, pollen, bile, acid tolerance
Procedia PDF Downloads 420
4449 Computer Aided Classification of Architectural Distortion in Mammograms Using Texture Features
Authors: Birmohan Singh, V. K. Jain
Abstract:
Computer aided diagnosis systems provide vital support to radiologists in the detection of early signs of breast cancer in mammogram images. Masses, microcalcifications, and architectural distortions are the major abnormalities. In this paper, a computer aided diagnosis system is proposed for distinguishing abnormal mammograms showing architectural distortion from normal mammograms. Four types of texture features (GLCM, GLRLM, fractal, and spectral texture features) are extracted from the regions of suspicion. A support vector machine has been used as the classifier in this study. The proposed system yielded an overall sensitivity of 96.47% and an accuracy of 96% for the detection of abnormalities on mammogram images collected from the Digital Database for Screening Mammography (DDSM).
Keywords: architectural distortion, mammograms, GLCM texture features, GLRLM texture features, support vector machine classifier
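A brief sketch of the GLCM branch of such a feature set is given below, using scikit-image and an SVM; the synthetic patches, GLCM distances, angles, and properties are illustrative choices, and depending on the scikit-image version the functions are named graycomatrix/graycoprops (newer releases) or greycomatrix/greycoprops (older ones).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

def glcm_features(patch):
    """Contrast, homogeneity, energy and correlation from a 4-direction GLCM."""
    glcm = graycomatrix(patch, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def synthetic_patch(distorted):
    """Stand-in ROI: smooth background, plus crude oriented streaks if 'distorted'."""
    patch = rng.normal(128, 20, (64, 64))
    if distorted:
        patch[::4, :] += 40.0
    return np.clip(patch, 0, 255).astype(np.uint8)

X = np.array([glcm_features(synthetic_patch(d)) for d in ([0] * 40 + [1] * 40)])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf", C=10, gamma="scale")
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```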
Procedia PDF Downloads 491
4448 All-Silicon Raman Laser with Quasi-Phase-Matched Structures and Resonators
Authors: Isao Tomita
Abstract:
The principle of an all-silicon Raman laser for an output wavelength of 1.3 μm is presented, which employs quasi-phase-matched structures and resonators to enhance the output power. 1.3-μm laser beams for GE-PONs in FTTH systems generated from a silicon device are very important because such a silicon device can be monolithically integrated with the silicon planar lightwave circuits (Si PLCs) used in the GE-PONs. This reduces the device fabrication processes and time, and also the optical losses at the junctions between the optical waveguides of the Si PLCs and the Si laser devices, when compared with the 1.3-μm III-V semiconductor lasers currently mounted on Si PLCs. We show that the quasi-phase-matched Si Raman laser with resonators can produce up to about 174 times larger laser power at 1.3 μm than that without resonators, for a Si waveguide with a Raman gain of 20 cm/GW and an optical loss of 1.2 dB/cm, pumped at a power of 10 mW, where the length of the waveguide is 3 mm and its cross-section is (1.5 μm)².
Keywords: all-silicon Raman laser, FTTH, GE-PON, quasi-phase-matched structure, resonator
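For context, the interplay between Raman gain and propagation loss that sets the attainable Stokes output can be written, under the standard undepleted-pump continuous-wave assumption, as the evolution equation below, where g_R is the Raman gain coefficient, A_eff the effective mode area, and alpha_s, alpha_p the Stokes and pump losses. This is the textbook stimulated-Raman relation, shown only for orientation; it is not the authors' full quasi-phase-matched resonator model and omits silicon-specific effects such as two-photon and free-carrier absorption.

```latex
% CW Stokes power evolution under an undepleted pump (simplified, illustrative):
\frac{dP_s}{dz} = \frac{g_R}{A_{\mathrm{eff}}}\,P_p(z)\,P_s(z) - \alpha_s P_s(z),
\qquad P_p(z) \approx P_p(0)\,e^{-\alpha_p z}.
```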
Procedia PDF Downloads 254
4447 Small Target Recognition Based on Trajectory Information
Authors: Saad Alkentar, Abdulkareem Assalem
Abstract:
Recognizing small targets has always posed a significant challenge in image analysis. Over long distances, the image signal-to-noise ratio tends to be low, limiting the amount of useful information available to detection systems. Consequently, visual target recognition becomes an intricate task to tackle. In this study, we introduce a Track Before Detect (TBD) approach that leverages target trajectory information (coordinates) to effectively distinguish between noise and potential targets. By reframing the problem as a multivariate time series classification, we have achieved remarkable results. Specifically, our TBD method achieves an impressive 97% accuracy in separating target signals from noise within a mere half-second time span (consisting of 10 data points). Furthermore, when classifying the identified targets into our predefined categories (airplane, drone, and bird), we achieve an outstanding classification accuracy of 96% over a more extended period of 1.5 seconds (comprising 30 data points).
Keywords: small targets, drones, trajectory information, TBD, multivariate time series
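As a rough sketch of the track-before-detect classification idea, the code below builds short (x, y) trajectory windows for synthetic noise and target tracks and classifies them from simple kinematic summary features; the window length of 10 points matches the half-second span mentioned above, while the motion models, features, and classifier are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
window = 10   # ten coordinate samples, i.e. the half-second span

def noise_track():
    """Uncorrelated false detections scattered around a point."""
    return rng.normal(0, 5.0, (window, 2))

def target_track():
    """Near-constant-velocity motion plus small measurement noise."""
    start = rng.normal(0, 5.0, 2)
    velocity = rng.normal(0, 2.0, 2)
    steps = start + np.outer(np.arange(window), velocity)
    return steps + rng.normal(0, 0.3, (window, 2))

def features(track):
    """Simple kinematic descriptors of one trajectory window."""
    steps = np.diff(track, axis=0)
    speed = np.linalg.norm(steps, axis=1)
    heading = np.arctan2(steps[:, 1], steps[:, 0])
    straightness = np.linalg.norm(track[-1] - track[0]) / (speed.sum() + 1e-9)
    return [speed.mean(), speed.std(), np.abs(np.diff(heading)).mean(), straightness]

X = np.array([features(noise_track()) for _ in range(300)] +
             [features(target_track()) for _ in range(300)])
y = np.array([0] * 300 + [1] * 300)   # 0 = noise, 1 = target

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```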
Procedia PDF Downloads 47
4446 Sliding Mode Control of Bilateral Teleoperation System with Time Delay
Authors: Ahmad Forouzantabar, Mohammad Azadi
Abstract:
This paper presents a sliding mode controller for bilateral teleoperation systems with a robotic master and slave under constant communication delays. We extend the passivity-based coordination architecture to enhance position and force tracking in the presence of offsets in initial conditions, environmental contacts, and unknown parameters such as the friction coefficient. To address these difficulties, a nonlinear sliding mode controller is designed to approximate the nonlinear dynamics of the master and slave robots and improve both position and force tracking. Using Lyapunov theory, the boundedness of the master-slave tracking errors and the stability of the teleoperation system are also guaranteed. Numerical simulations show that the position and force tracking performance of the proposed controller is superior to that of the conventional coordination controller.
Keywords: Lyapunov stability, teleoperation system, time delay, sliding mode controller
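To illustrate the control law itself, the sketch below applies a basic sliding mode controller to a single-joint plant with unknown friction tracking a sinusoidal reference; the plant parameters, gains, and the boundary-layer (tanh) smoothing are illustrative assumptions and do not represent the bilateral master-slave architecture or delays analyzed in the paper.

```python
import numpy as np

# Toy 1-DOF plant: J*q_dd + b*q_d + d(t) = u, with friction b unknown to the controller.
J_true, b_true = 0.8, 0.4
J_hat = 1.0                        # controller's (imperfect) inertia estimate

dt, T = 0.001, 5.0
n = int(T / dt)
lam, K, phi = 8.0, 15.0, 0.05      # sliding surface slope, switching gain, boundary layer

q, qd = 0.0, 0.0
err_log = []
for k in range(n):
    t = k * dt
    # Reference trajectory and its derivatives.
    qr, qr_d, qr_dd = np.sin(t), np.cos(t), -np.sin(t)
    e, e_d = q - qr, qd - qr_d
    s = e_d + lam * e                             # sliding variable
    # Equivalent control plus smoothed switching term (tanh to reduce chattering).
    u = J_hat * (qr_dd - lam * e_d) - K * np.tanh(s / phi)
    # Simulate the true plant with an unmodeled disturbance.
    d = 0.3 * np.sin(5 * t)
    qdd = (u - b_true * qd - d) / J_true
    qd += qdd * dt
    q += qd * dt
    err_log.append(abs(e))

print(f"mean |tracking error| over the last second: {np.mean(err_log[-1000:]):.4f}")
```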
Procedia PDF Downloads 385
4445 Towards Interconnectedness: A Study of Collaborative School Culture and Principal Curriculum Leadership
Authors: Fan Chih-Wen
Abstract:
The Ministry of Education (2014) released the 12-Year National Basic Education Curriculum Syllabus. Curriculum implementation has evolved from a loose connection of cooperation to a closely structured relationship of coordination and collaboration. Collaboration opens the door of teachers' isolated classrooms and culture and allows them to discuss educational issues from multiple perspectives and achieve shared goals. The purpose of this study is to investigate the facilitating factors of a collaborative school culture and the implications for principals' curriculum leadership. The development and implementation of the new curriculum involve collaborative governance across systems and levels, including cooperation between central government and schools. First, the paper analyzes the connotation of the 12-year national basic education curriculum; second, it analyzes the meaning of collaborative culture; third, it analyzes the motivating factors of a collaborative culture. Finally, based on this, it puts forward relevant suggestions for principals' curriculum leadership.
Keywords: curriculum leadership, collaboration culture, teacher culture, school improvement
Procedia PDF Downloads 22
4444 Comparing Radiographic Detection of Simulated Syndesmosis Instability Using Standard 2D Fluoroscopy Versus 3D Cone-Beam Computed Tomography
Authors: Diane Ghanem, Arjun Gupta, Rohan Vijayan, Ali Uneri, Babar Shafiq
Abstract:
Introduction: Ankle sprains and fractures often result in syndesmosis injuries. Unstable syndesmotic injuries result from relative motion between the distal ends of the tibia and fibula, an anatomic juncture which should otherwise be rigid, and warrant operative management. Clinical and radiological evaluation of intraoperative syndesmosis stability remains a challenging task, as traditional 2D fluoroscopy is limited to uniplanar translational displacement. The purpose of this pilot cadaveric study is to compare the 2D fluoroscopic and 3D cone beam computed tomography (CBCT) stress-induced syndesmosis displacements. Methods: Three fresh-frozen lower legs underwent 2D fluoroscopy and 3D CIOS CBCT to measure syndesmosis position before dissection. Syndesmotic injury was simulated by resecting (1) the anterior inferior tibiofibular ligament (AITFL), (2) the posterior inferior tibiofibular ligament (PITFL) and the inferior transverse ligament (ITL) simultaneously, followed by (3) the interosseous membrane (IOM). Manual external rotation and the Cotton stress test were performed after each of the three resections, and 2D and 3D images were acquired. Relevant 2D and 3D parameters included the tibiofibular overlap (TFO), tibiofibular clear space (TCS), relative rotation of the fibula, and anterior-posterior (AP) and medial-lateral (ML) translations of the fibula relative to the tibia. Parameters were measured by two independent observers. Inter-rater reliability was assessed by the intraclass correlation coefficient (ICC) to determine measurement precision. Results: Significant mismatches were found in the trends between the 2D and 3D measurements when assessing TFO, TCS and AP translation across the different resection states. Using 3D CBCT, TFO was inversely proportional to the number of resected ligaments, while TCS was directly proportional to the latter across all cadavers and 'resection + stress' states. Using 2D fluoroscopy, this trend was not respected under the Cotton stress test. 3D AP translation did not show a reliable trend, whereas 2D AP translation of the fibula was positive under the Cotton stress test and negative under external rotation. 3D relative rotation of the fibula, assessed using the Tang et al. ratio method and the Beisemann et al. angular method, suggested slight overall internal rotation with complete resection of the ligaments, with a change of less than 2 mm, the threshold which corresponds to the buffer commonly used to account for physiologic laxity as per the clinical judgment of the surgeon. Excellent agreement (>0.90) was found between the two independent observers for each of the parameters in both 2D and 3D (overall ICC 0.9968, 95% CI 0.995-0.999). Conclusions: The 3D CIOS CBCT appears to reliably depict the trends in TFO and TCS. This might be due to the additional detection of relevant rotational malpositions of the fibula in comparison with standard 2D fluoroscopy, which is limited to translation in a single plane. A better understanding of 3D imaging may help surgeons identify the precise measurement planes needed to achieve better syndesmosis repair.
Keywords: 2D fluoroscopy, 3D computed tomography, image processing, syndesmosis injury
Procedia PDF Downloads 70
4443 The Representation of J. D. Salinger’s Views on Changes in American Society in the 1940s in The Catcher in the Rye
Authors: Jessadaporn Achariyopas
Abstract:
This study aims to analyze the protagonist of The Catcher in the Rye in terms of the ideological concepts and narrative techniques that influence the construction of his representation, and the relationship between that representation and J. D. Salinger's views on changes in American society in the 1940s. This area of study concerns two theories, namely the theory of representation and narratology. In addition, this research is intended to answer the following three questions. Firstly, how is the production of meaning through language in The Catcher in the Rye constructed? Secondly, what are J. D. Salinger's views on changes in American society in the 1940s? Lastly, what is the relationship between the representation and J. D. Salinger's views? The findings showed that the protagonist's views, J. D. Salinger's views, and changes in American society in the 1940s are closely interrelated. The production of meaning, which constitutes the representation of the protagonist's views, was constructed through narrative techniques. J. D. Salinger's views on changes in American society in the 1940s were the same antisocial perspectives as Holden Caulfield's, namely phoniness, alienation and meltdown.
Keywords: representation, construction of the representation, systems of representation, phoniness, alienation, meltdown
Procedia PDF Downloads 321