Search results for: deep graphical model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18187

14767 Unveiling Game Designers’ Designing Practices: Five-Essential-Steps Model

Authors: Mifrah Ahmad

Abstract:

Game designing processes vary with the intentions of the game. Digital games have versatile starting and finishing processes, and these have been reported throughout the literature over decades. However, how game designers approach designing games in industry, and whether they consider existing models or frameworks to assist their design process, has yet to be examined. Therefore, this paper discusses the perspectives of 17 game designer participants on how they approach designing games and how their experience of designing various games influences their practice. The research was conducted in an Australian context through a phenomenological approach, in which semi-structured interviews were designed and grounded in John Dewey’s theory of experience. The audio data collected were analyzed using NVivo and interpreted through the interpretivist paradigm to contextualize the essence of game designers’ experiences in their practice and unfold their designing, developing, and iterative methodologies. As a result, a generic game-designing model is proposed that illuminates a sequence of steps enabling game designers’ initiatives toward a successful game design process. The ‘Five-Essential-Steps’ model (5ESM) for designing digital games may potentially assist early-career game designers, gaming researchers, and academics pursuing the designing process of games, educational games, or serious games.

Keywords: game designers practice, experiential design, designing models, game design approaches, designing process, software design, top-down model

Procedia PDF Downloads 48
14766 Automation of Savitsky's Method for Power Calculation of High Speed Vessel and Generating Empirical Formula

Authors: M. Towhidur Rahman, Nasim Zaman Piyas, M. Sadiqul Baree, Shahnewaz Ahmed

Abstract:

The design of high-speed craft has recently become one of the most active areas of naval architecture. Increased speed makes these vehicles more efficient and useful for military, economic, or leisure purposes. The planing hull is designed specifically to achieve relatively high speed on the surface of the water, and speed on the water surface is closely related to the size of the vessel and the installed power. The Savitsky method was first presented in 1964 for application to non-monohedric hulls and to stepped hulls, and it is well known as a reliable alternative to CFD analysis of hull resistance. A computer program based on Savitsky’s method has been developed using MATLAB, and the power of high-speed vessels has been computed in this research. First, the program reads principal parameters such as displacement, LCG, speed, deadrise angle, and inclination of the thrust line with respect to the keel line, and calculates the resistance of the hull using Savitsky’s empirical planing equations. However, some functions used in the empirical equations are available only in graphical form, which is not suitable for automatic computation. We use a digital plotting system to extract data from the nomograms, and the regression equations of those functions are derived using data from the different charts. As a result, the wetted length-beam ratio and trim angle can be determined directly from the input of initial variables, which makes the power calculation automated without manual plotting of secondary variables such as p/b and other coefficients. Finally, the trim angle, mean wetted length-beam ratio, frictional coefficient, resistance, and power are computed and compared with the results of Savitsky, and good agreement has been observed.
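The chart-digitization step described above can be sketched as follows: digitized points from a nomogram curve are fitted with a regression equation so that secondary variables can be computed without manual plotting. The data points and polynomial order below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical digitized (x, y) pairs read off one nomogram curve; real values
# would come from the digital plotting system described in the abstract.
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0.42, 0.55, 0.66, 0.74, 0.81, 0.86, 0.90])

# Fit a low-order polynomial as the regression equation that replaces the chart.
curve = np.poly1d(np.polyfit(x, y, deg=2))

def lookup(xq):
    """Replace a manual chart lookup with the fitted regression equation."""
    return float(curve(xq))
```

Once every chart is replaced by such a fitted function, the whole Savitsky resistance calculation can run without manual intermediate plotting.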

Keywords: nomogram, planing hull, principal parameters, regression

Procedia PDF Downloads 397
14765 A Comprehensive Study of Spread Models of Wildland Fires

Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, allocating resources for fire-fighting efforts, and planning preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing the fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and promotes our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, the cellular automata model, and others. The key characteristics that these models consider include weather (factors such as wind speed and direction), topography (factors like landscape elevation), and fuel availability (factors like types of vegetation), among others. The models discussed are physics-based, data-driven, or hybrid, with some also utilizing machine learning (ML) techniques such as attention-based neural networks to enhance model performance. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
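Among the model families named above, the cellular automata approach is the simplest to illustrate. The sketch below assumes a uniform fuel grid and a single spread probability; real models weight spread by wind, slope, and fuel type. With p_spread=1 the update rule is deterministic: the fire front expands one ring of 4-neighbours per step.

```python
import numpy as np

FUEL, BURNING, BURNED = 0, 1, 2

def step(grid, p_spread=1.0, rng=None):
    """One cellular-automaton update: burning cells burn out and ignite
    their 4-neighbour fuel cells with probability p_spread."""
    rng = rng or np.random.default_rng(0)
    new = grid.copy()
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] == BURNING:
                new[r, c] = BURNED
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr, cc] == FUEL:
                        if rng.random() < p_spread:
                            new[rr, cc] = BURNING
    return new

grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = BURNING          # single ignition point in the centre
for _ in range(2):
    grid = step(grid)         # with p_spread=1, one ring per step
```

Models such as FARSITE or FIRETEC replace the fixed probability with physically derived spread rates, but the update structure is analogous.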

Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling

Procedia PDF Downloads 76
14764 Uplift Segmentation Approach for Targeting Customers in a Churn Prediction Model

Authors: Shivahari Revathi Venkateswaran

Abstract:

Segmenting customers plays a significant role in churn prediction and helps the marketing team with proactive and reactive customer retention. For reactive retention, the retention team reaches out to customers who have already shown intent to disconnect by offering special deals. For proactive retention, the marketing team uses a churn prediction model, which ranks each customer from 1 to 100, with rank 1 indicating the highest risk of churn (low ranks have a high propensity to churn). The churn prediction model is built using XGBoost. However, with churn ranks alone, the marketing team can only reach out to customers based on their individual ranks; profiling different groups of customers and framing different marketing strategies for targeted groups is not possible. For this, customers must be grouped into different segments based on their profiles, such as demographics and other non-controllable attributes. This helps the marketing team frame different offer groups for the targeted audience and prevent them from disconnecting (proactive retention). For segmentation, machine learning approaches like k-means clustering will not form unique customer segments in which all customers share the same attributes. This paper presents an alternative approach that finds all combinations of unique segments that can be formed from the user attributes and then identifies the segments with uplift (a churn rate higher than the baseline churn rate). For this, search algorithms such as fast search and recursive search are used. Further, within each segment, customers can be targeted using individual churn ranks from the churn prediction model. Finally, a user interface (UI) is developed for the marketing team to interactively search the meaningful segments that are formed, target the right audience for future marketing campaigns, and prevent them from disconnecting.
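The enumerate-and-test-for-uplift idea can be sketched as follows. The attribute names, toy records, and exhaustive enumeration below are illustrative assumptions; the paper uses fast and recursive search over real customer attributes.

```python
from itertools import combinations

# Toy customer records; attribute names are illustrative, not the paper's.
customers = [
    {"region": "east", "plan": "basic",   "churned": 1},
    {"region": "east", "plan": "basic",   "churned": 1},
    {"region": "east", "plan": "premium", "churned": 0},
    {"region": "west", "plan": "basic",   "churned": 0},
    {"region": "west", "plan": "premium", "churned": 0},
    {"region": "west", "plan": "premium", "churned": 1},
]
attrs = ["region", "plan"]
baseline = sum(c["churned"] for c in customers) / len(customers)

def uplift_segments(customers, attrs, baseline):
    """Enumerate every attribute combination, form the unique segments it
    induces, and keep segments whose churn rate exceeds the baseline."""
    found = []
    for k in range(1, len(attrs) + 1):
        for combo in combinations(attrs, k):
            groups = {}
            for c in customers:
                key = tuple((a, c[a]) for a in combo)
                groups.setdefault(key, []).append(c["churned"])
            for key, churns in groups.items():
                rate = sum(churns) / len(churns)
                if rate > baseline:
                    found.append((dict(key), rate))
    return found

segments = uplift_segments(customers, attrs, baseline)
```

Each returned segment is a unique attribute-value profile with uplift; customers inside it can then be prioritised by their individual churn ranks.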

Keywords: churn prediction modeling, XGBoost model, uplift segments, proactive marketing, search algorithms, retention, k-means clustering

Procedia PDF Downloads 67
14763 Deep Convolutional Neural Network for Detection of Microaneurysms in Retinal Fundus Images at Early Stage

Authors: Goutam Kumar Ghorai, Sandip Sadhukhan, Arpita Sarkar, Debprasad Sinha, G. Sarkar, Ashis K. Dhara

Abstract:

Diabetes mellitus is one of the most common chronic diseases in all countries, and its prevalence continues to increase significantly. Diabetic retinopathy (DR) is damage to the retina that occurs with long-term diabetes and is a major cause of blindness in the Indian population. Therefore, its early diagnosis is of utmost importance for preventing progression toward imminent irreversible loss of vision, particularly in the huge population across rural India. The barriers to eye examination of all diabetic patients are socioeconomic factors, lack of referrals, poor access to the healthcare system, lack of knowledge, an insufficient number of ophthalmologists, and a lack of networking between physicians, diabetologists, and ophthalmologists. Diabetic patients often visit a healthcare facility for their general checkup, but their eye condition remains largely undetected until the patient becomes symptomatic. This work focuses on the design and development of a fully automated intelligent decision system for screening retinal fundus images to detect microaneurysms at the early stage of the disease. Automated detection of microaneurysms is a challenging problem due to variation in color and the variation introduced by the field of view, inhomogeneous illumination, and pathological abnormalities. We have developed a convolutional neural network for efficient detection of microaneurysms. A loss function is also developed to handle the severe class imbalance caused by the very small size of microaneurysms compared to the background. The network is able to locate the salient region containing microaneurysms in noisy images captured by non-mydriatic cameras. The ground truth of microaneurysms was created by expert ophthalmologists for the MESSIDOR database as well as a private database collected from Indian patients. The network is trained from scratch using the fundus images of the MESSIDOR database. The proposed method is evaluated on DIARETDB1 and the private database. The method is successful in detecting microaneurysms in both dilated and non-dilated fundus images acquired from different medical centres. The proposed algorithm could be used for the development of an AI-based, affordable, and accessible system to provide service at grassroots-level primary healthcare units spread across the country, catering to the needs of rural people unaware of the severe impact of DR.
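The abstract does not give the form of the imbalance-handling loss; one generic way to up-weight the rare lesion pixels relative to the abundant background is a positively weighted binary cross-entropy, sketched below with an illustrative weight (the paper's actual loss may differ).

```python
import numpy as np

def weighted_bce(y_true, y_pred, pos_weight=50.0, eps=1e-7):
    """Weighted binary cross-entropy: up-weights the rare positive (lesion)
    pixels relative to the abundant background, one generic way to handle
    severe class imbalance. pos_weight is an illustrative value."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1.0 - y_true) * np.log(1.0 - y_pred))
    return float(loss.mean())

y_true = np.array([0.0, 0.0, 0.0, 1.0])   # one lesion pixel among background
confident = weighted_bce(y_true, np.array([0.1, 0.1, 0.1, 0.9]))
missed = weighted_bce(y_true, np.array([0.1, 0.1, 0.1, 0.1]))
```

Missing the single lesion pixel is penalised far more heavily than being mildly uncertain about background, which is the behaviour an imbalance-aware loss is designed to produce.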

Keywords: retinal fundus image, deep convolutional neural network, early detection of microaneurysms, screening of diabetic retinopathy

Procedia PDF Downloads 136
14762 Agriculture and Global Economy vis-à-vis Climate Change

Authors: Assaad Ghazouani, Ati Abdessatar

Abstract:

Throughout the world, agriculture maintains a social and economic importance in the national economy. Its importance is distinguished by its ripple effects not only downstream but also upstream vis-à-vis the non-agricultural sector. However, the situation is relatively fragile because of weather conditions. In this work, we propose a model to highlight the impacts of climate change (CC) on economic growth in a world where agriculture is considered a strategic sector. CC is supposed to directly and indirectly affect economic growth by reducing the performance of the agricultural sector. The model is tested for Tunisia. The results validate the hypothesis that the potential economic damage of CC is important. Indeed, an increase in CO2 concentration (temperatures and disruption of rainfall patterns) will have an impact on global economic growth, particularly by reducing the performance of the agricultural sector. Analysis from a vector error correction model also highlights the magnitude of the climate impact on the performance of the agricultural sector and its repercussions on economic growth.

Keywords: climate change, agriculture, economic growth, world, VECM, cointegration

Procedia PDF Downloads 615
14761 A Selection Approach: Discriminative Model for Nominal Attributes-Based Distance Measures

Authors: Fang Gong

Abstract:

Distance measures are an indispensable part of many instance-based learning (IBL) and machine learning (ML) algorithms. The value difference metric (VDM) and the inverted specific-class distance measure (ISCDM) are among the top-performing distance measures that address nominal attributes. VDM performs well in some domains owing to its simplicity, but poorly in others where missing values and non-class attribute noise exist; ISCDM, however, typically works better than VDM on such domains. To maximize their advantages and avoid their disadvantages, this paper proposes a selection approach: a discriminative model for nominal attributes-based distance measures. More concretely, VDM and ISCDM are built independently on a training dataset at the training stage, and the more credible one is recorded for each training instance. At the test stage, the nearest neighbor of each test instance is first found by either VDM or ISCDM, and the more reliable model of that nearest neighbor is then chosen to predict the test instance's class label. This approach is simply denoted as the discriminative distance measure (DDM). Experiments are conducted on 34 University of California at Irvine (UCI) machine learning repository datasets, and the results show that DDM retains the interpretability and simplicity of VDM and ISCDM but significantly outperforms the original VDM and ISCDM, as well as other state-of-the-art competitors, in terms of accuracy.
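For readers unfamiliar with VDM, a minimal sketch follows: the distance between two nominal values is the difference between their class-conditional probabilities, summed over classes and attributes. The toy dataset and the exponent q=2 are illustrative assumptions.

```python
from collections import Counter, defaultdict

# Toy training set of (attribute values, class label); names are illustrative.
train = [
    (("red", "small"), "yes"),
    (("red", "large"), "yes"),
    (("blue", "small"), "no"),
    (("blue", "large"), "no"),
    (("red", "small"), "yes"),
]

def conditional_probs(train, attr_idx):
    """Estimate P(class | attribute value) for one nominal attribute."""
    counts = defaultdict(Counter)
    for attrs, label in train:
        counts[attrs[attr_idx]][label] += 1
    return {v: {c: n / sum(cnt.values()) for c, n in cnt.items()}
            for v, cnt in counts.items()}

def vdm(a, b, train, q=2):
    """Value difference metric between two instances over all attributes."""
    classes = {label for _, label in train}
    dist = 0.0
    for i in range(len(a)):
        probs = conditional_probs(train, i)
        for c in classes:
            pa = probs.get(a[i], {}).get(c, 0.0)
            pb = probs.get(b[i], {}).get(c, 0.0)
            dist += abs(pa - pb) ** q
    return dist

same = vdm(("red", "small"), ("red", "small"), train)
diff = vdm(("red", "small"), ("blue", "large"), train)
```

Identical instances are at distance zero, while instances whose attribute values imply different class distributions are far apart; the proposed DDM keeps this metric but switches to ISCDM per-instance when ISCDM is the more credible model.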

Keywords: distance measure, discriminative model, nominal attributes, nearest neighbor

Procedia PDF Downloads 108
14760 Sea-Spray Calculations Using the MESO-NH Model

Authors: Alix Limoges, William Bruch, Christophe Yohia, Jacques Piazzola

Abstract:

A number of questions arise concerning the long-term impact of marine aerosol fluxes generated at the air-sea interface on the occurrence of intense events (storms, floods, etc.) in the coastal environment. To this end, knowledge is needed on sea-spray emission rates and the atmospheric dynamics of the corresponding particles. Our aim is to implement the mesoscale model MESO-NH over the study area using an accurate sea-spray source function to estimate heat fluxes and the impact on precipitation. Based on an original and complete sea-spray source function, which covers a large size spectrum since it takes into consideration the sea spray produced by both the bubble-bursting and surface-tearing processes, we propose a comparison between model simulations and experimental data obtained during an oceanic scientific cruise on board the navy ship Atalante. The results show the relevance of the sea-spray flux calculations as well as their impact on the heat fluxes and aerosol optical depth (AOD).

Keywords: atmospheric models, sea-spray source, sea-spray dynamics, aerosols

Procedia PDF Downloads 142
14759 The Intention to Use Telecare among People with Fall Experience: Application of Fuzzy Neural Network

Authors: Jui-Chen Huang, Shou-Hsiung Cheng

Abstract:

This study examined the willingness to use telecare among people in Taiwan who had experienced a fall in the previous three months. The study adopted convenience sampling and a structured questionnaire to collect data, based on the definitions and constructs of the Health Belief Model (HBM). HBM comprises seven constructs: perceived benefits (PB), perceived disease threat (PDT), perceived barriers to taking action (PBTA), external cues to action (ECUE), internal cues to action (ICUE), attitude toward using (ATT), and behavioral intention to use (BI). This study adopted a fuzzy neural network (FNN) to put forward an effective method, modeling the dependence of ATT on PB, PDT, PBTA, ECUE, and ICUE. The training and testing data RMSE (root mean square error) are 0.028 and 0.166 in the FNN, respectively, versus 0.828 and 0.578 in the regression model. On the other hand, for the dependence of BI on ATT, the training and testing data RMSE in the FNN are 0.050 and 0.109, respectively, versus 0.529 and 0.571 in the regression model. The results show that the FNN method outperforms regression analysis and is an effective and viable approach.

Keywords: fall, fuzzy neural network, health belief model, telecare, willingness

Procedia PDF Downloads 190
14758 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models

Authors: Tony Mann

Abstract:

This paper sets out the case for involving and engaging employees, workers, stakeholders, and staff in any significant change that is being considered by the senior executives of the organization. It establishes the rationale, the approach, the methodology of engagement, and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people has better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular (the Process Iceberg® Organizational Change model) that has proven instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how we can be more productive in managing change. ‘Participation’ in change is too often seen as negative, expensive, and unwieldy. The paper aims to show that another model, UIA = O + E, can offset the difficulties and, in fact, produce much more positive and effective change.

Keywords: facilitation, stakeholders, buy-in, digital workshops

Procedia PDF Downloads 97
14757 Investigation of a Technology-Enabled Model of Home Care: The eShift Model of Palliative Care

Authors: L. Donelle, S. Regan, R. Booth, M. Kerr, J. McMurray, D. Fitzsimmons

Abstract:

Palliative home health care provision within the Canadian context is challenged by: (i) a shortage of registered nurses (RNs) and RNs with palliative care expertise, (ii) an aging population, (iii) reliance on unpaid family caregivers to sustain home care services, with limited support for them to conduct this ‘care work’, (iv) a model of healthcare that assumes client self-care, and (v) competing economic priorities. In response, an interprofessional team of service provider organizations, a software/technology provider, and health care providers developed and implemented a technology-enabled model of home care, the eShift model of palliative home care (eShift). The eShift model combines communication and documentation technology with non-traditional utilization of health human resources to meet patient needs for palliative care in the home. The purpose of this study was to investigate the structure, processes, and outcomes of the eShift model of care. Methodology: Guided by Donabedian’s evaluation framework for health care, this qualitative-descriptive study investigated the structure, processes, and outcomes of care of the eShift model of palliative home care. Interviews and focus groups were conducted with health care providers (n=45), decision-makers (n=13), technology providers (n=3), and family caregivers (n=8). Interviews were recorded, transcribed, and a deductive analysis of transcripts was conducted. Study Findings: (1) Structure: The eShift model consists of a remotely situated RN using technology to direct care provision virtually to patients in their homes. The remote RN is connected virtually to a health technician (an unregulated care provider) in the patient’s home using real-time communication. The health technician uses a smartphone modified with the eShift application and communicates with the RN, who uses a computer with the eShift application/dashboard. Documentation and communication about patient observations and care activities occur in the eShift portal.
The RN is typically accountable for four to six health technicians and patients over an 8-hour shift. The technology provider was identified as an important member of the healthcare team. Other members of the team include family members, care coordinators, nurse practitioners, physicians, and allied health professionals. (2) Processes: Conventionally, patient needs are the focus of care; however, within eShift, the patient and the family caregiver were the focus of care. Enhanced medication administration was seen as one of the most important processes, and family caregivers reported high satisfaction with the care provided. There was perceived enhanced teamwork among health care providers. (3) Outcomes: Patients were able to die at home. The eShift model enabled consistency and continuity of care, effective management of patient symptoms, and caregiver respite. Conclusion: More than a technology solution, the eShift model of care was viewed as transforming home care practice and as an innovative way to resolve the shortage of palliative care nurses within home care.

Keywords: palliative home care, health information technology, patient-centred care, interprofessional health care team

Procedia PDF Downloads 410
14756 Multi-Objective Simulated Annealing Algorithms for Scheduling Just-In-Time Assembly Lines

Authors: Ghorbanali Mohammadi

Abstract:

New approaches to sequencing mixed-model manufacturing systems are presented. These approaches have attracted considerable attention due to their potential to deal with difficult optimization problems. This paper presents Multi-Objective Simulated Annealing Algorithm (MOSAA) approaches to the Just-In-Time (JIT) sequencing problem, where workload smoothing (WL) and the number of set-ups (St) are to be optimized simultaneously. Mixed-model assembly lines are production lines where a variety of product models, similar in product characteristics, are assembled; this type of problem is NP-hard. Two annealing methods are proposed to solve the multi-objective problem and find an efficient frontier of all design configurations. The performance of the two methods is tested on several problems from the literature. Experimentation demonstrates the relatively desirable performance of the presented methodology.
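A minimal sketch of the annealing idea follows, assuming a weighted-sum scalarization of the two objectives (the paper itself builds an efficient frontier rather than a single weighted sum); the smoothing measure, demands, and annealing parameters are illustrative.

```python
import math
import random

def setups(seq):
    """Number of set-ups: adjacent positions assembling different models."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a != b)

def smoothing(seq):
    """Workload-smoothing measure: squared deviation between cumulative
    production of each model and its ideal constant usage rate."""
    demand = {m: seq.count(m) for m in set(seq)}
    total, dev, produced = len(seq), 0.0, {m: 0 for m in demand}
    for k, m in enumerate(seq, start=1):
        produced[m] += 1
        for mm, d in demand.items():
            dev += (produced[mm] - k * d / total) ** 2
    return dev

def anneal(seq, w=(1.0, 1.0), temp=5.0, cooling=0.995, steps=4000, seed=0):
    """Simulated annealing over pairwise swaps, minimising a weighted sum
    of the two JIT objectives."""
    rng = random.Random(seed)
    cost = lambda s: w[0] * smoothing(s) + w[1] * setups(s)
    best = cur = seq[:]
    for _ in range(steps):
        i, j = rng.randrange(len(cur)), rng.randrange(len(cur))
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        delta = cost(cand) - cost(cur)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cur = cand
            if cost(cur) < cost(best):
                best = cur[:]
        temp *= cooling
    return best

start = ["A"] * 3 + ["B"] * 3   # demand: 3 units each of models A and B
result = anneal(start)
```

Sweeping the weights w traces out trade-off points between smoothing and set-ups, which is one simple route toward the efficient frontier the abstract describes.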

Keywords: scheduling, just-in-time, mixed-model assembly line, sequencing, simulated annealing

Procedia PDF Downloads 122
14755 A Sharp Interface Model for Simulating Seawater Intrusion in the Coastal Aquifer of Wadi Nador (Algeria)

Authors: Abdelkader Hachemi, Boualem Remini

Abstract:

Seawater intrusion is a significant challenge faced by coastal aquifers in the Mediterranean basin. This study aims to determine the position of the sharp interface between seawater and freshwater in the aquifer of Wadi Nador, located in the Wilaya of Tipaza, Algeria. A numerical areal sharp interface model using the finite element method is developed to investigate the spatial and temporal behavior of seawater intrusion. The aquifer is assumed to be homogeneous and isotropic. The simulation results are compared with geophysical prospection data obtained through electrical methods in 2011 to validate the model. The simulation results demonstrate a good agreement with the geophysical prospection data, confirming the accuracy of the sharp interface model. The position of the sharp interface in the aquifer is found to be approximately 1617 meters from the sea. Two scenarios are proposed to predict the interface position for the year 2024: one without pumping and the other with pumping. The results indicate a noticeable retreat of the sharp interface position in the first scenario, while a slight decline is observed in the second scenario. The findings of this study provide valuable insights into the dynamics of seawater intrusion in the Wadi Nador aquifer. The predicted changes in the sharp interface position highlight the potential impact of pumping activities on the aquifer's vulnerability to seawater intrusion. This study emphasizes the importance of implementing measures to manage and mitigate seawater intrusion in coastal aquifers. The sharp interface model developed in this research can serve as a valuable tool for assessing and monitoring the vulnerability of aquifers to seawater intrusion.
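The finite element model itself is not reproduced here, but the classical Ghyben-Herzberg relation that underlies sharp-interface approximations illustrates the fresh/salt balance: the interface depth below sea level is roughly 40 times the freshwater head. The density values below are standard representative figures, not parameters from the Wadi Nador study.

```python
def ghyben_herzberg_depth(h_f, rho_f=1000.0, rho_s=1025.0):
    """Depth of the sharp fresh/salt interface below sea level (m) for a
    freshwater head h_f (m) above sea level, using the classical
    Ghyben-Herzberg sharp-interface relation z = rho_f/(rho_s - rho_f) * h_f."""
    return rho_f / (rho_s - rho_f) * h_f

# A 0.5 m freshwater head implies an interface about 20 m below sea level.
depth = ghyben_herzberg_depth(0.5)
```

The relation also makes the pumping scenario intuitive: lowering the freshwater head by pumping raises the interface roughly 40 times as much, which is why the second scenario shows less retreat of the interface.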

Keywords: seawater intrusion, sharp interface, coastal aquifer, Algeria

Procedia PDF Downloads 110
14754 Molecular Communication Noise Effect Analysis of Diffusion-Based Channel for Considering Minimum-Shift Keying and Molecular Shift Keying Modulations

Authors: A. Azari, S. S. K. Seyyedi

Abstract:

One of the unaddressed and open challenges in nano-networking is the characterization of noise. Previous analyses, however, have concentrated on an end-to-end communication model with no separate modeling of the propagation channel and the noise. By considering separate signal propagation and noise models, the design and implementation of an optimum receiver become much easier. In this paper, we justify the consideration of a separate additive Gaussian noise model for a nano-communication system based on the molecular communication channel, applicable to minimum-shift keying (MSK) and molecular shift keying (MOSK) modulation schemes. The presented noise analysis is based on the Brownian motion process and advection molecular statistics, where the received random signal has a probability density function whose mean is equal to the mean number of received molecules. Finally, a justification that the received signal magnitude is uncorrelated with the additive non-stationary white noise is provided.
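The Brownian-motion-with-advection statistics mentioned above can be illustrated with a Monte-Carlo sketch: molecule positions at time t are Gaussian with mean v·t and variance 2Dt, so the number of molecules landing in a receiver window fluctuates around its mean, which is the quantity the noise model is built on. All parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def received_count(n_tx, dist, D, v, t, rx_width, trials):
    """Monte-Carlo sketch of a 1D diffusion channel with advection: count
    molecules landing inside a receiver window [dist, dist + rx_width] at
    time t, repeated over independent trials."""
    counts = []
    for _ in range(trials):
        # Position of each molecule: drift v*t plus diffusion N(0, 2*D*t).
        pos = v * t + np.sqrt(2 * D * t) * rng.standard_normal(n_tx)
        counts.append(int(np.sum((pos >= dist) & (pos <= dist + rx_width))))
    return np.array(counts)

counts = received_count(n_tx=1000, dist=9.0, D=1.0, v=1.0, t=10.0,
                        rx_width=2.0, trials=200)
mean_rx = counts.mean()
```

The trial-to-trial spread of `counts` around `mean_rx` is the random fluctuation that a separate additive noise model, as proposed in the paper, sets out to characterise.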

Keywords: molecular, noise, diffusion, channel

Procedia PDF Downloads 273
14753 Decoding Socio-Cultural Trends in Indian Urban Youth Using Ogilvy 3E Model

Authors: Falguni Vasavada, Pradyumna Malladi

Abstract:

The research focuses on studying the ecosystem of youth using Ogilvy's 3E model, ethnography, and thematic analysis. It has been found that urban Indian youth today are an honest generation: hungry for success, living life in the moment, fiercely independent, open about sex and sexuality, and embracing of individual differences. Technology and social media dominate their lives. However, they are also phobic about commitments, often drift through life, and engage in unsubstantiated brave talk.

Keywords: ethnography, youth, culture, track, buyer behavior

Procedia PDF Downloads 353
14752 Estimation of Optimum Parameters of Non-Linear Muskingum Model of Routing Using Imperialist Competition Algorithm (ICA)

Authors: Davood Rajabi, Mojgan Yazdani

Abstract:

The non-linear Muskingum model is an efficient method for flood routing; however, the efficiency of this method is influenced by its three applied parameters. Therefore, this study assessed the efficiency of the Imperialist Competition Algorithm (ICA) in evaluating the optimum parameters of the non-linear Muskingum model. In addition to ICA, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were also used, to provide an available criterion against which to judge ICA. In this regard, ICA was applied to the Wilson flood routing; then, the routing of two flood events of the DoAab Samsami River was investigated. For the Wilson flood, the target function was the sum of squared deviations (SSQ) of observed and calculated discharges. In routing the two other floods, in addition to SSQ, another target function was also considered: the sum of absolute deviations (SAD) of observed and calculated discharges. For the first flood, GA indicated the best performance based on SSQ, whereas ICA ranked first based on SAD. For the second flood, based on both target functions, ICA indicated a better operation. According to the obtained results, it can be said that ICA can be used as an appropriate method to evaluate the parameters of the non-linear Muskingum model.
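A minimal sketch of the routing and the SSQ target function follows, assuming the common non-linear storage relation S = K[xI + (1-x)O]^m with a simple explicit time-stepping scheme and an illustrative triangular inflow hydrograph (not the Wilson or DoAab Samsami data). An optimizer such as ICA would search over (K, x, m) to minimise the SSQ against observed outflows.

```python
def route_muskingum(inflow, K, x, m, dt=1.0):
    """Non-linear Muskingum routing with S = K * [x*I + (1-x)*O]**m;
    returns the computed outflow hydrograph (explicit scheme, sketch only)."""
    outflow = [inflow[0]]                 # assume an initial steady state
    S = K * inflow[0] ** m                # initial storage from that state
    for t in range(1, len(inflow)):
        # Invert the storage relation for the current outflow.
        O = ((S / K) ** (1.0 / m) - x * inflow[t]) / (1.0 - x)
        O = max(O, 0.0)
        S += dt * (inflow[t] - O)         # continuity: dS/dt = I - O
        outflow.append(O)
    return outflow

def ssq(observed, computed):
    """Sum of squared deviations, the first target function in the abstract."""
    return sum((o - c) ** 2 for o, c in zip(observed, computed))

inflow = [10, 20, 40, 60, 50, 35, 25, 18, 14, 12, 10]   # illustrative only
computed = route_muskingum(inflow, K=1.5, x=0.2, m=1.2)
```

The routed hydrograph shows the expected attenuation and lag of the flood peak; an ICA, GA, or PSO run would repeatedly call this routine with candidate parameter sets and keep the set with the lowest SSQ (or SAD).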

Keywords: Doab Samsami river, genetic algorithm, imperialist competition algorithm, meta-exploratory algorithms, particle swarm optimization, Wilson flood

Procedia PDF Downloads 497
14751 Modeling of Electrokinetic Mixing in Lab on Chip Microfluidic Devices

Authors: Virendra J. Majarikar, Harikrishnan N. Unni

Abstract:

This paper sets out to demonstrate the modeling of electrokinetic mixing employing stationary and time-dependent electroosmotic flow in a microchannel, using alternate zeta patches on the lower surface of the micromixer in a lab-on-chip microfluidic device. Electroosmotic flow is amplified using different 2D and 3D model designs with alternate and geometric zeta potential values of 25, 50, and 100 mV, respectively, to achieve high-concentration mixing in the electrokinetically driven microfluidic system. The enhancement of electrokinetic mixing is studied using finite element modeling, and the simulation workflow is accomplished with defined integral steps. It can be observed that the presence of alternate zeta patches helps induce microvortex flows inside the channel, which in turn improve mixing efficiency. Fluid flow and concentration fields are simulated by solving the Navier-Stokes equation (applying the Helmholtz-Smoluchowski slip velocity boundary condition) and the convection-diffusion equation. The effects of the magnitude of the zeta potential, the number of alternate zeta patches, etc., are analysed thoroughly. The 2D simulation reveals a cumulative increase in concentration mixing, whereas the 3D simulation differs slightly from the 2D model at low zeta potential within the T-shaped micromixer, for inlet concentrations of 1 mol/m3 and 0 mol/m3, respectively. Moreover, the 2D model results were compared with those of the 3D model to indicate the importance of the 3D model in a microfluidic design process.
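The Helmholtz-Smoluchowski boundary condition named above fixes the wall slip velocity as u = -εζE/μ. A quick calculation with representative values for water (illustrative figures, not the paper's geometry or field strength) shows the linear scaling with the zeta patch value that the 25/50/100 mV designs exploit.

```python
# Representative values for an aqueous microchannel (illustrative only).
eps0 = 8.854e-12      # vacuum permittivity, F/m
eps_r = 80.0          # relative permittivity of water
mu = 1.0e-3           # dynamic viscosity of water, Pa*s
E = 1.0e4             # applied electric field, V/m

def slip_velocity(zeta):
    """Helmholtz-Smoluchowski electroosmotic slip velocity u = -eps*zeta*E/mu,
    the wall boundary condition named in the abstract (zeta in volts)."""
    return -eps0 * eps_r * zeta * E / mu

# Doubling the zeta patch from 50 mV to 100 mV doubles the slip velocity.
u50 = slip_velocity(50e-3)
u100 = slip_velocity(100e-3)
```

Because the slip velocity is linear in ζ, alternating the sign or magnitude of the zeta patches along the wall produces the opposing slip directions that drive the microvortices described in the abstract.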

Keywords: COMSOL Multiphysics®, electrokinetic, electroosmotic, microfluidics, zeta potential

Procedia PDF Downloads 239
14750 Therapy Finding and Perspectives on Limbic Resonance in Gifted Adults

Authors: Andreas Aceranti, Riccardo Dossena, Marco Colorato, Simonetta Vernocchi

Abstract:

By the term “limbic resonance,” we usually refer to a state of deep connection, both emotional and physiological, between people whose limbic systems, when in resonance, are attuned to one another. Limbic resonance involves sharing not only emotions but also physiological states: people in such resonance can influence each other’s heart rate, blood pressure, and breathing. Limbic resonance is fundamental to how human beings connect and create deep bonds within a group, and it is fundamental to our social skills. A relationship between gifted, resonant subjects is perceived as safe; the relationship is experienced as an isle of serenity where it is possible to recharge, to communicate without words, to understand each other without explanations, and to strengthen the balance of each member of the group. Within the circle, self-esteem is consolidated, making it easier to face what is outside: others, and reality. The idea that gifted people who band together are unfit for the world does not correspond to the truth. The circle of people with high cognitive potential, characterized by limbic resonance, is generally experienced as a solid platform from which one can safely move away and to which one can return to recover strength. We studied 8 adults (between 21 and 47 years old), all with an IQ higher than 130. We monitored their brain-wave frequencies (alpha, beta, theta, gamma, delta) by means of a biosensing tracker, along with their physiological states (heart rate, blood pressure, breathing frequency, pO₂, pCO₂) and selected blood markers (5-HT, dopamine, catecholamines, cortisol). The subjects of the study were asked to adhere to a protocol involving bonding activities (such as team-building exercises), role plays, meditation sessions, and group therapy, all carried out together.
We observed that after about 4 months of activities, their brain-wave frequencies tended to tune to one another more and more quickly. After 9 months, the bond among them was so strong that they could “sense” each other’s inner states and sometimes also guess each other’s thoughts. According to our findings, it may be hypothesized that large synchronized bursts of cortical neurons produce not only brain waves but also electromagnetic fields that may influence the activity of cortical neurons in other people’s brains by inducing action potentials in large groups of neurons, and it is conceivable that this could transmit information such as emotions and cognitive cues to another person’s brain. We believe that upcoming research should focus on clarifying the role of brain magnetic particles in brain-to-brain communication, and that further investigations should be carried out on the presence and role of cryptochromes to evaluate their potential role in direct brain-to-brain communication.

Keywords: limbic resonance, psychotherapy, brain waves, emotion regulation, giftedness

Procedia PDF Downloads 84
14749 Competency Based Talent Acquisition: Concept, Practice, and Model, with Reference to Indian Industries

Authors: Manasi V. Shah

Abstract:

In the competitive era, organizations have joined the competency movement. They have discerned that strategically researched and defined competencies, once established, can help in achieving business goals. This research focuses on the critical elements of a competency-based talent acquisition process from a practical vantage point, drawing on significant experience in a variety of business settings. The research is exploratory and descriptive in nature, and its conduct and outcomes are framed with reference to Indian industries. It elaborates the concept, the practice, and a brief model that human resource practitioners can use for an effective talent acquisition process, one that would in turn be aligned with business performance. The research presents a prudent understanding of recruiting and selecting apt human capital that can fit a given job role, together with an action-oriented, competency-based assessment approach for measuring the probable success of a job incumbent in that role.

Keywords: competency based talent acquisition, competency model, talent acquisition concept, talent acquisition practice

Procedia PDF Downloads 306
14748 Using Social Network Analysis for Cyber Threat Intelligence

Authors: Vasileios Anastopoulos

Abstract:

Cyber threat intelligence assists organizations in understanding the threats they face and helps them make educated decisions about preparing their defenses. Sharing of threat intelligence and threat information is increasingly leveraged by organizations and enterprises, and various software solutions are already available, the open-source Malware Information Sharing Platform (MISP) being a popular one. In this work, a methodology for the production of cyber threat intelligence from the threat information stored in MISP is proposed. The methodology leverages the discipline of social network analysis and the diamond model, a model used for intrusion analysis, to produce cyber threat intelligence. Its workings are demonstrated with a case study on a production MISP instance of a real organization. The paper concludes with a discussion of the proposed methodology and possible directions for further research.
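To illustrate the kind of analysis involved (a hedged sketch, not the paper's methodology: the indicators below are fictitious, and only degree centrality is shown), threat attributes shared across MISP events can be cast as a graph whose most-connected nodes suggest pivoting infrastructure worth investigating:

```python
from collections import defaultdict

# Fictitious MISP-style indicator co-occurrences (domain, IPs, hash, email);
# each edge means two indicators appeared together in the same event.
edges = [
    ("evil.example.com", "203.0.113.7"),
    ("evil.example.com", "a1b2c3-malware-hash"),
    ("203.0.113.7", "phish@example.net"),
    ("a1b2c3-malware-hash", "203.0.113.9"),
]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

n = len(adj)
# Degree centrality: fraction of other nodes an indicator is directly linked to
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}
pivot = max(centrality, key=centrality.get)
```

In a real workflow, richer centralities (betweenness, eigenvector) and the diamond model's four features (adversary, capability, infrastructure, victim) would drive the analysis.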

Keywords: cyber threat intelligence, diamond model, malware information sharing platform, social network analysis

Procedia PDF Downloads 159
14747 Digital Employment of Disabled People: Empirical Study from Shanghai

Authors: Yan Zi, Han Xiao

Abstract:

Across the globe, ICTs are influencing employment both as an industry that creates jobs and as a tool that empowers disabled people to access new forms of work in innovative and more flexible ways. The advancements in ICT and the number of apps and solutions that support persons with physical, cognitive, and intellectual disabilities challenge traditionally biased notions and offer a pathway out of traditional sheltered workshops. As a global leader in digital technology innovation, China is arguably a leader in the use of digital technology as a 'lever' for ending the economic and social marginalization of the disabled. This study investigates factors that influence the adoption and use of employment-oriented ICT applications among disabled people in China and seeks to integrate three theoretical approaches: the technology acceptance model (TAM), the uses and gratifications (U&G) approach, and the social model of disability. To that end, the study used data from a self-reported survey of 214 disabled adults who have been involved in two top-down 'Internet + employment' programs promoted by the local disabled persons’ federation in Shanghai. A structural equation model employed in the study demonstrates that the use of employment-oriented ICT applications is affected by the demographic factors of gender, category of disability, education, and marital status. The organizational support of local social organizations demonstrates significant effects on the motivations of disabled people. Results from the focus group interviews particularly suggested that, to maximize the positive impact of ICTs on employment, there is a significant need to build stakeholder capacity on how ICTs can benefit persons with disabilities.

Keywords: disabled people, ICTs, technology acceptance model, uses and gratifications, the social model of disability

Procedia PDF Downloads 102
14746 Combined Analysis of m⁶A and m⁵C Modulators on the Prognosis of Hepatocellular Carcinoma

Authors: Hongmeng Su, Luyu Zhao, Yanyan Qian, Hong Fan

Abstract:

Aim: Hepatocellular carcinoma (HCC) is one of the most common malignant tumors and a serious threat to human health. RNA methylation, especially N6-methyladenosine (m⁶A) and 5-methylcytosine (m⁵C), is a crucial epigenetic transcriptional regulatory mechanism that plays an important role in tumorigenesis, progression, and prognosis. This research aims to systematically evaluate the prognostic value of m⁶A and m⁵C modulators in HCC patients. Methods: Twenty-four m⁶A and m⁵C modulators were candidates for analysis of their expression levels and their contribution to predicting the prognosis of HCC. Consensus clustering analysis was applied to classify HCC patients. Cox and LASSO regression were used to construct the risk model. According to the risk score, HCC patients were divided into high-risk and low/medium-risk groups. The clinicopathological factors of HCC patients were analyzed by univariate and multivariate Cox regression analysis. Results: The HCC patients were classified into 2 clusters with significant differences in overall survival and clinical characteristics. A nine-gene risk model was constructed comprising METTL3, VIRMA, YTHDF1, YTHDF2, NOP2, NSUN4, NSUN5, DNMT3A, and ALYREF. The risk score was shown to serve as an independent prognostic factor for patients with HCC. Conclusion: This study constructed a nine-gene risk model from m⁶A and m⁵C modulators and investigated its effect on the clinical prognosis of HCC. This model may provide important input for the therapeutic strategy and prognostic evaluation of patients with HCC.
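As a hedged illustration of how such a risk score is typically applied (the coefficients below are invented for demonstration; the real ones come from the LASSO-penalized Cox fit, and the abstract does not state the cut point used), each patient's score is the linear predictor over the nine genes, after which patients are split into high versus low/medium risk:

```python
import numpy as np

# Hypothetical coefficients for the nine-gene signature (illustrative only)
coefs = {"METTL3": 0.21, "VIRMA": 0.15, "YTHDF1": 0.18, "YTHDF2": 0.12,
         "NOP2": 0.25, "NSUN4": 0.10, "NSUN5": 0.09, "DNMT3A": 0.14,
         "ALYREF": 0.22}

def risk_score(expr):
    """Cox linear predictor: sum of coefficient * gene expression."""
    return sum(coefs[g] * expr[g] for g in coefs)

def stratify(scores):
    """Split patients at the upper tertile (an assumed cut point) into
    'high' vs 'low/medium' risk, mirroring the grouping in the abstract."""
    cut = np.quantile(scores, 2 / 3)
    return ["high" if s > cut else "low/medium" for s in scores]
```

In the study itself, the independence of the score as a prognostic factor is then tested with multivariate Cox regression against clinical covariates.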

Keywords: hepatocellular carcinoma, m⁶A, m⁵C, prognosis, RNA methylation

Procedia PDF Downloads 63
14745 Model Predictive Control Applied to Thermal Regulation of Thermoforming Process Based on the Armax Linear Model and a Quadratic Criterion Formulation

Authors: Moaine Jebara, Lionel Boillereaux, Sofiane Belhabib, Michel Havet, Alain Sarda, Pierre Mousseau, Rémi Deterre

Abstract:

Energy consumption efficiency is a major concern for material processing industries such as thermoforming and molding. These systems should deliver the right amount of energy at the right time to the processed material. Recent technical developments, as well as the particular dynamics of heating systems, have made Model Predictive Control (MPC) one of the best candidates for thermal control of several production processes, molding and composite thermoforming to name a few. The main principle of this technique is to use a dynamic model of the process inside the controller in real time, in order to anticipate the future behavior of the process; this allows the current timeslot to be optimized while taking future timeslots into account. This study presents a procedure based on predictive control that balances optimality, simplicity, and flexibility of implementation. The approach is developed progressively, starting from the case of a single zone before its extension to the multizone and/or multisource case, thus taking into account the thermal couplings between adjacent zones. After a quadratic formulation of the MPC criterion to ensure thermal control, the linear expression is retained in order to reduce calculation time, thanks to the use of ARMAX linear decomposition methods. The effectiveness of this approach is illustrated by experiment and simulation.
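To make the principle concrete, here is a deliberately minimal sketch with invented numbers (scalar first-order model and a one-step horizon, whereas the actual work uses a multizone ARMAX model and a longer horizon): minimizing the quadratic criterion (y[k+1] - r)² + λ·u² for the model y[k+1] = a·y[k] + b·u[k] gives a closed-form control law at every step.

```python
def mpc_step(y, r, a, b, lam):
    """One-step quadratic MPC for y[k+1] = a*y[k] + b*u[k]:
    argmin_u (a*y + b*u - r)**2 + lam*u**2 has this closed form."""
    return b * (r - a * y) / (b * b + lam)

# Hypothetical heating-zone model and setpoint (illustrative values)
a, b, lam = 0.9, 0.5, 0.01
r, y = 180.0, 20.0          # setpoint and initial temperature, degC
for _ in range(200):
    u = mpc_step(y, r, a, b, lam)   # optimize current timeslot
    y = a * y + b * u               # plant evolves; horizon recedes
```

The small steady-state offset from the setpoint is the price of the λ·u² energy penalty, which is exactly the optimality/efficiency trade-off the criterion encodes.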

Keywords: energy efficiency, linear decomposition methods, model predictive control, mold heating systems

Procedia PDF Downloads 264
14744 Model-Independent Price Bounds for the Swiss Re Mortality Bond 2003

Authors: Raj Kumari Bahl, Sotirios Sabanis

Abstract:

In this paper, we are concerned with the valuation of the first catastrophic mortality bond launched in the market, namely the Swiss Re Mortality Bond 2003. This bond encapsulates the behavior of a well-defined mortality index to generate payoffs for bondholders, and pricing it is a challenging task. We express the payoff of the bond's terminal principal in terms of the payoff of an Asian put option and present an approach to deriving model-independent bounds exploiting comonotonicity theory. We invoke Jensen’s inequality for the computation of lower bounds and employ a Lagrange optimization technique to achieve the upper bound. The success of these bounds rests on the availability of compatible European mortality options in the market. We carry out Monte Carlo simulations to estimate the bond price and illustrate the strength of these bounds across a variety of models. The fact that our bounds are model-independent is a crucial breakthrough in the pricing of catastrophic mortality bonds.
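The mechanics of the Jensen lower bound can be sketched as follows (a toy example under geometric Brownian motion with invented parameters, not the mortality-index dynamics of the paper): since the put payoff is convex, E[(K - Ā)⁺] ≥ (K - E[Ā])⁺, where Ā is the arithmetic average of the index over the monitoring dates.

```python
import numpy as np

# Toy parameters (hypothetical; the bond's index is not a GBM stock price)
S0, K, r, sigma, T, n = 100.0, 110.0, 0.02, 0.2, 3.0, 3
dt = T / n
rng = np.random.default_rng(0)

# Monte Carlo price of the Asian put on the arithmetic average
z = rng.standard_normal((200_000, n))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
avg = (S0 * np.exp(log_paths)).mean(axis=1)
mc_price = np.exp(-r * T) * np.maximum(K - avg, 0.0).mean()

# Jensen lower bound: the put evaluated at the expected average
expected_avg = S0 * np.mean(np.exp(r * dt * np.arange(1, n + 1)))
jensen_lb = np.exp(-r * T) * max(K - expected_avg, 0.0)
```

The bound holds whatever model generates the paths; that distribution-free character is what "model-independent" refers to above.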

Keywords: mortality bond, Swiss Re Bond, mortality index, comonotonicity

Procedia PDF Downloads 244
14743 Application of Numerical Modeling and Field Investigations for Groundwater Recharge Characterization at Abydos Archeological Site, Sohag, Egypt

Authors: Sherif A. Abu El-Magd, Ahmed M. Sefelnasr, Ahmed M. Masoud

Abstract:

Groundwater modeling is the principal tool for assessing and managing groundwater resources efficiently. The present work was carried out at the ancient Egyptian archaeological site of Abydos, dating from Dynasties I and II. The area is located about 13 km west of the River Nile course, in Upper Egypt. The main problem in this context is that rising groundwater threatens and damages the fragile carvings and paintings of the ancient buildings. The main objectives of the present work are to identify the sources of groundwater recharge at the site and, equally important, to control the rise of the water table. Numerical modeling combined with field water-level measurements was implemented to understand the groundwater recharge sources. Building a conceptual model was an important step in the groundwater modeling phase in order to satisfy the modeling objectives; therefore, boreholes, cross-sections, and a high-resolution digital elevation model were used to construct the conceptual model. To understand the hydrological system at the site, the model was run under both steady-state and transient conditions and was then calibrated against the observed water-level measurements. The modeling results indicated that the groundwater recharge originates from an indirect flow path, mainly from the southeast, and that there is a hydraulic connection between surface water and groundwater at the study site. Decision-makers and archaeologists could draw on the present work to understand the behavior of groundwater recharge and water-table rise.
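In miniature, the cell-by-cell head calculation that a numerical groundwater model performs can be sketched with a one-dimensional steady-state finite-difference model (all values below are invented for illustration, not site data): T·h'' = -R between two fixed-head boundaries.

```python
import numpy as np

# Minimal 1D finite-difference sketch of steady confined flow with recharge
n, L = 51, 1000.0            # nodes, domain length (m) — illustrative values
dx = L / (n - 1)
T = 500.0                    # transmissivity, m^2/day (assumed)
R = 0.002                    # distributed recharge, m/day (assumed)
h_left, h_right = 52.0, 50.0 # fixed heads at the boundaries, m

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0            # Dirichlet boundary rows
b[0], b[-1] = h_left, h_right
for i in range(1, n - 1):            # interior rows: T*(h[i-1]-2h[i]+h[i+1])/dx^2 = -R
    A[i, i - 1] = A[i, i + 1] = T / dx**2
    A[i, i] = -2 * T / dx**2
    b[i] = -R
h = np.linalg.solve(A, b)
```

The analytic solution is h(x) = h_left + (h_right - h_left)·x/L + (R/2T)·x·(L - x), so the recharge mounds the head above the straight line between the boundary values — the same mechanism by which recharge raises the water table at a real site, here in one dimension.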

Keywords: numerical modeling, archaeological site, groundwater recharge, Egypt

Procedia PDF Downloads 117
14742 The Influence of Gilles Deleuze and Felix Guattari's Thoughts and Ideas on Post-Modern Architecture

Authors: A. Nabi, S. Panahi

Abstract:

In recent years, due to countless changes in the world and in the various sciences, architecture has faced new approaches and different concepts more than at any other time. The direct influence of philosophy on architecture is one of the features of contemporary architecture, and linking these two disciplines directly requires deep reflection. Gilles Deleuze and Félix Guattari are among those who, by introducing new concepts, greatly influenced the thinking of later architects and artists. If we focus on the works of these two thinkers, which resemble anti-Platonism and subvert Western philosophy, we can extract concepts whose influence on art and architecture is visible. Using content analysis, this study concludes that the ideas of Deleuze and Guattari have influenced contemporary architecture.

Keywords: Gilles Deleuze, Felix Guattari, anti-platonism, post-modern architecture, folding

Procedia PDF Downloads 195
14741 Modeling Study of Short Fiber Orientation in Simple Injection Molding Processes

Authors: Ihsane Modhaffar, Kamal Gueraoui, Abouelkacem Qais, Abderrahmane Maaouni, Samir Men-La-Yakhaf, Hamid Eltourroug

Abstract:

The main objective of this paper is to develop a computational fluid dynamics (CFD) model to simulate and characterize the flow of fiber suspensions in rectangular cavities. The model is intended to describe the velocity profile and to predict the fiber orientation. The flow was considered incompressible and was assumed to behave as a Newtonian fluid containing suspensions of short fibers. The numerical model for determining the velocity profile and fiber orientation during the mold-filling stage of the injection molding process was solved using the finite volume method. The governing equations of this problem are continuity, momentum, and energy. The results obtained were compared with available experimental findings, and good agreement between the numerical results and the experimental data was achieved.
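One standard building block for the orientation part of such models (noted here as a hedged aside; the abstract does not state which orientation model the authors use) is Jeffery's equation for a single rigid fiber. In simple shear with rate G, the in-plane angle φ of a fiber of aspect ratio re evolves as dφ/dt = G(re²·cos²φ + sin²φ)/(re² + 1), and the fiber tumbles with the analytic period T = (2π/G)(re + 1/re), which even a naive integration reproduces:

```python
import math

def jeffery_angle_rate(phi, G, re):
    """Jeffery's equation (2D, in-plane) for a rigid ellipsoidal fiber
    of aspect ratio re suspended in simple shear with rate G."""
    return G * (re**2 * math.cos(phi)**2 + math.sin(phi)**2) / (re**2 + 1)

def tumbling_period(G=1.0, re=5.0, dt=1e-4):
    """Integrate (forward Euler) until the fiber completes one rotation."""
    phi, t = 0.0, 0.0
    while phi < 2 * math.pi:
        phi += dt * jeffery_angle_rate(phi, G, re)
        t += dt
    return t
```

Because dφ/dt is smallest near φ = π/2, the fiber spends most of each period nearly aligned with the flow, which is why shear-dominated filling flows tend to align short fibers.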

Keywords: injection, composites, short-fiber reinforced thermoplastics, fiber orientation, incompressible fluid, numerical simulation

Procedia PDF Downloads 462
14740 Vulnerability Assessment of Groundwater Quality Deterioration Using PMWIN Model

Authors: A. Shakoor, M. Arshad

Abstract:

The utilization of groundwater resources for irrigation has increased significantly during the last two decades due to constrained canal water supplies. More than 70% of the farmers in Punjab, Pakistan, depend directly or indirectly on groundwater to meet their crop water demands, and this unchecked paradigm shift has resulted in aquifer depletion and deterioration. Therefore, comprehensive research was carried out in central Punjab, Pakistan, on the spatiotemporal variation in groundwater level and quality. Processing MODFLOW for Windows (PMWIN) and MT3D (a solute transport model) were used for existing conditions and for prediction of groundwater level and quality up to 2030. A comprehensive data set of aquifer lithology, canal network, groundwater level, groundwater salinity, evapotranspiration, groundwater abstraction, recharge, etc., was used in the PMWIN model development. The model was successfully calibrated and validated with respect to groundwater level for the periods 2003 to 2007 and 2008 to 2012, respectively. The coefficient of determination (R²) and model efficiency (MEF) for the calibration and validation periods were calculated as 0.89 and 0.98, respectively, indicating a high level of correlation between the calculated and measured data. For the solute transport model (MT3D), values of advection and dispersion parameters were used. The model was run for a future scenario up to 2030, assuming no substantial change in climate and a gradually increasing groundwater abstraction rate. The model predicted that the groundwater level would decline by 0.0131 to 1.68 m/year during 2013 to 2030, with the maximum decline on the lower side of the study area, where the canal system infrastructure is sparse. This lowering of the groundwater level would likely increase tubewell installation and pumping costs.
Similarly, the predicted total dissolved solids (TDS) of the groundwater would increase by 6.88 to 69.88 mg/L/year during 2013 to 2030, with the maximum increase again on the lower side. It was found that by 2030, good-quality water would decrease by 21.4%, while marginal- and hazardous-quality water would increase by 19.28% and 2%, respectively. The simulated results showed that the salinity of the study area had increased due to the intrusion of salts. This deterioration of groundwater quality would cause soil salinity and ultimately a reduction in crop productivity. It was concluded from the predicted results that groundwater quality deteriorates as the water table declines, i.e., TDS increases with declining groundwater level. It is recommended that agronomic and engineering practices, i.e., land leveling, rainwater harvesting, skimming wells, ASR (aquifer storage and recovery) wells, etc., be integrated to improve groundwater management for higher crop production in salt-affected soils.
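For reference, the model-efficiency criterion (MEF, the Nash-Sutcliffe coefficient) quoted above compares the residual error of the model with the variance of the observations; a minimal sketch with toy head values (invented, not the study's data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model efficiency: 1 means a perfect fit,
    0 means no better than predicting the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy example: observed vs simulated water-table depths (m, illustrative)
mef = nash_sutcliffe([5.1, 5.4, 5.9, 6.3], [5.0, 5.5, 5.8, 6.4])
```

Values near 1, such as the 0.98 reported for the validation period, indicate the simulated heads track the measured ones closely.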

Keywords: groundwater quality, groundwater management, PMWIN, MT3D model

Procedia PDF Downloads 372
14739 Removal of Cr⁶⁺, Co²⁺ and Ni²⁺ Ions from Aqueous Solutions by Algerian Enteromorpha compressa (L.) Biomass

Authors: Asma Aid, Samira Amokrane, Djamel Nibou, Hadj Mekatel

Abstract:

Marine Enteromorpha compressa (L.) (ECL) biomass was used as a low-cost biological adsorbent for the removal of Cr⁶⁺, Co²⁺, and Ni²⁺ ions from artificially contaminated aqueous solutions. The operating variables studied were the pH, the initial concentration C₀, the solid/liquid ratio R, and the temperature T. A full factorial experimental design technique enabled us to obtain a mathematical model describing the adsorption of Cr⁶⁺, Co²⁺, and Ni²⁺ ions and to study the main effects and interactions among the operational parameters. The equilibrium isotherm was analyzed with the Langmuir, Freundlich, and Dubinin-Radushkevich models; the adsorption process was found to follow the Langmuir model for the ions studied. Kinetic studies showed that the pseudo-second-order model correlates our experimental data. Thermodynamic parameters showed an endothermic heat of adsorption and a spontaneous adsorption process for Cr⁶⁺ ions, and an exothermic heat of adsorption for Co²⁺ and Ni²⁺ ions.
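The Langmuir fit reported above is conventionally done on the linearized form Ce/qe = Ce/qm + 1/(qm·KL); as a sketch (synthetic data generated from invented parameters, not the study's measurements), the monolayer capacity qm and affinity constant KL are recovered from the slope and intercept:

```python
import numpy as np

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: uptake qe (mg/g) at equilibrium concentration Ce (mg/L)."""
    return qm * KL * Ce / (1 + KL * Ce)

# Synthetic equilibrium data from known parameters (illustrative only)
qm_true, KL_true = 40.0, 0.15               # mg/g and L/mg — hypothetical
Ce = np.array([5, 10, 20, 40, 80, 160], float)
qe = langmuir(Ce, qm_true, KL_true)

# Linearized Langmuir: Ce/qe = (1/qm)*Ce + 1/(qm*KL)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm_fit, KL_fit = 1 / slope, slope / intercept
```

With real (noisy) data, the quality of the straight-line fit of Ce/qe against Ce is itself the test of whether the Langmuir model applies, as found for the three ions here.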

Keywords: enteromorpha Compressa, adsorption process, Cr⁶⁺, Co²⁺ and Ni²⁺, equilibrium isotherm

Procedia PDF Downloads 188
14738 Location3: A Location Scouting Platform for the Support of Film and Multimedia Industries

Authors: Dimitrios Tzilopoulos, Panagiotis Symeonidis, Michael Loufakis, Dimosthenis Ioannidis, Dimitrios Tzovaras

Abstract:

The domestic film industry in Greece has traditionally relied heavily on state support. While film productions are crucial for the country's economy, Greece has not fully capitalized on attracting and promoting foreign productions. A lack of incentives, of organized state support for attraction and licensing, and of location scouting services has hindered its potential. Although recent legislative changes have addressed the first two of these issues, the development of a comprehensive location database and a search engine that would effectively support location scouting at the pre-production stage is still in its early stages. In addition to the expected benefits to the film, television, marketing, and multimedia industries, a location-scouting service platform has the potential to yield significant financial gains locally and nationally. By promoting featured places such as cultural and archaeological sites, natural monuments, and visitor attractions, it can play a vital role in both cultural promotion and tourism development. This study introduces LOCATION3, an internet platform revolutionizing film production location management. It interconnects location providers, film crews, and multimedia stakeholders, offering a comprehensive environment for seamless collaboration. The platform's central geodatabase (PostgreSQL) stores each location’s attributes, while web technologies like HTML, JavaScript, CSS, React.js, and Redux power the user-friendly interface. Advanced functionalities, utilizing deep learning models developed in Python, are integrated via Node.js. Visual data presentation is achieved using the JS Leaflet library, delivering an interactive map experience. LOCATION3 sets a new standard, offering a range of essential features to enhance the management of film production locations.
Firstly, it empowers users to effortlessly upload audiovisual material enriched with geospatial and temporal data, such as location coordinates, photographs, videos, 360-degree panoramas, and 3D location models. With the help of cutting-edge deep learning algorithms, the application automatically tags these materials, while users can also manually tag them. Moreover, the application allows users to record locations directly through its user-friendly mobile application. Users can then embark on seamless location searches, employing spatial or descriptive criteria. This intelligent search functionality considers a combination of relevant tags, dominant colors, architectural characteristics, emotional associations, and unique location traits. One of the application's standout features is the ability to explore locations by their visual similarity to other materials, facilitated by a reverse image search. Also, the interactive map serves as both a dynamic display for locations and a versatile filter, adapting to the user's preferences and effortlessly enhancing location searches. To further streamline the process, the application facilitates the creation of location lightboxes, enabling users to efficiently organize and share their content via email. Going above and beyond location management, the platform also provides invaluable liaison, matchmaking, and online marketplace services. This powerful functionality bridges the gap between visual and three-dimensional geospatial material providers, local agencies, film companies, production companies, etc. so that those interested in a specific location can access additional material beyond what is stored on the platform, as well as access production services supporting the functioning and completion of productions in a location (equipment provision, transportation, catering, accommodation, etc.).
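The reverse-image-search feature mentioned above typically reduces to nearest-neighbor search over embedding vectors produced by a deep model; a framework-free sketch (the 3-dimensional vectors below stand in for real CNN embeddings, and the function name is ours, not the platform's):

```python
import numpy as np

def cosine_top_k(query_vec, gallery, k=3):
    """Rank gallery images by cosine similarity of their embedding
    vectors to the query embedding — the core of a reverse image search."""
    q = query_vec / np.linalg.norm(query_vec)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return np.argsort(-(g @ q))[:k]

# Toy embeddings: row 0 is a near-duplicate of the query location photo
gallery = np.array([[0.9, 0.1, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.5, 0.5, 0.5]])
ranking = cosine_top_k(np.array([1.0, 0.0, 0.0]), gallery)
```

At platform scale, the same ranking would be served by an approximate-nearest-neighbor index over embeddings stored alongside the location records in the geodatabase.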

Keywords: deep learning models, film industry, geospatial data management, location scouting

Procedia PDF Downloads 62