Search results for: adaptable business models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9326

5366 Some Aspects of Improving Service Sphere Management in Georgia

Authors: Gechbaia Badri

Abstract:

This article examines ways of improving service sphere management in the Georgian context. The transition of the country's economy to market relations and the formation of a competitive, dynamic market are dictated by the times and represent an objective necessity. In recent years, rapid changes in science and education have transformed the service sector, production skills and consumption patterns, changing the sector's role in the structure of the national economy. Intellectual capital has become the main resource of the new economic system, and economic progress is significantly determined by the development of information technologies. The article investigates service problems in different fields of the national economy and offers proposals for settling them.

Keywords: service management, service, paradigm, business and management engineering

Procedia PDF Downloads 407
5365 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators

Authors: K. O'Malley

Abstract:

Background: The emergence and rapid evolution of large language models (LLMs) (i.e., models of generative artificial intelligence, or AI) has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A considered search strategy was used to interrogate the PubMed electronic database. The keywords ‘ChatGPT’ AND ‘medical education’ OR ‘medical school’ OR ‘medical licensing exam’ were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years and was limited to publications in the English language. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibited a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.

Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university

Procedia PDF Downloads 11
5364 Production of Rhamnolipids from Different Resources and Estimating the Kinetic Parameters for Bioreactor Design

Authors: Olfat A. Mohamed

Abstract:

Rhamnolipid biosurfactants have distinct properties that give them importance in many industrial applications, especially promising new applications in the cosmetic and pharmaceutical industries. These applications have encouraged the search for diverse and renewable resources to control production cost. The experimental results were then applied to find a suitable mathematical model for obtaining the design criteria of the batch bioreactor. This research aims to produce rhamnolipids from different oily wastewater sources, such as petroleum crude oil (PO) and vegetable oil (VO), using Pseudomonas aeruginosa ATCC 9027. Different concentrations of PO and VO were added to the broth medium separately, at (0.5, 1, 1.5, 2, 2.5% v/v) and (2, 4, 6, 8, 10% v/v), respectively. The effect of the initial concentration of the oil residues, and of the addition of glycerol and palmitic acid as inducers, on rhamnolipid production and on the surface tension of the broth was investigated. It was found that 2% PO and 6% VO were the best initial substrate concentrations for rhamnolipid production (2.71 and 5.01 g rhamnolipid/l, respectively). Addition of glycerol (10-20% v glycerol/v PO) to the 2% PO fermentation broth increased rhamnolipid production about 1.8-2 fold. However, the addition of palmitic acid (5 and 10 g/l) to fermentation broth containing 6% VO rarely enhanced the production rate. The experimental data for 2% initial PO were used to estimate the various kinetic parameters, with the following results: maximum reaction rate Vmax = 0.06417 g/(l·h), biomass yield per unit substrate consumed Yx/s = 0.324 g Cx/g Cs, maximum specific growth rate μmax = 0.05791 hr⁻¹, rhamnolipid yield per unit substrate consumed Yp/s = 0.2571 g Cp/g Cs, maintenance coefficient Ms = 0.002419, Michaelis-Menten constant Km = 6.1237 gmol/l, and endogenous decay coefficient Kd = 0.002375 hr⁻¹. These parameters and the corresponding mathematical models were applied to estimate the batch bioreactor time. The results were 123.37, 129 and 139.3 hours with respect to microbial biomass, substrate and product concentration, respectively, compared with an experimental batch time of 120 hours in all cases. The resulting mathematical models are consistent with the laboratory results and can therefore be considered tools for describing the actual system.
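
As a rough illustration of the batch bioreactor design step described above, the sketch below integrates an unstructured Monod-type batch model using the reported kinetic parameters. The model structure (growth-associated product formation, maintenance and decay terms) and the initial concentrations are assumptions for illustration, not the authors' equations; the reported Km is treated here as the Monod saturation constant in g/l.

```python
# Sketch: batch simulation with the kinetic parameters reported above
# (2% PO case). Model structure and initial conditions are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

mu_max = 0.05791   # 1/h, maximum specific growth rate
Ks     = 6.1237    # saturation constant (reported Km), taken as g/l here
Yxs    = 0.324     # g biomass / g substrate
Yps    = 0.2571    # g rhamnolipid / g substrate
Ms     = 0.002419  # maintenance coefficient, g substrate/(g biomass*h)
Kd     = 0.002375  # 1/h, endogenous decay coefficient

def batch(t, y):
    X, S, P = y
    S = max(S, 0.0)
    mu = mu_max * S / (Ks + S)
    dX = (mu - Kd) * X                 # growth minus endogenous decay
    dS = -(mu / Yxs) * X - Ms * X      # consumption: growth + maintenance
    dP = Yps * (mu / Yxs) * X          # growth-associated product formation
    return [dX, dS, dP]

# assumed initial biomass, substrate and product concentrations (g/l)
sol = solve_ivp(batch, (0, 120), [0.1, 20.0, 0.0], dense_output=True)
for ti in (0, 40, 80, 120):
    X, S, P = sol.sol(ti)
    print(f"t = {ti:3d} h:  X = {X:5.2f}  S = {S:5.2f}  P = {P:5.2f} g/l")
```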

Keywords: batch bioreactor design, glycerol, kinetic parameters, petroleum crude oil, Pseudomonas aeruginosa, rhamnolipids biosurfactants, vegetable oil

Procedia PDF Downloads 120
5363 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. Using advanced analytical approaches, the field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require given the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Clustering algorithms such as k-means group patients, and association rule mining identifies connections between treatments and patient responses. Time series techniques help in resource management by predicting patient admissions. These methods improve healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
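
A minimal sketch of the predictive-modeling workflow described above, using scikit-learn on synthetic EHR-style data; all feature names, values and the label rule below are invented for illustration, not taken from the project.

```python
# Sketch: disease-risk prediction with logistic regression and a random
# forest, on synthetic EHR-like data (features and labels are made up).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# hypothetical features: age, BMI, systolic BP, prior admissions
X = np.column_stack([rng.normal(55, 15, n), rng.normal(27, 5, n),
                     rng.normal(130, 20, n), rng.poisson(1.0, n)])
logit = -8 + 0.05*X[:, 0] + 0.08*X[:, 1] + 0.02*X[:, 2] + 0.4*X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LogisticRegression(max_iter=1000), RandomForestClassifier()):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, f"AUC = {auc:.3f}")
```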

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 58
5362 Kinetic Modelling of Drying Process of Jumbo Squid (Dosidicus gigas) Slices Subjected to an Osmotic Pretreatment under High Pressure

Authors: Mario Perez-Won, Roberto Lemus-Mondaca, Constanza Olivares-Rivera, Fernanda Marin-Monardez

Abstract:

This research presents the simultaneous application of high hydrostatic pressure (HHP) and osmotic dehydration (OD) as a pretreatment to hot-air drying of jumbo squid (Dosidicus gigas) cubes. The drying time was reduced to 2 hours at 60°C and 5 hours at 40°C, compared to the untreated jumbo squid samples. This reduction was due to the osmotic pressure under high-pressure treatment, where increased salt saturation caused greater water loss; the time required for convective drying was thus shortened, so the effective water diffusion in drying plays an important role in this research. Working conditions such as pressure (350-550 MPa), pressure holding time (5-10 min), salt (NaCl) concentration (10 and 15%) and drying temperature (40-60°C) were optimized according to the kinetic parameters of each mathematical model. The models used for the experimental drying curves were the Weibull, Page and Logarithmic models; the latter fitted the experimental data best. The values of effective water diffusivity varied from 4.82 to 6.59×10⁻⁹ m²/s for the 16 (OD+HHP) curves, whereas the control samples gave values of 1.76 and 5.16×10⁻⁹ m²/s at 40 and 60°C, respectively. In addition, quality characteristics such as color, texture, non-enzymatic browning, water holding capacity (WHC) and rehydration capacity (RC) were assessed. The L* (lightness) color parameter increased, whereas the b* (yellowish) and a* (reddish) parameters decreased for the OD+HHP treated samples, indicating that the treatment prevents sample browning. Texture parameters such as hardness and elasticity decreased, but chewiness increased with treatment, which resulted in a product with higher tenderness and less firmness compared to the untreated sample. Finally, the WHC and RC values of most treatments increased owing to less cellular tissue damage compared to untreated samples. Knowledge of the drying kinetics as well as the quality characteristics of dried jumbo squid samples subjected to a pretreatment of osmotic dehydration under high hydrostatic pressure is therefore extremely important at an industrial level, so that the drying process can be successful at different pretreatment conditions and/or process variables.
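
To make the model-fitting step concrete, the sketch below fits the Page and Logarithmic thin-layer models named above to an illustrative moisture-ratio series with scipy; the data points and starting guesses are invented, not the paper's measurements.

```python
# Sketch: fitting Page and Logarithmic thin-layer drying models to an
# invented moisture-ratio (MR) series.
import numpy as np
from scipy.optimize import curve_fit

t  = np.array([0, 0.5, 1, 1.5, 2, 3, 4, 5])                  # drying time, h
MR = np.array([1.0, 0.72, 0.51, 0.37, 0.26, 0.14, 0.07, 0.04])

def page(t, k, n):
    return np.exp(-k * t**n)

def logarithmic(t, a, k, c):
    return a * np.exp(-k * t) + c

for name, f, p0 in [("Page", page, (0.5, 1.0)),
                    ("Logarithmic", logarithmic, (1.0, 0.5, 0.0))]:
    popt, _ = curve_fit(f, t, MR, p0=p0)
    rmse = np.sqrt(np.mean((f(t, *popt) - MR) ** 2))
    print(name, np.round(popt, 4), f"RMSE = {rmse:.4f}")
```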

Keywords: diffusion coefficient, drying process, high pressure, jumbo squid, modelling, quality aspects

Procedia PDF Downloads 231
5361 Hydrogen Production Using an Anion-Exchange Membrane Water Electrolyzer: Mathematical and Bond Graph Modeling

Authors: Hugo Daneluzzo, Christelle Rabbat, Alan Jean-Marie

Abstract:

Water electrolysis is one of the most advanced technologies for producing hydrogen and can be easily combined with electricity from different sources. Under the influence of electric current, water molecules can be split into oxygen and hydrogen. The production of hydrogen by water electrolysis favors the integration of renewable energy sources into the energy mix by compensating for their intermittence through the storage of the energy produced when production exceeds demand and its release during off-peak production periods. Among the various electrolysis technologies, anion exchange membrane (AEM) electrolyzer cells are emerging as a reliable technology for water electrolysis. Modeling and simulation are effective tools to save time, money, and effort during the optimization of operating conditions and the investigation of the design. Modeling and simulation become even more important when dealing with multiphysics dynamic systems. One of those systems is the AEM electrolysis cell, which involves complex physico-chemical reactions. Once developed, models may be utilized to comprehend the underlying mechanisms, control the system, and detect flaws. Scientists have initiated several modeling methods, which can be separated into two main approaches, namely equation-based modeling and graph-based modeling. The former approach is less user-friendly and difficult to update, as it is based on ordinary or partial differential equations to represent the systems. The latter approach is more user-friendly and allows a clear representation of physical phenomena: the system is depicted by connecting subsystems, so-called blocks, through ports based on their physical interactions, hence being suitable for multiphysics systems. Among the graphical modeling methods, the bond graph is receiving increasing attention as being domain-independent and relying on the energy exchange between the components of the system. At present, few studies have investigated the modeling of AEM systems; a mathematical model and a bond graph model were used in previous studies to model the electrolysis cell performance. In this study, experimental data from the literature were simulated in OpenModelica using both bond graph and mathematical approaches. The polarization curves at different operating conditions obtained by both approaches were compared with experimental ones. Both models predicted the polarization curves satisfactorily, with error margins lower than 2% for the equation-based model and lower than 5% for the bond graph model. The activation polarization of the hydrogen evolution reaction (HER) and the oxygen evolution reaction (OER) was behind the voltage loss in the AEM electrolyzer, whereas ion conduction through the membrane resulted in the ohmic loss. Therefore, highly active electro-catalysts are required for both HER and OER, while high-conductivity AEMs are needed to effectively lower the ohmic losses. The bond graph simulation of the polarization curve at various operating temperatures illustrated that voltage increases with temperature, owing to the technology of the membrane. The polarization curve can thus be tested virtually, reducing the cost and time of experimental testing and improving design optimization. Further improvements can be made by implementing the bond graph model in a real power-to-gas-to-power scenario.
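
The sketch below shows a lumped polarization-curve model of the kind both approaches reproduce: cell voltage as reversible voltage plus HER/OER activation (Tafel) overpotentials plus ohmic loss. All parameter values are illustrative placeholders, not the paper's fitted values.

```python
# Sketch: V(i) = E_rev + eta_HER + eta_OER + i*R_ohm, with Tafel-type
# activation terms. Parameter values are assumed, not fitted.
import numpy as np

E_rev = 1.23                   # V, reversible voltage (approx., standard)
b_her, i0_her = 0.040, 1e-4    # V/dec, A/cm^2 - assumed HER Tafel params
b_oer, i0_oer = 0.060, 1e-6    # V/dec, A/cm^2 - assumed OER Tafel params
R_ohm = 0.15                   # ohm*cm^2, membrane + contact resistance

def cell_voltage(i):
    """Cell voltage (V) at current density i (A/cm^2)."""
    eta_her = b_her * np.log10(i / i0_her)   # HER activation overpotential
    eta_oer = b_oer * np.log10(i / i0_oer)   # OER activation overpotential
    return E_rev + eta_her + eta_oer + R_ohm * i

for i in (0.1, 0.5, 1.0, 2.0):
    print(f"i = {i:4.1f} A/cm2 -> V = {cell_voltage(i):.3f} V")
```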

Keywords: hydrogen production, anion-exchange membrane, electrolyzer, mathematical modeling, multiphysics modeling

Procedia PDF Downloads 74
5360 Satisfaction with English Language Learning through an Online System

Authors: Suwaree Yordchim

Abstract:

The objective is to study satisfaction with English language learning through an online system. The online learning system mainly consists of English lessons, exercises, tests, web boards, and supplementary lessons for language practice. The sample group comprised 80 Thai students studying English for Business Communication, majoring in Hotel and Lodging Management. The data were analyzed using means and standard deviations (S.D.) from the questionnaires. The results showed that the highest-rated academic aspects were the technological search tools in the e-learning system that support students' learning (4.51), knowledge evaluation before and after learning and teaching (4.45), and the freedom to select projects according to their interests, with subject content including practice in real situations (4.45), respectively.

Keywords: English language learning, online system, online learning, supplementary lessons

Procedia PDF Downloads 449
5359 Early Prediction of Disposable Addresses in Ethereum Blockchain

Authors: Ahmad Saleem

Abstract:

Ethereum is the second-largest cryptocurrency in the blockchain ecosystem. Along with standard transactions, it supports smart contracts and NFTs. Current research trends focus on analyzing the overall structure of the network, its growth, and its behavior. Ethereum addresses are anonymous and can be created on the fly. The nature of the Ethereum network and its addresses makes their behavior hard to predict, and the activity period of an Ethereum address has received little analysis. Using machine learning, we can make early predictions about the disposability of an address. In this paper, we analyzed the lifetime of addresses. We also identified and predicted disposable addresses using machine learning models and compared the results.
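
A minimal sketch of what early disposability prediction could look like; the feature set (first-day activity statistics), the label rule and the classifier are hypothetical stand-ins, not the paper's pipeline.

```python
# Sketch: classifying addresses as disposable from hypothetical
# early-lifetime features; data and labeling rule are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 5000
# hypothetical day-one features per address:
# tx_count, unique_counterparties, ETH received, ETH sent, contract calls
X = np.column_stack([rng.poisson(3, n), rng.poisson(2, n),
                     rng.exponential(1.0, n), rng.exponential(0.8, n),
                     rng.poisson(1, n)])
# synthetic rule: few transactions and near-complete outflow -> disposable
y = ((X[:, 0] <= 2) & (X[:, 3] >= 0.9 * X[:, 2])).astype(int)

clf = GradientBoostingClassifier()
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```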

Keywords: blockchain, Ethereum, cryptocurrency, prediction

Procedia PDF Downloads 86
5358 Bio-Hub Ecosystems: Profitability through Circularity for Sustainable Forestry, Energy, Agriculture and Aquaculture

Authors: Kimberly Samaha

Abstract:

The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding biomass as a feedstock for power plants, where the lack of an economically viable business model for bioenergy facilities has resulted in the continuation of idled and decommissioned plants. This study analyzed data and submittals to the Born Global Maine Innovation Challenge, a global innovation challenge to identify process innovations that could address a 'whole-tree' approach of maximizing the products, byproducts, energy value and process slip-streams into a circular zero-waste design. Participating companies were at various stages of developing bioproducts, including biofuels, lignin-based products, carbon capture platforms and biochar used both as a filtration medium and as a soil amendment product. This case study presents the QCA (Qualitative Comparative Analysis) methodology of the prequalification process and the resulting techno-economic model that was developed for maximizing the profitability of the Bio-Hub Ecosystem through the continuous conversion of system waste streams into valuable process inputs for co-hosts. A full site plan was developed for the integration of co-hosts (a biorefinery, land-based shrimp and salmon aquaculture farms, a tomato greenhouse and a hops farm) at an operating forestry-based biomass-to-energy plant in West Enfield, Maine, USA. This model and process for evaluating profitability not only propose models for the integration of forestry, aquaculture and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allow for the early measurement of circularity, of the impact of resource use, and of investment risk mitigation for these systems. In this particular study, profitability is assessed at two levels: CAPEX (capital expenditures) and OPEX (operating expenditures). Given that these projects start by repurposing facilities where the industrial-level infrastructure is already built, permitted and interconnected to the grid, the addition of co-hosts first realizes a dramatic reduction in permitting, development times and costs. In addition, the co-hosts use the biomass energy plant's waste streams, such as heat, hot water, CO₂ and fly ash, as valuable inputs to their operations, significantly decreasing OPEX and increasing overall profitability for each co-host's bottom line. This case study utilizes a proprietary techno-economic model to demonstrate how utilizing the waste streams of a biomass energy plant and/or biorefinery results in a significant reduction in OPEX for both the biomass plants and the agriculture and aquaculture co-hosts. Economically viable Bio-Hubs with favorable environmental and community impacts may prove critical in garnering local and federal government support for pilot programs and more wide-scale adoption, especially for those living in severely economically depressed rural areas where aging industrial sites have been shuttered and local economies devastated.

Keywords: bio-economy, biomass energy, financing, zero-waste

Procedia PDF Downloads 121
5357 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations

Authors: Hamza Javar Magnier, Robin Curtis

Abstract:

There have been rapid advances in the upstream processing of protein therapeutics, which have shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity; their formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact upon the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effects of the model parameters, including range of interaction, stiffness, chain length, and chain sequence, imply that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together, and the model follows a single-step folding pathway from the denatured to the native state. After parameterizing the model interaction parameters, we developed an understanding of low-temperature conformational properties and fluctuations, and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out, which implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins 'breathe' at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading. Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients.
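
Since the parameterization hinges on the osmotic second virial coefficient, the sketch below computes B2 for a square-well pair potential, both from the known closed-form result and by numerical quadrature; the core diameter, well width and depth are illustrative values, not the study's fitted parameters.

```python
# Sketch: second virial coefficient of a square-well potential,
# B2 = -2*pi * integral (exp(-u/kT) - 1) r^2 dr, with the analytic
# square-well result as a cross-check. Parameter values are assumed.
import numpy as np
from scipy.integrate import quad

sigma, lam, eps_kT = 1.0, 1.5, 0.6   # hard core, range factor, depth / kT

def u_over_kT(r):
    if r < sigma:
        return np.inf            # hard core
    if r < lam * sigma:
        return -eps_kT           # attractive square well
    return 0.0

# analytic result for the square-well potential
B2_exact = (2 * np.pi * sigma**3 / 3) * \
           (1 - (lam**3 - 1) * (np.exp(eps_kT) - 1))

# numerical quadrature, split at the potential's discontinuities
f = lambda r: (np.exp(-u_over_kT(r)) - 1) * r**2
B2_num = -2 * np.pi * sum(quad(f, a, b)[0]
                          for a, b in [(0, sigma), (sigma, lam * sigma),
                                       (lam * sigma, 5 * sigma)])
print(B2_exact, B2_num)   # negative B2 signals net attraction
```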

Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation

Procedia PDF Downloads 350
5356 Predicting the Success of Bank Telemarketing Using Artificial Neural Network

Authors: Mokrane Selma

Abstract:

The shift towards decision making (DM) based on artificial intelligence (AI) techniques will change the way in which consumer markets and our societies function. Through AI, predictive analytics is being used by businesses to identify patterns and major trends with the objective of improving DM and influencing future business outcomes. This paper proposes an Artificial Neural Network (ANN) approach to predict the success of telemarketing calls for selling bank long-term deposits. To validate the proposed model, we use bank marketing data comprising 41,188 phone calls. The ANN attains an accuracy of 98.93%, which outperforms other conventional classifiers and confirms that it is a credible and valuable approach for telemarketing campaign managers.
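
The 41,188-call dataset appears to correspond to the public UCI Bank Marketing data; assuming that, the sketch below trains a small feed-forward network on it with scikit-learn. The file name, preprocessing and network architecture are assumptions, not the authors' exact setup.

```python
# Sketch: MLP classifier for deposit-subscription prediction on the UCI
# "bank-additional-full.csv" file (';'-separated), assumed available locally.
import pandas as pd
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("bank-additional-full.csv", sep=";")
y = (df.pop("y") == "yes").astype(int)   # target: subscribed a deposit
X = pd.get_dummies(df)                   # one-hot encode categoricals

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20, 10),
                                  max_iter=500, random_state=0))
ann.fit(X_tr, y_tr)
print("Test accuracy:", accuracy_score(y_te, ann.predict(X_te)))
```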

Keywords: bank telemarketing, prediction, decision making, artificial intelligence, artificial neural network

Procedia PDF Downloads 138
5355 Heat Transfer and Trajectory Models for a Cloud of Spray over a Marine Vessel

Authors: S. R. Dehghani, G. F. Naterer, Y. S. Muzychka

Abstract:

Wave-impact sea spray creates many droplets, which form a spray cloud traveling over marine objects such as marine vessels and offshore structures. In cold climates such as Arctic regions, sea spray icing, which is ice accretion on cold substrates, is strongly dependent on wave-impact sea spray. The rate of cooling of droplets affects the icing process, which can lead to dry or wet ice accretion, and the trajectories of droplets determine the potential places for ice accretion. Combining trajectory and heat transfer models for droplets can therefore predict the risk of ice accretion reasonably well. The majority of droplet cooling is due to droplet evaporation. In this study, a combined trajectory and heat transfer model evaluates the evolution of a cloud of spray from generation to impingement. The model uses known geometry and initial information from previous case studies. The 3D model is solved numerically using a standard numerical scheme. Droplets are generated in sizes ranging from 0.07 mm to 7 mm, a suggested range for sea spray icing. The initial temperature of the droplets is taken to be the sea water temperature, and wind velocities are assumed to match field observations. Evaluations are conducted for several important heading angles and wind velocities. A size-velocity dependence is used to establish a relation between the initial sizes and velocities of droplets, and time intervals are chosen to maintain a stable and fast numerical solution. A statistical process is conducted to evaluate the probability of expected occurrences. Medium-size droplets reach the greatest heights, while very small and very large droplets are limited to lower heights. Results show that higher initial velocities create the most expanded spray clouds, and wind velocities affect the extent of the spray cloud. The rate of droplet cooling at the start of spray formation is higher than during the rest of the process, because of higher relative velocities and higher temperature differences. The amount of water delivery and the overall temperature for some sample surfaces over a marine vessel are calculated. Comparison of the results with field observations shows that the model works accurately. This model is suggested as a primary model for ice accretion on marine vessels.
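
The sketch below shows coupled trajectory-and-cooling ODEs for a single droplet in 2D, with quadratic drag and convective cooling only; the evaporative term, which the study finds dominant, is omitted for brevity, and all parameter values are illustrative rather than the paper's.

```python
# Sketch: one droplet carried by wind, with drag and convective cooling.
# Evaporation is omitted here; values are assumed for illustration.
import numpy as np
from scipy.integrate import solve_ivp

rho_w, rho_a, cp_w = 1000.0, 1.3, 4200.0   # kg/m3, kg/m3, J/(kg K)
d = 1e-3                                    # droplet diameter, m
g = np.array([0.0, -9.81])
wind = np.array([15.0, 0.0])                # m/s, horizontal wind (assumed)
Cd, h = 0.5, 500.0                          # drag coeff., W/(m2 K), assumed
T_air = -10.0                               # deg C
m = rho_w * np.pi * d**3 / 6                # droplet mass
A = np.pi * d**2 / 4                        # frontal area
A_s = np.pi * d**2                          # surface area

def rhs(t, y):
    vel, T = y[2:4], y[4]
    u_rel = wind - vel
    drag = 0.5 * rho_a * Cd * A * np.linalg.norm(u_rel) * u_rel / m
    dT = -h * A_s * (T - T_air) / (m * cp_w)   # convective cooling
    return [*vel, *(g + drag), dT]

def hit_ground(t, y):
    return y[1]            # stop when height reaches zero
hit_ground.terminal = True

# launched from 5 m height at 5 m/s forward, 10 m/s upward, T = 4 degC
sol = solve_ivp(rhs, (0, 10), [0, 5, 5, 10, 4.0],
                events=hit_ground, max_step=0.01)
print(f"range = {sol.y[0, -1]:.1f} m, final T = {sol.y[4, -1]:.2f} degC")
```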

Keywords: evaporation, sea spray, marine icing, numerical solution, trajectory

Procedia PDF Downloads 211
5354 Virtual Metrology for Copper Clad Laminate Manufacturing

Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho

Abstract:

In semiconductor manufacturing, virtual metrology (VM) refers to methods that predict properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include avoidance of human bias and identification of important factors affecting the quality of the process, which allow improving the process quality in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important of the three, puts resin on glass cloth, heats it in a drying oven, and produces prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are inspected manually, incurring heavy cost in terms of time and money, which makes the process a good candidate for VM application. We developed prediction models for the three quality factors T/W, M/V, and G/T, respectively, using process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with high enough accuracy. They also provided information on 'important' predictor variables, some of which the process engineers had already been aware of and the rest of which they had not. The engineers were excited to find the new insights the models revealed and set out to do further analysis on them to gain process control implications. T/W, on the other hand, could not be predicted with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W, so an effort has to be made to find other factors, not currently monitored, in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction models allow one to reduce the cost associated with actual metrology as well as reveal insights on the factors affecting the important quality factors and on the level of our less-than-perfect understanding of the treating process.
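
A minimal sketch of a VM-style model for one quality factor, including the feature-importance readout that surfaces "important" predictor variables; the variable names and synthetic data are invented stand-ins for the real process and environment variables.

```python
# Sketch: regression VM model for one quality factor (say M/V) with
# feature importances, on synthetic treating-process data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 800
X = pd.DataFrame({
    "oven_temp":  rng.normal(170, 5, n),    # hypothetical variables
    "line_speed": rng.normal(12, 1, n),
    "resin_visc": rng.normal(300, 30, n),
    "humidity":   rng.normal(45, 8, n),
})
# synthetic target driven mostly by resin viscosity and oven temperature
y = 0.8 * X["resin_visc"] - 2.0 * X["oven_temp"] + rng.normal(0, 5, n)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
print("CV R^2:", cross_val_score(rf, X, y, cv=5).mean().round(3))
rf.fit(X, y)
for name, imp in sorted(zip(X.columns, rf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name:12s} importance = {imp:.3f}")
```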

Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology

Procedia PDF Downloads 344
5353 Contemporary Mexican Shadow Politics: The War on Drugs and the Issue of Security

Authors: Lisdey Espinoza Pedraza

Abstract:

Organised crime in Mexico evolves faster than our capacity to understand and explain it. Organised gangs have become successful entrepreneurs in many ways, and they have somehow mimicked the working ways of the authorities; in many cases, they have successfully infiltrated the governmental spheres. This business model is only possible under a scheme of rampant impunity. Impunity, however, is not exclusive to the PRI. Neither the PRI, PAN, nor PRD can claim the monopoly of corruption, but what is worse is that none can claim full honesty in their acts either. The current security crisis in Mexico reveals a crisis in the Mexican political party system. Corruption today is not only a problem of dishonesty and the correct use of public resources; it is the principal threat to Mexican democracy, governance, and national security.

Keywords: security, war on drugs, drug trafficking, Mexico, Latin America, United States

Procedia PDF Downloads 408
5352 Ontology as Knowledge Capture Tool in Organizations: A Literature Review

Authors: Maria Margaretha, Dana Indra Sensuse, Lukman

Abstract:

Knowledge capture is a step in the knowledge life cycle through which an organization acquires knowledge. Tacit and explicit knowledge need to be organized systematically so that the organization can easily choose which knowledge to use. There are many challenges to capturing knowledge in an organization: researchers must know which knowledge has been validated by an expert, how to elicit tacit knowledge from experts and make it explicit, and so on. Besides that, technology can be a reliable tool to help researchers capture knowledge. Several papers have described how ontologies in knowledge management can be used in proposed frameworks to capture and reuse knowledge. Organizations have to manage their knowledge, and how they capture and share it will decide their position in the business arena. This paper describes, through a literature review, ontology as a tool that helps organizations capture their knowledge.

Keywords: knowledge capture, ontology, technology, organization

Procedia PDF Downloads 589
5351 Employer Brand Image and Employee Engagement: An Exploratory Study in Britain

Authors: Melisa Mete, Gary Davies, Susan Whelan

Abstract:

Maintaining a good employer brand image is crucial for companies since it has numerous advantages, such as better recruitment, retention, and employee engagement and commitment. This study aims to understand the relationship between employer brand image and employee satisfaction and engagement in the British context. Panel survey data (N=228) are tested via regression models from the Hayes (2012) PROCESS macro in IBM SPSS 23.0. The results are statistically significant and show that the more positive the employer brand image, the greater employees' engagement and satisfaction; and the greater employee satisfaction, the greater their engagement.
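
PROCESS itself runs inside SPSS; as a rough illustration of the kind of mediation-style regressions it estimates (image -> satisfaction -> engagement), the sketch below uses statsmodels on synthetic data. Variable names, coefficients and data are invented, not the authors' analysis.

```python
# Sketch: simple-mediation regressions (paths a, b, c') on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 228
image = rng.normal(0, 1, n)
satisfaction = 0.5 * image + rng.normal(0, 1, n)
engagement = 0.3 * image + 0.4 * satisfaction + rng.normal(0, 1, n)
df = pd.DataFrame({"image": image, "sat": satisfaction, "eng": engagement})

m1 = smf.ols("sat ~ image", df).fit()          # path a: image -> satisfaction
m2 = smf.ols("eng ~ image + sat", df).fit()    # paths c' and b
indirect = m1.params["image"] * m2.params["sat"]
print(m1.params, m2.params, sep="\n")
print("indirect (a*b) effect:", round(indirect, 3))
```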

Keywords: employer brand, employer brand image, employee engagement, employee satisfaction

Procedia PDF Downloads 327
5350 Bio-Hub Ecosystems: Expansion of Traditional Life Cycle Analysis Metrics to Include Zero-Waste Circularity Measures

Authors: Kimberly Samaha

Abstract:

In order to attract new types of investors into the emerging bio-economy, a new set of metrics and a measurement system are needed to better quantify the environmental, social and economic impacts of circular zero-waste design. The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding the use of biomass as a feedstock for power plants. Lack of an economically viable business model for bioenergy facilities has resulted in the continuation of idled and decommissioned plants, in particular the forestry-based plants which have been an invaluable outlet for woody biomass surplus, forest health improvement, timber production enhancement, and especially reduction of wildfire risk. This study looked at repurposing existing biomass-energy plants into circular zero-waste Bio-Hub Ecosystems. The Bio-Hub model first targets a 'whole-tree' approach and then looks at the circular economics of co-hosting diverse industries (wood processing, aquaculture, agriculture) in the vicinity of the biomass power plant facilities. It not only proposes models for the integration of forestry, aquaculture, and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allows for the early measurement of circularity, of the impact of resource use, and of investment risk mitigation for these systems. Typically, life cycle analyses measure the environmental impacts of different industrial production stages and are not integrated with indicators of material use circularity. This concept paper proposes the further development of a new set of metrics that would capture not only the typical life-cycle analysis (LCA), which shows the reduction in greenhouse gas (GHG) emissions, but also zero-waste circularity measures based on the mass balance of the full value chain of the raw material and its energy content/caloric value. These new measures quantify key impacts in making hyper-efficient use of natural resources and eliminating waste to landfills. The project utilized traditional LCA with the GREET model, in which the standalone biomass energy plant case was contrasted with the integration of a jet-fuel biorefinery. The methodology was then expanded to include combinations of co-hosts that optimize the life cycle of woody biomass from tree to energy, CO₂, heat and wood ash, both in terms of energy/caloric value and in terms of mass balance, to include the reuse of waste streams which are typically landfilled. The findings of the formal LCA study resulted in the masterplan for the first Bio-Hub, to be built in West Enfield, Maine. Bioenergy facilities are currently at a critical juncture where they have an opportunity to be repurposed into efficient, profitable and socially responsible investments, or be idled and scrapped. If proven as a model, the expedited roll-out of these innovative scenarios can set a new standard for circular zero-waste projects that advance the critical transition from the current 'take-make-dispose' paradigm inherent in the energy, forestry and food industries to a more sustainable bio-economy paradigm where waste streams become valuable inputs, supporting local and rural communities in simple, sustainable ways.

Keywords: bio-economy, biomass energy, financing, metrics

Procedia PDF Downloads 145
5349 Project Management Tools within SAP S/4 Hana Program Environment

Authors: Jagoda Bruni, Jan Müller-Lucanus, Gernot Stöger-Knes

Abstract:

The purpose of this article is to demonstrate modern project management approaches in an SAP S/4HANA program environment composed of multiple, diversely focused projects. We propose innovative and goal-oriented management standards based on the specificity of SAP transformations and customer-driven expectations. Through the regular application of sprint-based controlling and management tools, the data show that extensive analysis of employees' productive hours, as well as a thorough review of project progress (per GAP, per business process, and per Lot) within the whole program, can have a positive impact on customer satisfaction and on projects' budgets. This is a collaborative study based on real-life experience and measurements gathered in collaboration with our customers.

Keywords: project management, program management, SAP, controlling

Procedia PDF Downloads 72
5348 The Contemporary Issues of Quality Management: Relationship between Total Quality Management and Knowledge Management

Authors: Mehrnoosh Askarizadeh

Abstract:

To meet the challenges of the new global environment, companies have started paying great attention to quality management as an integral part of their strategic business plans. The purpose of this article is to investigate the relationship between total quality management (TQM) and knowledge management (KM). Successful total quality management implementation throughout an organization requires major changes in the four main aspects of knowledge management, namely creation, storage, sharing, and application. Skills, knowledge and productivity are important factors in an organization's success and play an important role; therefore, the TQM system pays special attention to them. Knowledge, as a resource, is essential for an organization's survival. Our study points out how quality management and knowledge management have been incorporated into each other for the development of a quality culture within the organization.

Keywords: knowledge management (KM), total quality management (TQM), organizational performance (OP), Deming cycle

Procedia PDF Downloads 468
5347 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model

Authors: Amit R. Bhende, G. K. Awari

Abstract:

Remaining useful life (RUL) prediction is one of the key technologies for realizing prognostics and health management, which is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Run-to-failure bearing degradation data at three different conditions are used. A separate RUL prediction model is built for each condition. Feed-forward back-propagation neural network models are developed for prediction modeling.
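
A minimal sketch of a feed-forward back-propagation model mapping degradation features to RUL for one condition; the features (RMS, kurtosis) and the degradation law used to synthesize the data are illustrative stand-ins for the run-to-failure measurements.

```python
# Sketch: feed-forward network regressing RUL on synthetic bearing
# degradation features for a single operating condition.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
life = 1000.0                                # h, total life (assumed)
t = rng.uniform(0, life, 600)
rms      = 0.5 + 2.0 * (t / life) ** 3 + rng.normal(0, 0.05, 600)
kurtosis = 3.0 + 4.0 * (t / life) ** 5 + rng.normal(0, 0.2, 600)
X = np.column_stack([rms, kurtosis])
rul = life - t                               # target: remaining useful life

X_tr, X_te, y_tr, y_te = train_test_split(X, rul, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```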

Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis

Procedia PDF Downloads 423
5346 A Graph SEIR Cellular Automata Based Model to Study the Spreading of a Transmittable Disease

Authors: Natasha Sharma, Kulbhushan Agnihotri

Abstract:

Cellular automata are discrete dynamical systems that capture the local character and spatial heterogeneity of the spreading process. These factors are generally neglected by traditional models of epidemic spread based on differential equations. The aim of this work is to introduce an SEIR model based on cellular automata on graphs to imitate epidemic spreading. Distinctively, it is an SEIR-type model in which the population is divided into susceptible, exposed, infected, and recovered individuals. The results obtained from simulations are in accordance with the spreading behavior of real epidemics.
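
A minimal sketch of an SEIR cellular automaton on a graph: each node carries a state in {S, E, I, R}, and exposure is driven by infected neighbors. The contact graph and per-step transition probabilities are illustrative, not the paper's calibrated values.

```python
# Sketch: SEIR cellular automaton on a small-world contact graph.
import random
import networkx as nx

random.seed(0)
G = nx.watts_strogatz_graph(500, k=6, p=0.1)   # illustrative contact graph
beta, sigma, gamma = 0.08, 0.2, 0.1            # per-step probabilities
state = {v: "S" for v in G}
for v in random.sample(list(G), 5):
    state[v] = "I"                             # initial infectious seeds

for step in range(60):
    new = dict(state)
    for v in G:
        if state[v] == "S":
            inf = sum(state[u] == "I" for u in G[v])   # infected neighbors
            if random.random() < 1 - (1 - beta) ** inf:
                new[v] = "E"                   # becomes exposed
        elif state[v] == "E" and random.random() < sigma:
            new[v] = "I"                       # becomes infectious
        elif state[v] == "I" and random.random() < gamma:
            new[v] = "R"                       # recovers
    state = new
    if step % 10 == 0:
        counts = {s: sum(v == s for v in state.values()) for s in "SEIR"}
        print(step, counts)
```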

Keywords: cellular automata, epidemic spread, graph, susceptible

Procedia PDF Downloads 446
5345 Evaluation of Research in the Field of Energy Efficiency and MCA Methods Using Publications Databases

Authors: Juan Sepúlveda

Abstract:

Energy is a fundamental component of sustainability; access to and use of this resource are related to economic growth, social improvements, and environmental impacts. In this sense, energy efficiency has been studied as a factor that enhances the positive impacts of energy in communities; however, the implementation of efficiency requires strong policy and strategies, which usually rely on individual measures focused on independent dimensions. In this paper, energy efficiency is studied as a multi-objective problem, using scientometric analysis to discover trends and patterns that help identify the main variables and research approaches relevant to the further development of models that integrate energy efficiency and MCA into policy making for small communities.

Keywords: energy efficiency, MCA, scientometric, trends

Procedia PDF Downloads 356
5344 Implementation of 'Bay Al-Salam' in Agricultural Banking of Bangladesh: An Islamic Banking Perspective

Authors: M. Obydul Haque Kamaly

Abstract:

This paper aims to provide a brief discussion of bay al-salam as a method of implementing Islamic banking in the agricultural arena of Bangladesh. For this purpose, the nature and conditions of bay al-salam contracts are first discussed. Next, the paper focuses on the comparison between conventional banks and Islamic banks and answers how bay al-salam can be used as a popular method in agricultural transactions in the country. The paper is based on secondary data and describes bay al-salam as a way forward for Islamic banking. Evidence suggests Islamic banking is practiced much like modern conventional banking, with certain restrictions imposed by Sharia, and addresses a large number of business requirements successfully. Thus, it is time to implement Islamic banking (bay al-salam) in our agricultural arena and to derive the most benefit from it.

Keywords: bay al-salam, agricultural banking, Islamic banking, implementation

Procedia PDF Downloads 249
5343 An Exploratory Study on the Impact of Climate Change on Design Rainfalls in the State of Qatar

Authors: Abdullah Al Mamoon, Niels E. Joergensen, Ataur Rahman, Hassan Qasem

Abstract:

The Intergovernmental Panel on Climate Change (IPCC), in its Fourth Assessment Report (AR4), predicts a more extreme climate towards the end of the century, which is likely to impact the design of engineering infrastructure projects with a long design life. A recent study in 2013 developed new design rainfalls for Qatar, which provide an improved design basis for drainage infrastructure in the State of Qatar under the current climate. The current design standards in Qatar do not consider increased rainfall intensity caused by climate change. The focus of this paper is to update the recently developed design rainfalls in Qatar under changing climatic conditions based on IPCC's AR4, allowing a later revision of the proposed design standards, relevant for projects with a longer design life. The future climate has been investigated based on the climate models released for IPCC's AR4 and the A2 storyline of the Special Report on Emissions Scenarios (SRES), using a stationary approach. Annual maximum series (AMS) of predicted 24-hour rainfall data for both the wet (NCAR-CCSM) and dry (CSIRO-MK3.5) scenarios at the Qatari grid points in the climate models have been extracted for three periods: current climate (2010-2039), medium-term climate (2040-2069) and end-of-century climate (2070-2099). A homogeneous region covering the Qatari grid points has been formed, and an L-moments-based regional frequency approach is adopted to derive design rainfalls. The results indicate no significant changes in the design rainfall in the medium term (2040-2069), but significant changes are expected towards the end of the century (2070-2099). New design rainfalls have been developed taking climate change into account for the 2070-2099 scenario, by averaging the results from the two scenarios. IPCC's AR4 predicts that the rainfall intensity for a 5-year return period event with a duration of 1 to 2 hours will increase by 11% in 2070-2099 compared to the current climate. Similarly, the rainfall intensity for a more extreme event, with a return period of 100 years and a duration of 1 to 2 hours, will increase by 71% in 2070-2099 compared to the current climate. Infrastructure with a design life exceeding 60 years should include safety factors that take the predicted effects of climate change into due consideration.
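
The study derives design rainfalls with an L-moments regional frequency approach; as a simplified stand-in, the sketch below fits a GEV distribution by maximum likelihood to annual maximum series and compares return levels between two periods. All numbers are synthetic placeholders, not the Qatari data.

```python
# Sketch: at-site GEV fit to synthetic AMS and return-level comparison
# between the current-climate and end-of-century periods.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
ams_now  = genextreme.rvs(-0.1, loc=40, scale=12, size=30, random_state=rng)
ams_2099 = genextreme.rvs(-0.1, loc=48, scale=16, size=30, random_state=rng)

for label, ams in [("2010-2039", ams_now), ("2070-2099", ams_2099)]:
    c, loc, scale = genextreme.fit(ams)      # maximum-likelihood fit
    for T in (5, 100):                       # return periods, years
        x_T = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
        print(f"{label}: {T:3d}-yr 24-h rainfall = {x_T:5.1f} mm")
```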

Keywords: climate change, design rainfalls, IDF, Qatar

Procedia PDF Downloads 380
5342 Parasitic Capacitance Modeling in Pulse Transformer Using FEA

Authors: D. Habibinia, M. R. Feyzi

Abstract:

Nowadays, specialized software is widely used to verify the performance of an electric machine prototype by evaluating a model of the system. These models mainly consist of electrical parameters such as inductances and resistances. However, when the operating frequency of the device is above 1 kHz, the effect of parasitic capacitances grows significantly. In this paper, a software-based procedure is introduced to model these capacitances within the electromagnetic simulation of the device. The case study is a high-frequency, high-voltage pulse transformer. Finite element analysis (FEA) software with coupled-field analysis is used in this method.

Keywords: finite element analysis, parasitic capacitance, pulse transformer, high frequency

Procedia PDF Downloads 508
5341 Relation of Consumer Satisfaction on Organization by Focusing on the Different Aspects of Buying Behavior

Authors: I. Gupta, N. Setia

Abstract:

Introduction: Consumer buying behavior is a progression of practices or patterns that buyers follow before making a purchase. It begins when the consumer becomes aware of a need or wish for an item and concludes with the purchasing transaction. Entrepreneurs cannot simply shake hands with their target audience and become acquainted with them; research is often necessary, so every organization primarily engages in continuous research to understand and satisfy patterns of consumer needs. Aims and Objectives: The aim of the present study is to examine the different behaviors of the consumer, including pre-purchase, purchase, and post-purchase behavior. Materials and Methods: In order to obtain results, face-to-face interviews were held with 80 people, the larger part of whom were female and of upper- or middle-class status. The prime source of data collection was primary; however, the study has also used the theoretical contributions of many researchers in their respective fields. Results: The majority of the respondents were females (70%) from the age group 20-50. The collected data were analyzed through hypothesis-testing statistical techniques such as correlation analysis, simple regression analysis, and ANOVA, which rejected the null hypothesis that there is no relation between researching consumer behavior at different stages and organizational performance. The real finding of this study is that simply focusing on the buying stage is not enough to gain profit and fame; understanding the pre-purchase, purchase, and post-purchase behavior of the consumer plays a huge role in an organization's success. The outcomes demonstrated that an organization which deals with all three phases of buying behavior is able to establish a stronger brand image than its competitors; alongside this, such enterprises can observe customer conduct in a considerably more proficient manner. Conclusion: The analysis of consumer behavior presented in this study is an attempt to understand the factors affecting consumer purchasing behavior. The study has revealed that corporations which work on understanding buying behavior are more successful than those that focus only on selling products. As a result, such organizations perform well and grow rapidly, because consumers are the ones who can make or break a company. The face-to-face interviews clearly revealed that the organizations that reach the top are those whose consumers are satisfied not just with the product but also with the company's services. The study does not target a particular class of audience; it brings benefits to the masses, and in particular to business organizations.

Keywords: consumer behavior, pre-purchase, post-purchase, consumer satisfaction

Procedia PDF Downloads 103
5339 Internet Protocol Television: A Research Study of Undergraduate Students Analyzing the Effects

Authors: Sabri Serkan Gulluoglu

Abstract:

This study examines the effects of Internet marketing with IPTV on consumers. Internet marketing with IPTV is emerging as an integral part of business strategies in today's technologically advanced world, and business activities all over the world are influenced by the emergence of this modern marketing tool. As the population of Internet and online users increases, new research issues have arisen concerning the demographics and psychographics of online users and the opportunities for a product or service. In recent years, we have seen a tendency for various services to converge onto ubiquitous Internet Protocol based networks. Besides traditional Internet applications such as web browsing, email and file transfer, new applications have been developed to replace old communication networks; IPTV is one of these solutions. In the future, we expect a single network, the IP network, to provide the services that are carried by different networks today. To identify important effects of a video-based technology marketing website, we administered a questionnaire to university students. Recent research shows that in Turkey, people aged 20 to 24 use the Internet when buying electronic devices such as cell phones and computers. The questionnaire contains ten categorized questions to evaluate the effects of IPTV on shopping. Thirty students were selected and filled in the questionnaire after watching an IPTV channel video for 10 minutes. The sample IPTV channel was 'buy.com', which looks like an e-commerce site with an integrated IPTV channel. The questionnaire was constructed using the Likert scale, a bipolar scaling method used to measure either positive or negative responses to a statement (Likert, R.); it is a common system used in surveys. Following the Likert scale, 'the respondents are asked to indicate their degree of agreement with the statement or any kind of subjective or objective evaluation of the statement. Traditionally a five-point scale is used under this methodology'. For this study, the five-point scale was also used, and the respondents were asked to express their opinions about each statement by picking an answer from the given five options: strongly disagree, disagree, neither agree nor disagree, agree, and strongly agree. These options were rated from 1 to 5 (strongly disagree, disagree, neither disagree nor agree, agree, strongly agree). On the basis of the data gathered from the questionnaire, results are drawn as figures and graphical representations that demonstrate the outcomes of the research clearly.

Keywords: IPTV, internet marketing, online, e-commerce, video-based technology

Procedia PDF Downloads 228
5339 Nanoporous Metals Reinforced with Fullerenes

Authors: Deni̇z Ezgi̇ Gülmez, Mesut Kirca

Abstract:

Nanoporous (np) metals have attracted considerable attention owing to their cellular morphological features at the atomistic scale, which yield an ultra-high specific surface area and give them great potential for diverse applications such as catalysis, electrocatalysis, sensing, mechanics and optics. As one of the carbon-based nanostructures, fullerenes are another type of outstanding nanomaterial that has been extensively investigated due to its remarkable chemical, mechanical and optical properties. In this study, the idea of improving the mechanical behavior of nanoporous metals by the inclusion of fullerenes, which offers a new metal-carbon nanocomposite material, is examined and discussed. With this motivation, the tensile mechanical behavior of nanoporous metals reinforced with carbon fullerenes is investigated by classical molecular dynamics (MD) simulations. Atomistic models of the nanoporous metals with ultrathin ligaments are obtained through a stochastic process based on the intersection of spherical volumes, which has been used previously in the literature. According to this technique, the atoms within the ensemble of intersecting spherical volumes are removed from a pristine solid block of the selected metal, which results in porous structures with spherical cells. Following this, fullerene units are added into the cellular voids to obtain the final atomistic configurations for the numerical tensile tests. Several numerical specimens are prepared with different numbers of fullerenes per cell and with varied fullerene sizes. The LAMMPS code is used to perform classical MD simulations of uniaxial tension experiments on np models filled with fullerenes. The interactions between the metal atoms are modeled using the embedded atom method (EAM), while the adaptive intermolecular reactive empirical bond order (AIREBO) potential is employed for the interactions between carbon atoms. Furthermore, atomic interactions between the metal and carbon atoms are represented by a Lennard-Jones potential with appropriate parameters. In conclusion, the ultimate goal of the study is to present the effects of fullerenes embedded into the cellular structure of np metals on the tensile response of the porous metals. The results are believed to be informative and instructive for experimentalists seeking to synthesize hybrid nanoporous materials with improved properties and multifunctional characteristics.
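
A minimal sketch of the intersecting-spheres construction described above: atoms of a solid block that fall inside randomly placed spherical volumes are removed, leaving a porous structure with spherical cells. A simple cubic lattice and the sphere parameters are illustrative simplifications of the actual crystal structure and geometry.

```python
# Sketch: carve random spherical volumes out of an atomistic block.
import numpy as np

rng = np.random.default_rng(6)
a, n = 3.6, 20                         # lattice constant (angstrom), cells/side
grid = np.arange(n) * a
atoms = np.stack(np.meshgrid(grid, grid, grid), -1).reshape(-1, 3)

n_spheres, r = 40, 8.0                 # number and radius of removed volumes
centers = rng.uniform(0, n * a, (n_spheres, 3))

keep = np.ones(len(atoms), dtype=bool)
for c in centers:
    keep &= np.linalg.norm(atoms - c, axis=1) > r   # carve out one sphere
porous = atoms[keep]
print(f"{len(porous)} of {len(atoms)} atoms kept "
      f"(porosity = {1 - len(porous) / len(atoms):.2f})")
```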

Keywords: fullerene, intersecting spheres, molecular dynamic, nanoporous metals

Procedia PDF Downloads 233
5338 Adaption to Climate Change as a Challenge for the Manufacturing Industry: Finding Business Strategies by Game-Based Learning

Authors: Jan Schmitt, Sophie Fischer

Abstract:

After the Corona pandemic, climate change is a further, long-lasting challenge that society must deal with. Ongoing climate change needs to be mitigated; nevertheless, adaption to already changed climate conditions has to be a focus in many sectors. The Corona crisis has recently shown the decisive role of economic sectors with high value added, and the manufacturing industry, as such a sector, needs to be prepared for climate change and adaption. Several examples from the manufacturing industry show the importance of a strategic effort in this field: outsourcing major parts of the value chain to suppliers in other countries and optimizing procurement logistics in a time-, storage- and cost-efficient manner within a network of global value creation can create vulnerabilities to climate-related disruptions. For example, the total damage costs after the 2011 flood disaster in Thailand, including costs for delivery failures, were estimated at 45 billion US dollars worldwide. German car manufacturers were also affected by supply bottlenecks: one had to close its plant in Thailand for a short time, and another OEM had to reduce its production output. In this contribution, a game-based learning approach is presented which should enable manufacturing companies to derive their own strategies for climate adaption out of a mix of different actions. The game-based learning approach is designed on the basis of data from a regional study of small, medium and large manufacturing companies in Mainfranken, a strongly industrialized region of northern Bavaria (Germany). From this, the actual state of climate adaption efforts is evaluated. First, the results are used to collect single actions for manufacturing companies; second, further actions can be identified. Then, a variety of climate adaption activities can be clustered according to the company's scope of activity. The combination of different actions, e.g. the renewal of the building envelope with regard to thermal insulation, together with their benefits and drawbacks, leads to a specific climate adaption strategy for each company. Within the game-based approach, the players take on different roles in a fictional company and discuss the order and the characteristics of each action taken into their climate adaption strategy. Indicators such as economic performance, ecology and stakeholder satisfaction compare the success of the respective measures in a competitive format with other virtual companies deriving their own strategies. Playing through climate change scenarios with targeted adaption actions illustrates the impact of different actions and their combination on the fictional company.

Keywords: business strategy, climate change, climate adaption, game-based learning

Procedia PDF Downloads 196
5337 A Cohort and Empirical Based Multivariate Mortality Model

Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong

Abstract:

This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean square errors in both countries compared to the models in the literature.
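
A rough sketch of a regression with age, period, and cohort effects on synthetic log mortality rates. Because cohort = period - age, the three effects are exactly collinear (the classic APC identification problem); a ridge penalty is one simple way to select an identifiable solution. The data and specification are illustrative, not the paper's model.

```python
# Sketch: cohort-age-period regression on synthetic log mortality rates,
# one-hot encoding all three effects and fitting with a ridge penalty.
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(7)
ages, years = np.arange(60, 90), np.arange(1990, 2020)
df = pd.DataFrame([(a, y, y - a) for a in ages for y in years],
                  columns=["age", "period", "cohort"])
df["log_m"] = (-9 + 0.09 * df["age"] - 0.012 * (df["period"] - 1990)
               + 0.002 * (df["cohort"] - 1920)
               + rng.normal(0, 0.05, len(df)))

model = make_pipeline(OneHotEncoder(), Ridge(alpha=1.0))
model.fit(df[["age", "period", "cohort"]], df["log_m"])
print("in-sample R^2:",
      round(model.score(df[["age", "period", "cohort"]], df["log_m"]), 3))
```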

Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management

Procedia PDF Downloads 36