Search results for: sharing problems of heritage
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7812

1542 Cross-Sectional Association between Socio-Demographic Factors and Paid Blood Donation in Half Million Chinese Population

Authors: Jiashu Shen, Guoting Zhang, Zhicheng Wang, Yu Wang, Yun Liang, Siyu Zou, Fan Yang, Kun Tang

Abstract:

Objectives: This study aims to enhance the understanding of paid blood donors’ characteristics in the Chinese population and to devise strategies to protect these paid donors. Background: Paid blood donation was the predominant mode of blood donation in China from the 1970s to 1998 and caused several health and social problems, including a greatly increased risk of infectious diseases due to nonstandard operations in unhygienic conditions. Methods: This study utilized cross-sectional data from the China Kadoorie Biobank, covering about 0.5 million people from 10 regions of China from 2004 to 2008. Multivariable logistic regression was performed to examine the associations between socio-demographic factors and paid blood donation. Furthermore, a stratified analysis of education level and annual household income was applied by rural and urban areas. Results: The prevalence of paid blood donation was 0.50% in China, and males were more likely to donate blood than females (adjusted odds ratio (AOR) = 0.81, 95% confidence interval (CI): 0.75-0.88). Urban people had much lower odds than rural people (AOR = 0.24, 95% CI: 0.21-0.27). People with a high annual household income had lower odds of paid blood donation compared with people with a low income (AOR = 0.37, 95% CI: 0.31-0.44). Compared with people who received no school education, people with a higher level of education had increased odds of paid blood donation (AOR = 2.31, 95% CI: 1.94-2.74). Conclusion: Paid blood donation in China was associated with being male, living in a rural area, and having a low annual household income; educational background was also an associated factor.
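
As a rough sketch of the kind of multivariable logistic regression with stratification described above, the following Python fragment uses pandas and statsmodels; the data file and variable names (paid_donation, sex, region, income_group, education, urban_rural) are illustrative assumptions, not the China Kadoorie Biobank fields.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per participant, binary outcome 'paid_donation'
    # plus socio-demographic covariates.
    df = pd.read_csv("participants.csv")

    # Multivariable model: all covariates entered together, so each odds ratio
    # is adjusted for the others.
    model = smf.logit(
        "paid_donation ~ C(sex) + C(region) + C(income_group) + C(education)",
        data=df,
    ).fit()

    # Adjusted odds ratios with 95% confidence intervals.
    conf = model.conf_int()
    summary = pd.DataFrame({
        "AOR": np.exp(model.params),
        "CI_lower": np.exp(conf[0]),
        "CI_upper": np.exp(conf[1]),
    })
    print(summary)

    # A stratified analysis (e.g., by rural vs. urban residence) can be run by
    # fitting the same model separately on each subset.
    for area, sub in df.groupby("urban_rural"):
        fit = smf.logit("paid_donation ~ C(income_group) + C(education)",
                        data=sub).fit(disp=0)
        print(area, np.exp(fit.params))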

Keywords: China Kadoorie Biobank, Chinese population, paid blood donation, socio-demographic factors

Procedia PDF Downloads 138
1541 Development of Automated Quality Management System for the Management of Heat Networks

Authors: Nigina Toktasynova, Sholpan Sagyndykova, Zhanat Kenzhebayeva, Maksat Kalimoldayev, Mariya Ishimova, Irbulat Utepbergenov

Abstract:

Any business needs stable operation and continuous improvement; it is therefore necessary to constantly interact with the environment, to analyze the work of the enterprise from the perspective of employees, executives and consumers, and to correct any inconsistencies in particular types of processes and in their aggregate. In the case of heat supply organizations, in addition to suppliers, local legislation must be considered, since it is often the main regulator of service pricing. In this case, the process approach used to build a functional organizational structure in these types of businesses in Kazakhstan is a challenge not only in implementation but also in the way employees' salaries are analyzed. To solve these problems, we investigated the management system of a heat supply enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the Quality Management System (QMS) standard ISO 9001, and analysis of the system based on expert judgment using fuzzy inference. To carry out the work, we used the theory of fuzzy sets, the QMS in accordance with ISO 9001, the BSC according to the method of Kaplan and Norton, the IDEF0 notation for constructing business processes, and modeling with Matlab simulation tools and LabVIEW graphical programming. The results of the work are as follows: we determined possibilities for improving the management of a heat supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of functions for management, for the reduction of resources, and for keeping the system up to date; and an application for the analysis of the QMS based on fuzzy inference was created, with a novel organization of the communication between the software and the application that enables the analysis of relevant enterprise management data.
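
The abstract does not give the rule base used, but as a rough sketch of how a fuzzy-inference assessment of QMS indicators might look, the following Python fragment (plain NumPy; the membership functions, indicator names and single rule are illustrative assumptions, not the authors' model) evaluates one Mamdani-style rule and defuzzifies by centroid.

    import numpy as np

    def trimf(x, a, b, c):
        # Triangular membership function on points a <= b <= c.
        return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                     (c - x) / (c - b + 1e-12)), 0.0)

    # Universe of discourse for the output "QMS conformity" score (0-100).
    score = np.linspace(0, 100, 501)

    # Illustrative expert ratings (0-1): documentation quality and
    # customer-complaint handling.
    documentation = 0.8
    complaints = 0.6

    # Degrees of membership of the inputs in the fuzzy set "good".
    mu_doc_good = trimf(documentation, 0.5, 1.0, 1.5)
    mu_comp_good = trimf(complaints, 0.5, 1.0, 1.5)

    # One Mamdani rule: IF documentation is good AND complaint handling is good
    # THEN conformity is high.  AND = min, implication = min (clipping).
    firing = min(mu_doc_good, mu_comp_good)
    conformity_high = trimf(score, 60, 85, 100)
    clipped = np.minimum(conformity_high, firing)

    # Centroid defuzzification of the (single-rule) aggregated output.
    crisp = np.sum(score * clipped) / (np.sum(clipped) + 1e-12)
    print(f"QMS conformity score: {crisp:.1f} / 100")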

Keywords: balanced scorecard, heat supply, quality management system, the theory of fuzzy sets

Procedia PDF Downloads 349
1540 Role of Microplastics on Reducing Heavy Metal Pollution from Wastewater

Authors: Derin Ureten

Abstract:

Plastic pollution does not disappear; it breaks into smaller and smaller pieces through photolysis (caused mainly by the sun’s radiation), thermal oxidation, thermal degradation, and biodegradation, which is the action of organisms digesting larger plastics. All plastic pollutants have exceedingly harmful effects on the environment. With the COVID-19 pandemic, the number of plastic products such as masks and gloves flowing into the environment has increased more than ever. However, microplastics are not the only pollutants in water; among the most tenacious and toxic pollutants are heavy metals. Heavy metal solutions are also capable of causing a variety of health problems in organisms, such as cancer, organ damage, nervous system damage, and even death. The aim of this research is to show that microplastics can be used in wastewater treatment systems by demonstrating that they can adsorb heavy metals in solution. The experiment will include two heavy metal solutions: one containing microplastics in heavy-metal-contaminated water, and one containing only the heavy metal solution. After being sieved, the absorbance of both media will be measured with a spectrometer. Iron (III) chloride (FeCl3) will be used as the heavy metal solution, since the solution becomes darker as the concentration of this substance increases. The experiment will be supported by Pure Nile Red powder in order to observe whether there are any visible differences under the microscope; Pure Nile Red powder is a chemical that binds to hydrophobic materials such as plastics and lipids. If adsorption can be confirmed from the solutions' final absorbance values and the visual evidence provided by the Pure Nile Red powder, the experiment will be conducted at different temperature levels in order to identify the most suitable temperature for removing heavy metals from water. New wastewater treatment systems could then be developed, with the help of microplastics, for water contaminated with heavy metals.
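
As a small worked example of the planned comparison, the following Python sketch (the absorbance readings are illustrative values, not experimental data) estimates removal efficiency from spectrometer readings, assuming absorbance is proportional to FeCl3 concentration (Beer-Lambert law).

    # Illustrative absorbance readings at a fixed wavelength after sieving.
    A_control = 0.82      # heavy metal solution only
    A_with_mp = 0.57      # heavy metal solution that contained microplastics

    # Under the Beer-Lambert law, absorbance is proportional to concentration,
    # so the relative drop in absorbance estimates the fraction of Fe(III)
    # adsorbed onto the microplastics.
    removal_pct = (A_control - A_with_mp) / A_control * 100
    print(f"Estimated heavy metal removal: {removal_pct:.1f}%")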

Keywords: microplastics, heavy metal, pollution, adsorbance, wastewater treatment

Procedia PDF Downloads 73
1539 Biogas Production from Lake Bottom Biomass from Forest Management Areas

Authors: Dessie Tegegne Tibebu, Kirsi Mononen, Ari Pappinen

Abstract:

In areas with forest management, agricultural, and industrial activity, sediments and biomass accumulate in lakes through drainage systems, which can cause biodiversity loss and health problems. One possible solution is the utilization of lake bottom biomass and sediments for biogas production. The main objective of this study was to investigate the potential of lake bottom materials for the production of biogas by anaerobic digestion and to study the effect of pretreatment of the feed materials on biogas yield. To study the biogas production potential, lake bottom materials were collected from two sites, Likokanta and Kutunjärvi lake. The lake bottom materials were mixed with straw-horse manure to produce biogas in a laboratory-scale reactor. The results indicated that the highest biogas yields were observed when the feed was composed of 50% lake bottom materials and 50% straw-horse manure mixture, while with more than 50% lake bottom materials in the feed, biogas production decreased. The CH4 contents from Likokanta lake materials with straw-horse manure and from Kutunjärvi lake materials with straw-horse manure were similar when the feed consisted of 50% lake bottom materials and 50% straw-horse manure. However, with feeds containing more than 50% lake bottom materials, the CH4 concentration started to decrease, impairing the gas process. Pretreatment applied to the Kutunjärvi lake materials showed a slight negative effect on biogas production and the lowest CH4 concentration throughout the experiment. The average CH4 production from pretreated Kutunjärvi lake materials with straw-horse manure (208.9 ml g-1 VS) and untreated Kutunjärvi lake materials with straw-horse manure (182.2 ml g-1 VS) was markedly higher than from Likokanta lake materials with straw-horse manure (157.8 ml g-1 VS). According to the experimental results, utilization of 100% lake bottom materials is likely to impair biogas production. In the future, further analyses to improve the biogas yields and an assessment of costs and benefits are needed before utilizing lake bottom materials for the production of biogas.

Keywords: anaerobic digestion, biogas, lake bottom materials, sediments, pretreatment

Procedia PDF Downloads 307
1538 Chromatographic Preparation and Performance on Zinc Ion Imprinted Monolithic Column and Its Adsorption Property

Authors: X. Han, S. Duan, C. Liu, C. Zhou, W. Zhu, L. Kong

Abstract:

The ionic imprinting technique produces a three-dimensional rigid structure with fixed pore sizes, formed by the binding interactions of ions and functional monomers with the ions used as the template; it has a high level of recognition for the ionic template. Preparation of a monolithic column by in-situ polymerization requires placing the template, functional monomers, cross-linking agent and initiating agent into solution, dissolving them, and injecting the mixture into the column tube; the mixture then undergoes a polymerization reaction at a certain temperature, after which the unreacted template and remaining solution are washed out. Monolithic columns are easy to prepare, low in consumption and cost-effective, with fast mass transfer; besides, they offer many chemical functionalities. However, monolithic columns have some problems in practical application, such as low efficiency and inaccurate quantitative analysis, because the peaks are wide and show tailing; the choice of polymerization systems is limited and the theoretical foundations are lacking. Thus the optimization of components and preparation methods is an important research direction. During the preparation of ionic imprinted monolithic columns, the pore-forming agent creates the porous structure of the polymer, which influences the physical properties of the polymer and, moreover, directly determines the stability and selectivity of the polymerization reaction. The compounds generated in the pre-polymerization reaction directly determine the recognition and screening capabilities of the imprinted polymer; thus the choice of pore-forming agent is quite critical in the preparation of imprinted monolithic columns. This article mainly focuses on how different pore-forming agents affect the zinc ion enrichment performance of a zinc ion imprinted monolithic column.

Keywords: high performance liquid chromatography (HPLC), ionic imprinting, monolithic column, pore-forming agent

Procedia PDF Downloads 202
1537 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future

Authors: Gabriel Wainer

Abstract:

Modeling and Simulation (M&S) methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. M&S advanced thanks to improvements in computer technology, which, in many cases, resulted in the development of simulation software using ad-hoc techniques. Formal M&S appeared in order to improve the development of very complex simulation systems. Some of these techniques proved successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing application development tasks, reducing costs and favoring reuse. The DEVS formalism is one of these techniques, which proved successful in providing means for modeling while reducing development complexity and costs. DEVS model development is based on a sound theoretical framework. The independence of M&S tasks made it possible to run DEVS models in different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective of discrete-event M&S methodologies, showing different modeling techniques. We will introduce the origins and general ideas of DEVS and compare it with some of these techniques. We will then show the current status of DEVS M&S, and we will discuss a technological perspective to solve current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show some examples of the current use of DEVS, including applications in different fields. We will finally discuss current open topics in the area, which include advanced methods for centralized, parallel or distributed simulation, the need for real-time modeling techniques, and our view of these fields.
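
A minimal sketch of the DEVS idea (an atomic model defined by internal and external transition functions, an output function, and a time-advance function) is given below in Python; the class layout and the trivial simulation loop are illustrative and not tied to any particular DEVS tool.

    INFINITY = float("inf")

    class GeneratorDEVS:
        """Atomic DEVS model: emits a 'job' event every `period` time units."""

        def __init__(self, period=5.0):
            self.period = period
            self.phase = "active"          # state

        def time_advance(self):
            # ta(s): how long the model stays in the current state.
            return self.period if self.phase == "active" else INFINITY

        def output(self):
            # lambda(s): output produced just before an internal transition.
            return "job"

        def internal_transition(self):
            # delta_int(s): state change when the time advance expires.
            self.phase = "active"

        def external_transition(self, elapsed, event):
            # delta_ext(s, e, x): reaction to an input event (e.g., a stop command).
            if event == "stop":
                self.phase = "passive"

    # A trivial simulation loop for a single atomic model (no coupled models).
    model, t = GeneratorDEVS(period=5.0), 0.0
    for _ in range(3):
        t += model.time_advance()
        print(f"t={t}: output {model.output()}")
        model.internal_transition()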

Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation

Procedia PDF Downloads 309
1536 Design and Development of a Mechanical Force Gauge for the Square Watermelon Mold

Authors: Morteza Malek Yarand, Hadi Saebi Monfared

Abstract:

This study aimed at designing and developing a mechanical force gauge for the square watermelon mold for the first time. It also introduces the characteristics of the square watermelon and its production limitations. The performance of the mechanical force gauge and the product itself are also described. There are three main gauge designs: a. a hydraulic gauge, b. a strain gauge, and c. a mechanical gauge. The advantage of the hydraulic model is that it instantly displays the pressure, and thus the force, exerted by the melon. However, considering its inability to measure forces in all directions, its complicated development, high cost, possible hydraulic fluid leaks into the fruit chamber and the possible influence of increased ambient temperature on the fluid pressure, the development of this gauge was ruled out. The second choice was to calculate pressure from the direct force using a strain gauge. The main advantage of strain gauges over spring types is their high measurement precision; but because the working range of the strain gauge did not match watermelon growth, the calculations ran into problems. Finally, the mechanical pressure gauge has several advantages, including the ability to measure forces and pressures on the mold surface during melon growth; the ability to display the peak forces; the ability to produce a melon growth graph thanks to its continuous force measurements; the conformity of its manufacturing materials with the physical conditions required for melon growth; high air circulation capability; the ability to let sunlight reach the melon rind (no yellowish skin and quality loss); fast and straightforward calibration; no damage to the product during assembly and disassembly; visual checking of the product within the mold; applicability to all growth environments (field, greenhouses, etc.); a simple process; low costs; and so forth.

Keywords: mechanical force gauge, mold, reshaped fruit, square watermelon

Procedia PDF Downloads 262
1535 Study of Mobile Game Addiction Using Electroencephalography Data Analysis

Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez

Abstract:

The use of mobile phones has increased considerably over the past decade. Currently, they are one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones are being used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems among many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine whether there is any distinction between mobile game addicted and non-addicted players with the use of electroencephalography (EEG), based upon psycho-physiological indicators. The mobile players were asked to play a mobile game, and EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions. The parietal lobe is involved in perception, understanding logic, and arithmetic. The occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From the survey based on the CGS manual study 2010, it was concluded that five participants out of fifteen were in the addicted category. This was used as prior information to group the addicted and non-addicted players in the physiological analysis. Statistical analysis showed that, by applying a clustering analysis technique, the authors were able to categorize the addicted and non-addicted players, specifically in the theta frequency range of the occipital area.
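
A rough sketch of the kind of analysis described (band power in the theta range at the occipital electrodes followed by clustering) is given below in Python with SciPy and scikit-learn; the signal array, channel layout and band limits are assumptions for illustration, not the recorded data.

    import numpy as np
    from scipy.signal import welch
    from sklearn.cluster import KMeans

    fs = 200                              # sampling rate used in the study (samples/s)
    n_subjects, n_samples = 15, 60 * fs   # 60-second analysis window

    # Hypothetical occipital (O1, O2) recordings: shape (subjects, channels, samples).
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((n_subjects, 2, n_samples))

    def theta_power(signal, fs, band=(4.0, 8.0)):
        # Mean power spectral density in the theta band via Welch's method.
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    # One feature per subject: theta power averaged over O1 and O2.
    features = np.array([
        [np.mean([theta_power(subj[ch], fs) for ch in range(2)])]
        for subj in eeg
    ])

    # Unsupervised grouping into two clusters (candidate addicted / non-addicted).
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    print(labels)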

Keywords: mobile game, addiction, psycho-physiology, EEG analysis

Procedia PDF Downloads 149
1534 Design and Assessment of Traffic Management Strategies for Improved Mobility on Major Arterial Roads in Lahore City

Authors: N. Ali, S. Nakayama, H. Yamaguchi, M. Nadeem

Abstract:

Traffic congestion is a matter of prime concern in developing countries. This can be primarily attributed to poor design practices and the biased allocation of resources based on political will, neglecting technical feasibility in infrastructure design. During the last decade, Lahore expanded at an unprecedented rate compared to surrounding cities due to greater funding and resource allocation by previous governments. As a result, people from surrounding cities and areas moved to Lahore for better opportunities and quality of life. This migration inflow left the city with an increased population, and the existing infrastructure could not accommodate the enhanced traffic demand. This leads to traffic congestion on the major arterial roads of the city. In this simulation study, a major arterial road was selected to evaluate the performance of five intersections by changing the geometry of the intersections or the signal control type. Simulations were done in two software packages: Highway Capacity Software (HCS) and Synchro Studio with SimTraffic. Some of the traffic management strategies that were employed include actuated signal control, semi-actuated signal control, fixed-time signal control, and a roundabout. For each intersection, the most feasible of the above-mentioned traffic management techniques was selected, based on the least delay time (seconds) and improved Level of Service (LOS). The results showed that the Jinnah Hospital Intersection and Akbar Chowk Intersection improved by 92.97% and 92.67% in delay time reduction, respectively. These results can be used by traffic planners and policy makers in deciding on the expansion of these intersections, keeping in mind the traffic demand in future years.

Keywords: traffic congestion, traffic simulation, traffic management, congestion problems

Procedia PDF Downloads 458
1533 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients

Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori

Abstract:

Introduction: Data mining is defined as a process to find patterns and relationships in the data in a database in order to build predictive models. Applications of data mining have extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases. This method applies various techniques and algorithms which have different accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques to the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps, and decisions must be made by the user; it starts by creating an understanding of the scope and application of previous knowledge in this area and identifying the knowledge discovery process from the point of view of the stakeholders, and it finishes by acting on the discovered knowledge: applying the knowledge, integrating it with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree and CN2 algorithms and of related similar studies were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that we can accurately diagnose asthma, approximately ninety percent of the time, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity compared to expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should be the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms.
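
As a brief illustration of how the listed classifiers could be compared on sensitivity, specificity and accuracy, the following scikit-learn sketch uses a synthetic binary data set in place of the asthma records (which are not available here); the feature matrix and labels are placeholders.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import confusion_matrix, roc_auc_score

    # Placeholder data standing in for patient symptoms/history (X) and
    # asthma diagnosis (y).
    X, y = make_classification(n_samples=500, n_features=12, random_state=42)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

    models = {
        "KNN": KNeighborsClassifier(),
        "SVM": SVC(probability=True),
        "Naive Bayes": GaussianNB(),
    }

    for name, model in models.items():
        model.fit(X_tr, y_tr)
        y_pred = model.predict(X_te)
        tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
        sensitivity = tp / (tp + fn)          # true positive rate
        specificity = tn / (tn + fp)          # true negative rate
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: Se={sensitivity:.2f} Sp={specificity:.2f} "
              f"Acc={accuracy:.2f} AUC={auc:.2f}")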

Keywords: asthma, data mining, classification, machine learning

Procedia PDF Downloads 433
1532 Studies on Space-Based Laser Targeting System for the Removal of Orbital Space Debris

Authors: Krima M. Rohela, Raja Sabarinath Sundaralingam

Abstract:

Humans have been launching rockets since the beginning of the space age in the late 1950s. We have come a long way since then, and the success rate of rocket launches has increased considerably. With every successful launch, a large amount of junk or debris is released into the upper layers of the atmosphere. Space debris has been a huge concern for a very long time. It includes the rocket shells released during launch and the parts of defunct satellites. Some of this junk falls back towards the Earth and burns up in the atmosphere. But most of the junk goes into orbit around the Earth, where it remains for at least 100 years. This can cause a lot of problems for other functioning satellites and may affect future manned missions to space. The main concern about space debris is the increase in space activities, which leads to risks of collisions if it is not taken care of soon. These collisions may result in what is known as the Kessler Syndrome. This debris can be removed by a space-based laser targeting system; hence, the matter is investigated and discussed. The first step involves launching a satellite with a high-power laser device into space, above the debris belt. The target material is then ablated with a focused laser beam. This step of the process is highly dependent on the attitude and orientation of the debris with respect to the Earth and the device. The laser beam causes a jet of vapour and plasma to be expelled from the material. A force is therefore applied in the opposite direction and, in accordance with Newton's third law of motion, this causes the material to move towards the Earth and be pulled down by gravity, where it disintegrates in the upper layers of the atmosphere. The larger pieces of debris can be directed towards the oceans. This method of removal of orbital debris will enable safer passage for future human-crewed missions into space.

Keywords: altitude, Kessler syndrome, laser ablation, Newton’s third law of motion, satellites, Space debris

Procedia PDF Downloads 133
1531 On-Line Super Critical Fluid Extraction, Supercritical Fluid Chromatography, Mass Spectrometry, a Technique in Pharmaceutical Analysis

Authors: Narayana Murthy Akurathi, Vijaya Lakshmi Marella

Abstract:

The literature is reviewed with regard to on-line supercritical fluid extraction (SFE) coupled directly with supercritical fluid chromatography (SFC)-mass spectrometry, which is typically more sensitive than conventional LC-MS/MS and GC-MS/MS. It is becoming increasingly interesting to use on-line techniques that combine sample preparation, separation and detection in one analytical setup. This requires less human intervention, uses small amounts of sample and organic solvent, and yields enhanced analyte enrichment in a shorter time. The sample extraction is performed under light shielding and anaerobic conditions, preventing the degradation of thermolabile analytes. The technique may be able to analyze compounds over a wide polarity range, as SFC generally uses carbon dioxide, which is collected as a by-product of other chemical reactions or from the atmosphere, and thus contributes no new chemicals to the environment. The diffusion of solutes in supercritical fluids is about ten times greater than in liquids and about three times less than in gases, which decreases the resistance to mass transfer in the column and allows fast, high-resolution separations. The drawback of SFC when using carbon dioxide as the mobile phase is that the direct introduction of water samples poses a series of problems; water must therefore be eliminated before it reaches the analytical column. Hundreds of compounds can be analysed simultaneously simply by enclosing the sample in an extraction vessel. This is mainly applicable to the pharmaceutical industry, where the technique can analyse fatty acids and phospholipids that have many analogues with very similar UV spectra, trace additives in polymers, and hundreds of pesticides with good resolution; cleaning validation can also be conducted by placing a swab sample in an extraction vessel.

Keywords: supercritical fluid extraction (SFE), supercritical fluid chromatography (SFC), LC-MS/MS, GC-MS/MS

Procedia PDF Downloads 377
1530 Mental Accounting Theory Development Review and Application

Authors: Kang-Hsien Li

Abstract:

Extending from the mental accounting of prospect theory, and in line with global industries' use of technology to bring research closer to people's actual behavior and to produce data analysis, this paper discusses marketing and financial applications of mental accounting and explores directions for the future. For the foreseeable future, payment behavior will depend on the form of currency, which affects how a variety of product types are marketed; marketing strategies therefore provide diverse payment methods to enhance overall sales performance. This affects not only people's consumption but also their investments. Credit cards, PayPal, Apple Pay, Bitcoin and other emerging payment instruments enabled by advances in technology have begun to change people's perception of the value and concept of money. This is relevant, for example, to the planning of national social welfare policies and to monetary and financial regulators. The discussion can be expected to cover marketing- and finance-related mental accounting problems at the same time. Recent studies reflect two different ideas: the first is that individuals are affected by situational frames rather than by broad effects at the event level, being driven basically by individual mental processes; the second is that when an event affects a broader range of people, the majority will make the same choice, the rational one. These ideas have practical applications in marketing and, at the same time, provide an explanation for anomalies in financial markets; because financial markets have varied investment products and different market participants, they also highlight these two points. This provides an in-depth description of human mental processes. Certainly, as artificial intelligence applications develop, people may be able to reduce biased decisions when considering mental accounting aspects, but this will also lead to more discussion of economic and marketing strategy.

Keywords: mental accounting, behavior economics, consumer behaviors, decision-making

Procedia PDF Downloads 441
1529 Environmental Accounting Practice: Analyzing the Extent and Qualification of Environmental Disclosures of Turkish Companies Located in BIST-XKURY Index

Authors: Raif Parlakkaya, Mustafa Nihat Demirci, Mehmet Nuri Salur

Abstract:

Environmental pollution has detrimental effects on the quality of our life, and its scope has reached such an extent that measures are being taken at both the national and international levels to reduce, prevent and mitigate its impact on the social, economic and political spheres. Therefore, awareness of environmental problems has been increasing among stakeholders and, accordingly, among companies. Corporate reporting is seen to be expanding beyond environmental performance. The primary purpose of publishing an environmental report is to provide specific audiences with useful, meaningful information. This paper is intended to analyze the extent and qualification of the environmental disclosures of Turkish publicly quoted firms and to see how they vary from one sector to another. The data for the study were collected from the annual activity reports of companies listed on the corporate governance index (BIST-XKURY) of the Istanbul Stock Exchange. Content analysis was the research methodology used to measure the extent of environmental disclosure. Accordingly, the 2015 annual activity reports of companies that carry out business in particular fields were acquired from the Capital Market Board, the Public Disclosure Platform website, and the companies' own websites. These reports were categorized into five main aspects: environmental policies, environmental management systems, environmental protection and conservation activities, environmental awareness, and information on environmental lawsuits. Subsequently, each component was divided into several variables related to what each firm is supposed to disclose about environmental information. In this context, the nature and scope of the information disclosed on each item were assessed in five ways (N.I.: No Information; G.E.: General Explanations; Q.E.: Qualitative Detailed Explanations; N.E.: Quantitative (Numerical) Detailed Explanations; Q.&N.E.: Both Qualitative and Quantitative Explanations).

Keywords: environmental accounting, disclosure, corporate governance, content analysis

Procedia PDF Downloads 246
1528 Dynamics of Soil Fertility Management in India: An Empirical Analysis

Authors: B. Suresh Reddy

Abstract:

The overdependence on chemical fertilizers for nutrient management in crop production over the last few decades has led to several problems affecting soil health, the environment and farmers themselves. Based on field work done in 2012-13 with 1080 farmers of different size-classes in semi-arid regions of the Uttar Pradesh, Jharkhand and Madhya Pradesh states of India, this paper reveals that farmers in semi-arid regions of India are actively managing soil fertility and other soil properties through a wide range of practices that are based on local resources and knowledge. It also highlights the socio-economic web woven around these soil fertility management practices. This study highlights the contribution of organic matter from traditional soil fertility management practices to maintaining soil health. Livestock has a profound influence on soil fertility enhancement through the supply of organic manure. The empirical data of this study clearly reveal how farmers' soil fertility management options are being undermined by government policies that give more priority to chemical fertiliser-based strategies. Based on the findings, it is argued that there should be a 'level playing field' for both organic and inorganic soil fertility management methods, by promoting and supporting farmers in using organic methods. There is a need to provide credit to farmers for adopting the soil fertility management methods that suit their socio-economic conditions and that best support the long-term productivity of soils. The study suggests that government policies related to soil fertility management must be enabling, creating the conditions for development based more on locally available resources and local skills and knowledge. This will not only keep Indian soils in healthy condition but also support the livelihoods of millions of people, especially small and marginal farmers.

Keywords: livestock, organic matter, small farmers, soil fertility

Procedia PDF Downloads 157
1527 The Prevalence and Associated Factors of Frailty and Its Relationship with Falls in Patients with Schizophrenia

Authors: Bo-Jian Wu, Si-Heng Wu

Abstract:

Objectives: Frailty is a condition of a person who has chronic health problems complicated by a loss of physiological reserve and deteriorating functional abilities. The frailty syndrome was defined by Fried and colleagues as weight loss, fatigue, decreased grip strength, slow gait speed, and low physical activity. However, to the best of our knowledge, few studies have explored the prevalence of frailty and its association with falls in patients with schizophrenia. Methods: A total of 559 hospitalized patients were recruited from a public psychiatric hospital in 2013. The majority of the subjects were males (361, 64.6%). The average age was 53.5 years. All patients received an assessment of frailty status as defined by Fried and colleagues. Fall status within one year after the frailty assessment, together with clinical and demographic data, was collected from medical records. Logistic regression was used to calculate the odds ratios of associated factors. Results: A total of 9.2% of the participants met the criteria for frailty. The percentage of patients having a fall was 7.2%. Age was significantly associated with frailty (odds ratio = 1.057, 95% confidence interval = 1.025-1.091); however, sex was not associated with frailty (p = 0.17). After adjustment for age and sex, frailty status was associated with a fall (odds ratio = 3.62, 95% confidence interval = 1.58-8.28). Concerning the components of frailty, decreased grip strength (odds ratio = 2.44, 95% confidence interval = 1.16-5.14), slow gait speed (odds ratio = 2.82, 95% confidence interval = 1.21-6.53), and low physical activity (odds ratio = 2.64, 95% confidence interval = 1.21-5.78) were found to be associated with a fall. Conclusions: Our findings suggest that the prevalence of frailty was about 10% in hospitalized patients with chronic schizophrenia, and frailty status was significantly associated with a fall in this group. Using frailty status, it may be possible to identify candidates at risk of falling as early as possible, so that effective interventions to prevent further falls can be given in advance. Our results bridge this gap and open a potential avenue for the prevention of falls in patients with schizophrenia. Frailty is certainly an important factor for maintaining wellbeing among these patients.

Keywords: fall, frailty, schizophrenia, Taiwan

Procedia PDF Downloads 142
1526 Development and Validation of Work Movement Task Analysis: Part 1

Authors: Mohd Zubairy Bin Shamsudin

Abstract:

Work-related musculoskeletal disorders (WMSDs) are one of the occupational health problems encountered by workers over the world. In Malaysia, there has been an increasing trend over the years, particularly in the manufacturing sectors. Current methods to observe workplace WMSDs are self-report questionnaires, observation and direct measurement. The observational method is most frequently used by researchers and practitioners because it is simple, quick and versatile when applied at the worksite. However, some limitations have been identified, e.g., some approaches do not cover a wide spectrum of biomechanical activity and are not sufficiently sensitive to assess the actual risks. This paper elucidates the development of the Work Movement Task Analysis (WMTA), an observational tool for industrial practitioners, especially untrained personnel, to assess WMSD risk factors and provide a basis for suitable intervention. The first stage of the development protocol involved literature reviews, a practitioner survey, and tool validation and reliability testing. A total of six themes/comments were received in the face validity stage. The new revision of the WMTA consisted of four sections covering posture (neck, back, shoulder, arms, and legs) and associated risk factors: movement, load, coupling and basic environmental factors (lighting, noise, odour, heat and slippery floors). The inter-rater reliability study showed substantial agreement among raters, with K = 0.70. Meanwhile, WMTA validation showed a significant association between WMTA scores and self-reported pain or discomfort for the back, shoulder & arms and knee & legs, with p < 0.05. This tool is expected to provide a new workplace ergonomic observational tool for assessing WMSDs in the next stage of the case study.

Keywords: assessment, biomechanics, musculoskeletal disorders, observational tools

Procedia PDF Downloads 454
1525 Predicting Intention and Readiness to Alcohol Consumption Reduction and Cessation among Thai Teenagers Using Scales Based on the Theory of Planned Behavior

Authors: Rewadee Watakakosol, Arunya Tuicomepee, Panrapee Suttiwan, Sakkaphat T. Ngamake

Abstract:

Health problems caused by alcohol consumption not only have short-term effects at the time of drinking but also leave long-lasting health conditions. Teenagers who start drinking in their middle-school or high-school years, or before entering college, have a higher likelihood of increasing their alcohol use and abuse, and they were found to be less healthy than their non-drinking peers when entering adulthood. This study aimed to examine factors that predict the intention and readiness to reduce and quit alcohol consumption among Thai teenagers. Participants were 826 high-school and vocational-school students, most of whom were female (64.4%), with an average age of 16.4 (SD = 0.9) and an average age of first drinking of 13.7 (SD = 2.2). Instruments included scales developed based on the Theory of Planned Behavior framework: the Attitude toward Alcohol Reduction and Cessation Scale, the Normative Group and Influence Scale, the Perceived Behavioral Control toward Alcohol Reduction and Cessation Scale, the Behavioral Intent toward Alcohol Reduction and Cessation Scale, and the Readiness to Reduce and Quit Alcohol Consumption Scale. Findings revealed that readiness to reduce/quit alcohol was the most powerful predictive factor (β = .53, p < .01), followed by the attitude of easiness in alcohol reduction and cessation (β = .46, p < .01), perceived behavioral control toward alcohol reduction and cessation (β = .41, p < .01), normative group and influence (β = .15, p < .01), and the attitude of being accepted from alcohol reduction and cessation (β = -.12, p < .01), respectively. The attitude of improved health after alcohol reduction and cessation did not show statistically significant predictive power. Together, the factors significantly predicted teenagers' alcohol reduction and cessation behavior and accounted for 59 percent of the total variance in alcohol consumption reduction and cessation.

Keywords: alcohol consumption reduction and cessation, intention, readiness to change, Thai teenagers

Procedia PDF Downloads 315
1524 The Effects of Water Fraction and Salinity on Crude Oil-Water Dispersions

Authors: Ramin Dabirian, Yi Zhang, Ilias Gavrielatos, Ram Mohan, Ovadia Shoham

Abstract:

Oil-water emulsions can be found in almost every part of the petroleum industry, namely in reservoir rocks, drilling cuttings circulation, production wells, transportation pipelines, surface facilities and the refining process. It is necessary for oil production and refinery engineers to resolve petroleum emulsion problems and to eliminate contaminants in order to meet environmental standards, achieve the desired product quality and improve equipment reliability and efficiency. A state-of-the-art Dispersion Characterization Rig (DCR) has been utilized to investigate crude oil-distilled water dispersion separation. Over 80 experimental tests were run to investigate the flow behavior and stability of the dispersions. The experimental conditions include the effects of water cut (25%, 50% and 75%), NaCl concentration (0, 3.5% and 18%), mixture flow velocity (0.89 and 1.71 ft/s), and orifice plate type on the separation rate. The experimental data demonstrate that the water cut significantly affects the separation time and efficiency. Dispersions with a lower water cut take longer to separate and have lower separation efficiency. Medium and lower water cuts result in the formation of a mousse emulsion, and phase inversion happens around the medium water cut. The data also confirm that increasing the NaCl concentration in the aqueous phase can increase the crude oil-water dispersion separation efficiency, especially at higher salinities. The separation profile for dispersions with lower salt concentrations has a lower sedimentation rate slope before the inflection point. Dispersions in all tests with higher salt concentrations have a larger sedimentation rate. The presence of NaCl can influence the interfacial tension gradients along the interface, and it plays a role in avoiding mousse emulsion formation.

Keywords: oil-water dispersion, separation mechanism, phase inversion, emulsion formation

Procedia PDF Downloads 170
1523 Review on Implementation of Artificial Intelligence and Machine Learning for Controlling Traffic and Avoiding Accidents

Authors: Neha Singh, Shristi Singh

Abstract:

Accidents involving motor vehicles are likely to cause serious injuries and fatalities. They also bring a host of other persistent issues, such as the regular loss of life and goods. To solve these issues, appropriate measures must be implemented, such as establishing an autonomous incident detection system that makes use of machine learning and artificial intelligence. This article gives an overview of artificial intelligence and machine learning in autonomous incident detection systems aimed at reducing traffic accidents. The paper explores the major issues, prospective solutions, and use of artificial intelligence and machine learning in road transportation systems for minimising traffic accidents. There is extensive discussion of additional, fresh, and developing approaches that reduce the frequency of accidents in the transportation industry. The study is structured around the following subtopics: traffic management using machine learning and artificial intelligence, and an incident detector built with these two technologies. The internet of vehicles and vehicular ad hoc networks, the use of wireless communication technologies such as 5G networks, and the use of machine learning and artificial intelligence for the planning of road transportation systems are elaborated. In addition, safety is the primary concern of road transportation. Route optimization, cargo volume forecasting, predictive fleet maintenance, real-time vehicle tracking, and traffic management, according to the review's key conclusions, are essential for ensuring the safety of road transportation networks. In addition to highlighting research trends, unanswered problems, and key research conclusions, the study also discusses the difficulties in applying artificial intelligence to road transport systems. The work can be used as a resource for planning and managing road transportation systems.

Keywords: artificial intelligence, machine learning, incident detector, road transport systems, traffic management, automatic incident detection, deep learning

Procedia PDF Downloads 86
1522 Optimisation of Dyes Decolourisation by Bacillus aryabhattai

Authors: A. Paz, S. Cortés Diéguez, J. M. Cruz, A. B. Moldes, J. M. Domínguez

Abstract:

Synthetic dyes are extensively used in the paper, food, leather, cosmetics, pharmaceutical and textile industries. Wastewater resulting from their production causes several environmental problems. Improper disposal of these effluents has adverse impacts not only on colour but also on water quality (Total Organic Carbon, Biological Oxygen Demand, Chemical Oxygen Demand, suspended solids, salinity, etc.), on flora (inhibition of photosynthetic activity), on fauna (toxic, carcinogenic and mutagenic effects) and on human health. The aim of this work is to optimize the decolourisation of different types of dyes by Bacillus aryabhattai. Initially, different types of dyes (Indigo Carmine, Coomassie Brilliant Blue and Remazol Brilliant Blue R) and suitable culture media (Nutritive Broth, Luria Bertani Broth and Trypticasein Soy Broth) were selected. Then, a central composite design (CCD) was employed to optimise the process and analyse the significance of each abiotic parameter. Three process variables (temperature, salt concentration and agitation) were investigated in the CCD at 3 levels with 2 star points. A total of 23 experiments were carried out according to a full factorial design, consisting of 8 factorial experiments (coded to the usual ±1 notation), 6 axial experiments (on the axes at a distance of ±α from the centre), and 9 replicates (at the centre of the experimental domain). The experimental results show that this strain efficiently removes the tested dyes in the 3 media studied, although Trypticasein Soy Broth (TSB) was the most suitable medium. Indigo Carmine and Coomassie Brilliant Blue at the maximum tested concentration of 150 mg/l were completely decolourised, while acceptable removal was observed for the more difficult dye Remazol Brilliant Blue R at a concentration of 50 mg/l.
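
The run structure described above (8 factorial points, 6 axial points at ±α, and 9 centre replicates for three variables) can be written out explicitly; the short Python sketch below builds that 23-run matrix in coded units. The factor labels are only placeholders, and α is taken as the rotatable value 2^(3/4) ≈ 1.682 as an assumption, since the abstract does not state it.

    import itertools
    import numpy as np

    factors = ["temperature", "NaCl", "agitation"]
    alpha = 2 ** (3 / 4)          # rotatable alpha for k = 3 factors (assumed)

    # 2^3 factorial points coded to the usual +/-1 notation.
    factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

    # 6 axial (star) points at +/-alpha on each axis.
    axial = []
    for i in range(3):
        for sign in (-alpha, alpha):
            point = [0.0, 0.0, 0.0]
            point[i] = sign
            axial.append(point)
    axial = np.array(axial)

    # 9 replicates at the centre of the experimental domain.
    center = np.zeros((9, 3))

    design = np.vstack([factorial, axial, center])
    print(design.shape)           # (23, 3) -> the 23 CCD runs in coded units
    for run, point in enumerate(design, start=1):
        print(run, dict(zip(factors, point)))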

Keywords: Bacillus aryabhattai, dyes, decolourisation, central composite design

Procedia PDF Downloads 207
1521 Climate Variability and Its Impacts on Rice (Oryza sativa) Productivity in Dass Local Government Area of Bauchi State, Nigeria

Authors: Auwal Garba, Rabiu Maijama’a, Abdullahi Muhammad Jalam

Abstract:

Variability in climate has affected agricultural production all over the globe. This concern has motivated important changes in the field of research during the last decade. Climate variability is believed to have a declining effect on rice production in Nigeria. This study examined climate variability and its impact on rice productivity in Dass Local Government Area, Bauchi State, by employing a Linear Trend Model (LTM), analysis of variance (ANOVA) and regression analysis. Annual seasonal data on the climatic variables temperature (minimum and maximum), rainfall, and solar radiation from 1990 to 2015 were used. The results confirmed that 74.4% of the total variation in rice yield in the study area was explained by changes in the independent variables; that is, minimum and maximum temperature, rainfall, and solar radiation together explained 74.4% of rice yield in the study area. A rising mean maximum temperature would lead to a reduction in rice production, while a moderate increase in mean minimum temperature would be advantageous for rice production; a persistent rise in the mean maximum temperature will, in the long run, affect rice production even more negatively in the future. It is therefore important to promote agro-meteorological advisory services, which will be useful for farm planning and yield sustainability. Closer collaboration between meteorologists and agricultural scientists is needed to increase awareness of the existing databases and crop-weather models, among others, with a view to reaping the full benefits of research on specific problems and sustainable yield management. There should also be a special initiative by the ADPs (State Agricultural Development Programmes) to promote best agricultural practices that are resilient to climate variability, for rice production and yield sustainability.

Keywords: climate variability, impact, productivity, rice

Procedia PDF Downloads 87
1520 Nitrogen Fixation in Hare Gastrointestinal Tract

Authors: Tatiana A. Kuznetsova, Maxim V. Vechersky, Natalia V. Kostina, Marat M. Umarov, Elena I. Naumova

Abstract:

One of the main problems in the nutrition of phytophagous animals is the insufficiency of protein in their forage. Usually, symbiotic microorganisms contribute substantially to both the carbohydrate and the nitrogen compounds of the food. However, it is not easy for animals with hindgut fermentation to utilize the microbial biomass in the large intestine and caecum. Therefore, some animals, including hares, developed a special mechanism for utilizing such biomass: obligate autocoprophagy, or reingestion. Hares have two types of feces, hard and soft. Hard feces are excreted at night, while hares are active (the "foraging period"), and the soft ones (caecotrophs) are produced and reingested in the daytime during the hares' "resting period". We examine the role of microbial digestion in providing nitrogen nutrition for the hare (Lepus europaeus). We determined the nitrogen-fixing ability in the fornix and body of the stomach, the small intestine, the caecum and the colon of the hares' gastrointestinal tract in the two main periods of hare activity: the "resting period" (daytime) and the "foraging period" (late evening and very early morning). We used gas chromatography to measure nitrogen-fixing activity (acetylene reduction). Nitrogen-fixing activity was detected in the contents of all analyzed parts of the gastrointestinal tract. Maximum values were recorded in the large intestine. Daily dynamics of the process were also detected. Thus, during the hare "resting period" (caecotroph formation), N2-fixing activity was significantly higher than during the "foraging period", reaching 0.3 nmol C2H4/g·h. The N2-fixing activity in the gastrointestinal tract indicates a significant contribution of nitrogen fixers to microbial digestion in the hare and confirms the importance of coprophagy as a nitrogen source in lagomorphs.

Keywords: coprophagy, gastrointestinal tract, lagomorphs, nitrogen fixation

Procedia PDF Downloads 345
1519 Topology Enhancement of a Straight Fin Using a Porous Media Computational Fluid Dynamics Simulation Approach

Authors: S. Wakim, M. Nemer, B. Zeghondy, B. Ghannam, C. Bouallou

Abstract:

Designing the optimal heat exchanger is still an essential objective to be achieved. Parametric optimization involves the evaluation of heat exchanger dimensions to find those that best satisfy certain objectives. This method contributes to an enhanced design rather than an optimized one. By contrast, topology optimization finds the optimal structure that satisfies the design objectives. The huge development in metal additive manufacturing has allowed topology optimization to find its way into engineering applications, especially in the aerospace field, to optimize metal structures. Using topology optimization in 3D heat and mass transfer problems requires huge computational time, so coupling it with CFD simulations can reduce this time. However, existing CFD models cannot be coupled with topology optimization. The CFD model must allow the creation of a uniform mesh despite the complexity of the initial geometry and must also allow cells to be swapped from fluid to solid and vice versa. In this paper, a porous media approach compatible with topology optimization criteria is developed. It consists of modeling the fluid region of the heat exchanger as a porous medium with high porosity and, similarly, the solid region as a porous medium with low porosity. The switching from fluid to solid cells required by topology optimization is simply done by changing each cell's porosity using a user-defined function. This model is tested on a plate and fin heat exchanger and validated by comparing its results to experimental data and simulation results. Furthermore, the model is used to perform a material reallocation based on local criteria to optimize a plate and fin heat exchanger under a constant heat duty constraint. The optimized fin uses 20% less material than the original, while the pressure drop is reduced by about 13%.
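
A toy illustration of the porosity-switching idea is sketched below in Python: each cell of a small 2D grid carries a porosity that is either high (treated as fluid) or low (treated as solid), flow resistance is interpolated from porosity, and cells are swapped according to a local criterion. The grid, the resistance law and the swap rule are illustrative assumptions, not the authors' CFD implementation.

    import numpy as np

    nx, ny = 20, 10
    eps_fluid, eps_solid = 0.99, 0.01      # high / low porosity values

    # Start with an all-fluid design domain.
    porosity = np.full((ny, nx), eps_fluid)

    def flow_resistance(eps, k_max=1e5):
        # Brinkman-type penalisation: low porosity -> large resistance (solid).
        return k_max * (1.0 - eps) / (eps + 1e-12)

    # Stand-in for a local design criterion provided by the CFD solution,
    # e.g. low local heat transfer contribution (here just a made-up field).
    rng = np.random.default_rng(1)
    local_criterion = rng.random((ny, nx))

    # Material reallocation: swap the 20% worst-performing fluid cells to solid,
    # mimicking the user-defined function that changes each cell's porosity.
    threshold = np.quantile(local_criterion, 0.2)
    porosity[local_criterion < threshold] = eps_solid

    print("solid cells:", int(np.sum(porosity == eps_solid)), "of", porosity.size)
    print("max resistance:", flow_resistance(eps_solid))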

Keywords: computational methods, finite element method, heat exchanger, porous media, topology optimization

Procedia PDF Downloads 142
1518 Mike Hat: Coloured-Tape-in-Hat as a Head Circumference Measuring Instrument for Early Detection of Hydrocephalus in an Infant

Authors: Nyimas Annissa Mutiara Andini

Abstract:

Every year, children develop hydrocephalus during the first year of life. If it is not treated, hydrocephalus can lead to brain damage, a loss in mental and physical abilities, and even death. To treat it, we first have to make a proper diagnosis using examinations aimed especially at detecting hydrocephalus early. One of the examinations that can be done is a head circumference measurement. Increased head circumference is a first and main sign of hydrocephalus, especially in infants (0-1 year of age). Head circumference is a measurement around the largest part of a child's head: the distance above the eyebrows and ears and around the back of the head, taken with a measuring tape. If an infant's head circumference is larger than normal, the infant may potentially suffer from hydrocephalus. If hydrocephalus is diagnosed early and treated in time, most children can recover successfully. There are some problems with early detection of hydrocephalus using a regular tape for head circumference measurement. One of the problems is the infant's comfort: we need to make the infant feel comfortable throughout the head circumference measurement to get a proper result. For that, we can use something helpful, like a hat. This paper aims to describe the possibility of using a mike hat, a coloured-tape-in-hat, as a head circumference measuring instrument for early detection of hydrocephalus in an infant. At birth, an infant's head circumference is about 35 centimeters. In the first three months after that, infants gain 2 centimeters each month. In the second three months, the infant's head circumference increases by 1 cm each month. For the six months after that, the rate is 0.5 cm per month, ending up at an average of 47 centimeters. This formula is compared to the WHO's head circumference growth chart. The shape of this tape-in-hat is like that of an upper-arm measuring tape. The tape-in-hat measures about 47 centimeters around. It contains twelve different colours, ranged by age. If the measurement falls outside the normal colour, the infant may potentially suffer from hydrocephalus. This examination should be done monthly. If two consecutive measurements remain in the abnormal range, or the head circumference grows rapidly, the infant should be referred to a pediatrician. There is a pink hat for girls and a blue hat for boys. Based on this paper, we know that this measurement can be used to help with the early detection of hydrocephalus in an infant.
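
The growth schedule quoted above (35 cm at birth, +2 cm/month for months 1-3, +1 cm/month for months 4-6, +0.5 cm/month for months 7-12) can be turned into a simple check; the Python sketch below computes the expected circumference for a given age and flags a measurement outside an illustrative tolerance band (the ±2 cm band is an assumption, not a WHO cut-off).

    def expected_head_circumference(age_months):
        """Expected head circumference (cm) for a 0-12 month old infant,
        following the schedule quoted in the abstract."""
        hc = 35.0                                   # at birth
        for month in range(1, int(age_months) + 1):
            if month <= 3:
                hc += 2.0                           # months 1-3: +2 cm/month
            elif month <= 6:
                hc += 1.0                           # months 4-6: +1 cm/month
            else:
                hc += 0.5                           # months 7-12: +0.5 cm/month
        return hc

    def flag_measurement(age_months, measured_cm, tolerance_cm=2.0):
        """Return True if the measured circumference is outside the band."""
        expected = expected_head_circumference(age_months)
        return abs(measured_cm - expected) > tolerance_cm

    # Example: a 6-month-old measuring 47 cm (expected 44 cm) would be flagged.
    print(expected_head_circumference(6))     # 44.0
    print(expected_head_circumference(12))    # 47.0
    print(flag_measurement(6, 47.0))          # True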

Keywords: head circumference, hydrocephalus, infant, mike hat

Procedia PDF Downloads 254
1517 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge

Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi

Abstract:

Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and an adequate intervention will most probably reduce future maintenance costs, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is still in operation despite experiencing obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises clustering analysis and statistical hypothesis testing. These make it possible to interpret the obtained prediction errors, draw conclusions about the state of the structure and thus support decision making regarding its maintenance.
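
A highly simplified version of the model-free approach described (train a neural network to predict accelerations from their recent past, then inspect the prediction errors statistically) is sketched below with scikit-learn and SciPy; the signals are synthetic and the window length, network size and test are assumptions for illustration only.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)

    def make_signal(n, freq, fs=100.0):
        # Synthetic acceleration: a sinusoid plus noise.
        t = np.arange(n) / fs
        return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(n)

    def windows(signal, width=20):
        # Sliding windows as inputs, the next sample as the target.
        X = np.array([signal[i:i + width] for i in range(len(signal) - width)])
        return X, signal[width:]

    healthy = make_signal(3000, freq=2.0)          # baseline (healthy) response
    changed = make_signal(1000, freq=2.3)          # shifted response (anomaly)

    X_train, y_train = windows(healthy[:2000])
    X_ref, y_ref = windows(healthy[2000:])         # unseen healthy data
    X_new, y_new = windows(changed)

    # Train the ANN on healthy behaviour only.
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    net.fit(X_train, y_train)

    err_ref = np.abs(net.predict(X_ref) - y_ref)   # reference prediction errors
    err_new = np.abs(net.predict(X_new) - y_new)   # errors on new measurements

    # Hypothesis test: do the new errors come from the same distribution?
    stat, p_value = ks_2samp(err_ref, err_new)
    print(f"mean ref error {err_ref.mean():.3f}, new error {err_new.mean():.3f}, "
          f"KS p-value {p_value:.3g}")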

Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring

Procedia PDF Downloads 191
1516 Personality-Focused Intervention for Adolescents: Impact on Bullying and Distress

Authors: Erin V. Kelly, Nicola C. Newton, Lexine A. Stapinski, Maree Teesson

Abstract:

Introduction: There is a lack of targeted prevention programs for reducing bullying and distress among adolescents involved in bullying. The current study aimed to examine the impact of a personality-targeted intervention (Preventure) on bullying (victimization and perpetration) and distress among adolescent victims/bullies with high-risk personality types. Method: A cluster randomized controlled trial was conducted in 26 secondary schools (2,190 students) in NSW and Victoria, Australia, as part of the Climate Schools and Preventure trial. The schools were randomly allocated to condition (13 schools received Preventure, 13 did not). Students were followed up at four time points (6, 12, 24 and 36 months post-baseline). Preventure involves two group sessions based on cognitive behavioural therapy and tailored to four personality types shown to increase the risk of substance misuse and other emotional and behavioural problems: impulsivity, sensation seeking, anxiety sensitivity and hopelessness. Students were allocated to the personality-targeted groups based on their scores on the Substance Use Risk Profile Scale. Bullying was measured using an amended version of the Revised Olweus Bully/Victim Scale. Psychological distress was measured using the Kessler Psychological Distress Scale. Results: Among high-risk students classified as victims at baseline, those in Preventure schools reported significantly less victimization and distress over time than those in control schools. Among high-risk students classified as bullies at baseline, those in Preventure schools reported significantly less distress over time than those in control schools (no difference for perpetration). Conclusion: Preventure is a promising intervention for reducing bullying victimization and psychological distress among adolescents involved in bullying.

Keywords: adolescents, bullying, personality, prevention

Procedia PDF Downloads 216
1515 Pterygium Recurrence Rate and Influencing Factors for Recurrence of Pterygium after Pterygium Surgery at an Eastern Thai University Hospital

Authors: Luksanaporn Krungkraipetch

Abstract:

Pterygium is a frequent ocular surface lesion that begins in the limbal conjunctiva within the palpebral fissure and spreads to the cornea. The lesion is more common at the nasal limbus than at the temporal, and it has a wing-like appearance. Indications for surgery, in decreasing order of significance, are growth over the corneal center, decreased vision due to corneal deformation, documented growth, sensations of discomfort, and esthetic concerns. The aim of this study is twofold: first, to determine the frequency of pterygium recurrence after surgery at this hospital, and second, to identify the factors that influence the recurrence of pterygium. The research design is a retrospective review of 164 patient records from an eastern Thai university hospital (Code 13766). Descriptive statistics were used to summarize the pterygium surgeries and the risk of recurrence, and the chi-square test and ANOVA were used for the factor analysis. Twenty-four of the 164 patients who underwent surgery exhibited recurrent pterygium, so the incidence of recurrent pterygium after surgery was 14.6%. In this group, men and women were equally represented, and ages of 41 to 60 years accounted for 62.8 percent. Across the whole sample, the majority of patients were female (60.4%), over the age of 60 (51.2%), did not live near the beach (83.5%), did not have an underlying disease (92.1%), and 95.7% did not have any other eye problems. Gender (X² = 1.26, p = .289), age (X² = 5.86, p = .119), living near the sea (X² = 3.30, p = .081), underlying disease (X² = 0.54, p = .694), and eye disease (X² = 0.00, p = 1.00) had no significant effect on pterygium recurrence. The bare sclera technique accounted for 79.1% of the recurrences, corresponding to 11.6% of all patients; conjunctival autografting accounted for 20.9% of the recurrences, or 3.0% of all patients. The mitomycin-C and amniotic membrane transplant techniques had no recurrence following surgery. A comparison of the surgical techniques among patients with recurrent pterygium showed no significant difference (F = 1.13, p = 0.339). In conclusion, the rate of recurrence after pterygium surgery, 14.6%, does not differ from that reported in earlier research, and recurrence was not significantly affected by underlying disease, other eye conditions, or the surgical technique used.
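
Association tests of the kind reported above can be reproduced with standard statistical tooling. The sketch below runs a chi-square test of independence on a 2x2 table of sex versus recurrence; the counts are invented for illustration and are not the study data.

# Illustrative chi-square test of independence, as used in the abstract for factors
# such as sex, age group and residence. The counts below are invented for the example.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: female, male; columns: recurrence, no recurrence (hypothetical counts).
table = np.array([[12, 87],
                  [12, 53]])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
# A p-value above 0.05 indicates no significant association between the factor and
# pterygium recurrence, matching the pattern reported in the abstract.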

Keywords: pterygium, recurrence pterygium, pterygium surgery, excision pterygium

Procedia PDF Downloads 54
1514 International and Intercultural Communication Design: Case Study of Manipulative Advertising

Authors: Faiqa Jalal

Abstract:

The purpose of this research paper is to discuss the differing meanings of culture and how popular culture maintains a great impact on intercultural and international behavior. The discussion leads to the notion of communicating culture's impact on behavior through advertising, and to sub-cultural theory in advertising. The complexities that develop through this discussion lead, towards the end of the research, to the conclusion that advertising gives meaning to otherwise meaningless and identical objects by linking them to our basic needs. In today's fast-paced digital world, it is difficult to define culture literally, since its meaning tends to shift through a series of different perceptions of 'how' and 'why' it should be used. This leads to the related notion of popular culture, which depends on 'attitudes, ideas, images, perspectives and other phenomena within the mainstream of a given culture'. Since popular culture is influenced by mass media, it has a way of shaping an individual's attitude towards certain topics. For example, tattoos are a form of human decoration with historic significance and a huge spectrum of meanings. Advertising is one aspect of marketing that has evolved from the time when it was 'production oriented' to the point where it uses different media to make its impact more effective. However, this impact has blurred the line between our needs and our desires. The focus of this paper is that 'we consume to acquire a sense of social identity and status, not just for the sake of consumption'. Every culture has its own expressions, which advertisers use to shape the behavior of people sub-culturally and globally, as culture grows through social interaction. Advertisers furthermore play a clever role in highlighting quality of life, ranging from 'survival to well-being'. Hence, this research paper concludes by highlighting that culture is considered a 'basic root' of any community that also provides solutions to certain problems; however, advertisers manipulate society's literacy and beliefs by rationalizing how relevant certain products/brands are to them.

Keywords: mass media, popular culture, production oriented, sub-culture

Procedia PDF Downloads 204
1513 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research

Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez

Abstract:

Action research is a qualitative research methodology that leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization, applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. The design consists of seven fundamental stages, distributed across the three phases described in the action research methodology: 1) observe, 2) analyze and 3) take action. Finally, this paper aims to offer an alternative to traditional information security management methodologies, intended to be applied specifically in the planning stage of IT-based process virtualization, in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.
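
As a minimal sketch of what the planning-stage output of such a design might look like, the following records and prioritizes risks identified before a process is virtualized. The field names and the likelihood-times-impact scoring are generic assumptions for illustration, not the seven stages defined by the authors' methodological design.

# Minimal sketch of a risk register kept while planning an IT-based process virtualization.
from dataclasses import dataclass, field

@dataclass
class Risk:
    asset: str          # information asset exposed by the virtualized process
    threat: str         # what could go wrong when the process moves to the digital environment
    likelihood: int     # 1 (rare) .. 5 (almost certain)
    impact: int         # 1 (negligible) .. 5 (severe)
    controls: list = field(default_factory=list)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("customer records", "unauthorized access to the new web front end", 3, 5,
         ["access control policy", "encryption at rest"]),
    Risk("payment workflow", "loss of integrity during data migration", 2, 4,
         ["migration checksums"]),
]

# Prioritize risks and their controls before formulating the IT solution.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.asset}: score {risk.score} -> controls: {', '.join(risk.controls)}")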

Keywords: action research, information security, information technology, methodological design, process virtualization, risk management

Procedia PDF Downloads 152