Search results for: multiple sills emplacement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4752

4242 Analysis of Atomic Models in High School Physics Textbooks

Authors: Meng-Fei Cheng, Wei Feng

Abstract:

New Taiwan high school standards emphasize employing scientific models and modeling practices in physics learning. However, to our knowledge, few studies address how scientific models and modeling are approached in current science teaching, and none examine the views of scientific models portrayed in the textbooks. To explore the views of scientific models and modeling in textbooks, this study investigated the atomic unit in different textbook versions as an example and provided suggestions for modeling curricula. This study adopted a quantitative analysis of qualitative data in the atomic units of four mainstream versions of Taiwan high school physics textbooks. The models were further analyzed using five dimensions of the views of scientific models (nature of models, multiple models, purpose of the models, testing models, and changing models); each dimension had three levels (low, medium, high). Descriptive statistics were employed to compare the frequency with which the five dimensions of the views of scientific models were described in the atomic unit, to understand which views were emphasized, and to compare the frequency of use of the eight scientific models, to identify the atomic model used most often in the textbooks. Descriptive statistics were further utilized to investigate the average levels of the five dimensions of the views of scientific models, to examine whether the textbooks' views were close to the scientific view. The average levels of the five dimensions of the eight atomic models were also compared to examine whether the views of the eight atomic models were close to the scientific views. The results revealed the following three major findings from the atomic unit. (1) Among the five dimensions of the views of scientific models, the most portrayed dimension was the 'purpose of models,' and the least portrayed dimension was 'multiple models.' 
The most diverse view was the 'purpose of models,' and the most sophisticated scientific view was the 'nature of models.' The least sophisticated scientific view was 'multiple models.' (2) Among the eight atomic models, the most mentioned model was the atomic nucleus model, and the least mentioned model was the three states of matter. (3) Among the correlations between the five dimensions, the dimension of 'testing models' was highly related to the dimension of 'changing models.' In short, this study examined the views of scientific models based on the atomic units of physics textbooks to identify the emphasized and disregarded views in the textbooks. The findings suggest how future textbooks and curriculum can provide a thorough view of scientific models to enhance students' model-based learning.

Keywords: atomic models, textbooks, science education, scientific model

Procedia PDF Downloads 158
4241 Probing Language Models for Multiple Linguistic Information

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, large-scale pre-trained language models have achieved state-of-the-art performance on a variety of natural language processing tasks. The word vectors produced by these language models can be viewed as dense encoded representations of natural language in text form. However, it is unknown how much linguistic information is encoded, and how. In this paper, we construct corresponding probing tasks for multiple kinds of linguistic information to clarify the encoding capabilities of different language models, and present the results visually. We first obtain word representations in vector form from different language models, including BERT, ELMo, RoBERTa and GPT. Classifiers with a small number of parameters, as well as unsupervised tasks, are then applied to these word vectors to assess their capability to encode the corresponding linguistic information. The constructed probing tasks cover both semantic and syntactic aspects. The semantic aspect includes the ability of the model to understand semantic entities such as numbers, time, and characters, and the syntactic aspect includes the ability of the language model to understand grammatical structures such as dependency relationships and reference relationships. We also compare the encoding capabilities of different layers in the same language model to infer how linguistic information is encoded in the model.
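The probing setup described above can be sketched in a few lines. The data below are synthetic stand-ins (the dimensions, the encoded property, and the probe are invented for illustration, not taken from the study): a linear probe with a small number of parameters is fit on "embeddings" in which one dimension weakly encodes a binary linguistic property, and its accuracy indicates how recoverable that property is.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical "contextual embeddings": 64-dim vectors where dimension 3
# (noisily) encodes a binary linguistic property, e.g. singular vs. plural.
n, d = 400, 64
X = rng.normal(size=(n, d))
y = (X[:, 3] + 0.1 * rng.normal(size=n) > 0).astype(float)

# Linear probe (ridge-regularised least squares used as a classifier):
# if a low-capacity model can predict the property, the representation
# encodes it more or less linearly.
Xb = np.hstack([X, np.ones((n, 1))])  # add a bias column
w = np.linalg.solve(Xb.T @ Xb + 1e-2 * np.eye(d + 1), Xb.T @ (2 * y - 1))
acc = float(np.mean(((Xb @ w) > 0) == (y > 0.5)))
print(round(acc, 2))
```

In the paper's actual setting the probe would be trained per layer and per model (BERT, ELMo, RoBERTa, GPT), and the accuracies compared across layers.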

Keywords: language models, probing task, text presentation, linguistic information

Procedia PDF Downloads 110
4240 Using IoT on Single Input Multiple Outputs (SIMO) DC–DC Converter to Control Smart-home

Authors: Auwal Mustapha Imam

Abstract:

The aim of an energy management system is to monitor and control energy utilization and access, and to optimize and manage energy availability. This can be realized through real-time analysis and predictive control of data from energy sources and loads. Smart-home monitoring and control provide convenience and cost savings by controlling appliances, lights, thermostats and other loads. There may be different categories of loads in different homes, and the homeowner may wish to control access to solar-generated energy to protect the storage from draining completely. Controlling the power system operation by managing the converter output power and controlling how it feeds the appliances will satisfy the residential load demand. The Internet of Things (IoT) provides an attractive technological platform to connect the two and make home automation and domestic energy utilization easier and more attractive. This paper presents the use of an IoT-based control topology to monitor and control power distribution and consumption by DC loads connected to a single-input multiple-output (SIMO) DC-DC converter, thereby reducing leakages, enhancing performance and reducing human effort. A SIMO converter was first developed and then integrated with the IoT/Raspberry Pi control topology, which enables the user to monitor and control power scheduling and load forecasting via an Android app.
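The load-side control idea can be illustrated with a minimal sketch. Everything here is hypothetical (load names, powers, and the priority scheme are invented; the actual system runs on a Raspberry Pi with an Android app): given the converter's available output power, loads are enabled in priority order and the rest are shed to protect the storage.

```python
# Toy priority-based load scheduler for a power-limited SIMO output.
def schedule_loads(available_w, loads):
    """loads: list of (name, power_w, priority), 1 = most critical."""
    enabled, budget = [], available_w
    for name, power, _prio in sorted(loads, key=lambda l: l[2]):
        if power <= budget:          # enable only if the budget allows
            enabled.append(name)
            budget -= power
    return enabled                   # remaining loads are shed

on = schedule_loads(150, [("fridge", 100, 1),
                          ("tv", 80, 3),
                          ("lights", 40, 2)])
print(on)  # fridge and lights fit in 150 W; the TV is shed
```

A real controller would re-run this whenever the measured converter output or the load set changes.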

Keywords: flyback, converter, DC-DC, photovoltaic, SIMO

Procedia PDF Downloads 47
4239 Design and Synthesis of Two Tunable Bandpass Filters Based on Varactors and Defected Ground Structure

Authors: M'Hamed Boulakroune, Mouloud Challal, Hassiba Louazene, Saida Fentiz

Abstract:

This paper presents two microstrip bandpass filters (BPFs) at microwave frequencies. The first is an ultra-wideband (UWB) filter based on a multiple-mode resonator (MMR) and a rectangular-shaped defected ground structure (DGS). This filter, with a compact size of 25.2 x 3.8 mm2, provides an in-band insertion loss of 0.57 dB and a return loss greater than 12 dB. The second structure is a tunable bandpass filter using planar patch resonators loaded with varactor diodes. This filter is formed by a triple-mode circular patch resonator with two pairs of slots, into which the varactors are connected. Initially centered at 2.4 GHz, the center frequency of the tunable patch filter can be tuned down to 1.8 GHz, simultaneously with the bandwidth, achieving a high tuning range. Lossless simulations were compared with simulations accounting for the substrate dielectric loss, conductor losses, and the equivalent electrical circuit model of the tuning element in order to assess their effects. Within these variations, simulation results showed an insertion loss better than 2 dB and a return loss better than 10 dB over the passband. The proposed filters present good performance, and the simulation results are in satisfactory agreement with experimental results reported elsewhere.

Keywords: defected ground structure, diode varactor, microstrip bandpass filter, multiple-mode resonator

Procedia PDF Downloads 311
4238 Compression Index Estimation by Water Content and Liquid Limit and Void Ratio Using Statistics Method

Authors: Lizhou Chen, Abdelhamid Belgaid, Assem Elsayed, Xiaoming Yang

Abstract:

The compression index is essential in foundation settlement calculations. The traditional method for determining the compression index is the consolidation test, which is expensive and time-consuming. Many researchers have used regression methods to develop empirical equations for predicting the compression index from soil properties. Based on a large number of compression index data collected from consolidation tests, the accuracy of several popular empirical equations was assessed. It was found that the primary compression index is significantly overestimated by some equations and underestimated by others. Sensitivity analyses of soil parameters including water content, liquid limit and void ratio were performed. The results indicate that the compression index obtained from the void ratio is the most accurate. An ANOVA (analysis of variance) demonstrates that equations with multiple soil parameters do not provide better predictions than equations with a single soil parameter; in other words, it is not necessary to develop relationships between the compression index and multiple soil parameters. Meanwhile, it was noted that the secondary compression index is approximately 0.7-5.0% of the primary compression index, with an average of 2.0%. Finally, prediction equations developed using a power regression technique are proposed, which provide more accurate predictions than existing equations.
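The power regression step can be sketched as follows. The data and coefficients below are invented for illustration (the study's fitted coefficients are in the paper and are not reproduced here): a power law Cc = a * e0^b is linearized by taking logarithms and fit by ordinary least squares.

```python
import numpy as np

# Hypothetical consolidation-test data: compression index Cc versus
# initial void ratio e0, generated from an assumed law Cc = 0.30 * e0**1.2.
e0 = np.array([0.6, 0.8, 1.0, 1.2, 1.5, 1.8, 2.2])
cc = 0.30 * e0 ** 1.2

# Power regression: fit log(Cc) = log(a) + b*log(e0) by least squares.
b, log_a = np.polyfit(np.log(e0), np.log(cc), 1)
a = np.exp(log_a)
print(round(a, 3), round(b, 3))  # recovers a ≈ 0.30, b ≈ 1.20
```

With real laboratory data the fit would not be exact, and the residuals would indicate how well a single-parameter power law performs against multi-parameter alternatives.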

Keywords: compression index, clay, settlement, consolidation, secondary compression index, soil parameter

Procedia PDF Downloads 162
4237 Pharmacodynamic Enhancement of Repetitive TMS Treatment Outcomes for Major Depressive Disorder

Authors: A. Mech

Abstract:

Repetitive transcranial magnetic stimulation (rTMS) has proven to be a valuable treatment option for patients who have failed to respond to multiple courses of antidepressant medication. In fact, the American Psychiatric Association recommends TMS after one failed course of antidepressant medication. Genetic testing has proven valuable for pharmacokinetic variables, which, if understood, could lead to more efficient dosing of psychotropic medications and improved outcomes. Pharmacodynamic testing can identify biomarkers which, if addressed, can improve patients' outcomes in antidepressant therapy. Monotherapy of major depressive disorder with methylated B vitamins has been shown to be safe and effective in patients with MTHFR polymorphisms, without waiting for multiple failed medication trials for depression. Such treatment has demonstrated remission rates similar to those of antidepressant clinical trials. Combining pharmacodynamic testing with repetitive TMS treatment with NeuroStar has shown promising potential for enhancing remission rates and the durability of treatment. In this study, a retrospective chart review (ongoing) of patients who received repetitive TMS treatment enhanced by dietary supplementation guided by pharmacodynamic testing displayed a greater remission rate (90%) than patients treated with NeuroStar TMS alone (62%).

Keywords: improved remission rate, major depressive disorder, pharmacodynamic testing, rTMS outcomes

Procedia PDF Downloads 57
4236 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation

Authors: Rashmi Malik, Videep Mishra

Abstract:

Generative design has become a transformative approach for efficiently producing multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool such as 3ds Max or Blender is traditionally used to create the game library, which takes considerable time to model. This study focuses on using generative design tools to increase efficiency in game development at the stage of prop and environment generation. This involves procedural level generation and customized, regulated or randomized asset generation. The paper presents a system design approach using generative tools like Grasshopper (visual scripting) and other scripting tools to automate the process of game library modeling. A single script enables the generation of multiple products, thus creating a system that lets designers and artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, further reducing the need for manual content creation, and to integrate it into the workflow of AAA and indie games.
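The "many variants from one script" idea can be sketched in a few lines. The asset type, parameter ranges, and attribute names below are invented for illustration (the study uses Grasshopper-style visual scripting rather than this toy code): one seeded parametric template yields an arbitrarily large, reproducible library of prop variants.

```python
import random

# Toy parametric prop template: each seed deterministically produces one
# crate variant, so a whole library comes from a single script.
def generate_crate(seed):
    rng = random.Random(seed)  # seeded -> reproducible "regulated" output
    return {
        "width": round(rng.uniform(0.8, 1.6), 2),
        "height": round(rng.uniform(0.8, 1.6), 2),
        "wear": rng.choice(["new", "scuffed", "broken"]),
    }

library = [generate_crate(s) for s in range(100)]
print(len(library), library[0])
```

In a real pipeline each generated parameter set would drive the modeling tool (e.g. a Blender or Grasshopper script) to emit the actual 3D geometry.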

Keywords: iterative game design, generative design, gaming asset automation, generative game design

Procedia PDF Downloads 70
4235 Ectopic Mediastinal Parathyroid Adenoma: A Case Report with Diagnostic and Management Challenges

Authors: Augustina Konadu Larbi-Ampofo, Ekemini Umoinwek

Abstract:

Background: Hypercalcaemia is a common electrolyte imbalance that increases mortality if poorly controlled. Primary hyperparathyroidism, which has a prevalence of 0.1-0.3%, often presents in this way. Management of hypercalcaemia due to an ectopic parathyroid adenoma in the mediastinum is challenging, especially in a patient with a pacemaker. Case Presentation: A 79-year-old woman with a history of a previous cardiac arrest, permanent pacemaker, ischaemic heart disease, bilateral renal calculi, rectal polyps, liver cirrhosis, and a family history of hyperthyroidism presented to the emergency department with acute back pain. Management and Outcome: The patient was diagnosed with primary hyperparathyroidism on the basis of her elevated corrected calcium and parathyroid hormone levels. Parathyroid investigations, consisting of an NM MIBI scan, SPECT-CT, a 4D parathyroid scan, and an ultrasound scan of the neck and thorax, confirmed an ectopic parathyroid adenoma in the mediastinum at the level of the aortic arch, along with benign thyroid nodules. The location of the adenoma warranted a thoracoscopic surgical approach; however, the presence of her pacemaker and other cardiovascular conditions predisposed her to a potentially poorer post-operative outcome. Discussion: Mediastinal ectopic parathyroid adenomas are rare and difficult to diagnose and treat, often needing a multimodal imaging approach for accurate localisation. Surgery is the definitive treatment; however, in this patient, long-term medical treatment with cinacalcet was the only suitable next option. The difficulty is that cinacalcet tackles the biochemical markers of the disease and not the disease itself, leaving open the question of what to do if refractory or uncontrolled hypercalcaemia develops in this patient with a pacemaker. 
Moreover, the coexistence of her multiple conditions raises the suspicion of an underlying multisystemic or multiple endocrine disorder, with multiple endocrine neoplasia coming to mind, necessitating further genetic or autoimmune investigations. Conclusion: Mediastinal ectopic parathyroid adenomas are rare, with diagnostic and management challenges.

Keywords: mediastinal ectopic parathyroid adenoma, hyperparathyroidism, SPECT/CT, nuclear medicine, multimodal imaging

Procedia PDF Downloads 16
4234 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In a smart environment where multiple users share the living space, the context-aware system faces conflicting situations when different users raise different service requirements. Therefore, the purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is to resolve these service conflicts among users. This study proposes a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user’s current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when events are not single but overlap, conflicts arise. This study adopts a “Multiple Events Correlation Matrix” to calculate degree values of incidents and support values for each object. The matrix uses these values as the basis for making inferences about system services, and for determining appropriate services when there is a conflict.
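The correlation-matrix idea can be sketched numerically. All values below are hypothetical (the study does not publish this matrix; the events, objects, and weights are invented for illustration): rows are concurrent user events, columns are objects, entries are degree values, and a per-event support value weights each row. The object with the highest weighted sum wins a service conflict.

```python
import numpy as np

# Hypothetical "Multiple Events Correlation Matrix": degree values of how
# strongly each overlapping event involves each object.
events = ["watch_tv", "cook_dinner"]
objects = ["tv", "lights", "oven"]
degree = np.array([[0.9, 0.4, 0.0],    # watch_tv
                   [0.0, 0.6, 0.9]])   # cook_dinner
support = np.array([1.0, 0.8])         # support value per event

# Object priority = support-weighted sum of degree values; the highest
# priority object is served first when requests conflict.
priority = support @ degree
winner = objects[int(np.argmax(priority))]
print(dict(zip(objects, priority.round(2))), winner)
```

Here "tv" (0.90) narrowly outranks "lights" (0.88) and "oven" (0.72), so the television request would be served first.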

Keywords: internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity

Procedia PDF Downloads 290
4233 Triassic and Liassic Paleoenvironments during the Central Atlantic Magmatic Province (CAMP) Effusion in the Moroccan Coastal Meseta: The Mohammedia-Benslimane-El Gara-Berrechid Basin

Authors: Rachid Essamoud, Abdelkrim Afenzar, Ahmed Belqadi

Abstract:

During the Early Mesozoic, the northwestern part of the African continent was affected by initial fracturing associated with the early stages of the opening of the Central Atlantic (Atlantic rift). During this rifting phase, the Moroccan Meseta experienced an extensional tectonic regime. This extension favored the formation of a set of rift-type basins, including the Mohammedia-Benslimane-El Gara-Berrechid basin. It is therefore essential to know the nature of the deposits in this basin, their evolution over time, and their relationship with the basaltic effusion of the Central Atlantic Magmatic Province (CAMP). These deposits are subdivided into two large series: the lower clay-salt series, attributed to the Triassic, and the upper clay-salt series, attributed to the Liassic. The two series are separated by the Upper Triassic-Lower Liassic basaltic complex. Detailed sedimentological analysis made it possible to characterize four mega-sequences, fifteen facies types, and eight architectural elements and facies associations in the Triassic series. A progressive decrease in paleo-slope over time led the paleoenvironment to evolve from a proximal alluvial-fan system to a braided fluvial style, then to an anastomosed system. These environments eventually evolved into an alluvial plain associated with a coastal plain where playa lakes, mudflats and lagoons developed. The pure and massive halitic facies at the top of the series probably indicate an evolution of the depositional environment towards a shallow subtidal environment. The presence of these evaporites indicates a climate that favored their precipitation, in this case a fairly hot and arid climate. Sedimentological analysis of the supra-basaltic part shows that during the Lower Liassic, the paleo-slope after the basaltic effusion remained weak, with distal environments. 
The facies analysis revealed four major lithofacies (sandstone, silt, clay and evaporite) organized in two mega-sequences: the first, rock-salt mega-sequence was deposited in a free brine depression system, followed by saline mudflats under continental influence. The upper clay mega-sequence displays facies documenting sea-level fluctuations from the final transgression of the Tethys or the opening Atlantic. Saliferous sedimentation was therefore favored from the Upper Triassic onward, but experienced a sudden rupture with the emission of basaltic flows, which are interstratified in the azoic salt clays of very shallow seas. This basaltic emission, which belongs to the CAMP, would have come from fissural volcanism, probably fed through transfer faults located in the NW and SE of the basin. Its emplacement was probably subaquatic to subaerial. From a chronological and paleogeographic point of view, this main volcanism, dated between the Upper Triassic and the Lower Liassic (180-200 Ma), is linked to the fragmentation of Pangea, controlled by progressive extension initiated in the west in close relation with the initial phases of Central Atlantic rifting, and seems to coincide with the major mass extinction at the Triassic-Jurassic boundary.

Keywords: basalt, CAMP, Liassic, sedimentology, Triassic, Morocco

Procedia PDF Downloads 75
4232 Poly(Trimethylene Carbonate)/Poly(ε-Caprolactone) Phase-Separated Triblock Copolymers with Advanced Properties

Authors: Nikola Toshikj, Michel Ramonda, Sylvain Catrouillet, Jean-Jacques Robin, Sebastien Blanquer

Abstract:

Biodegradable and biocompatible block copolymers have risen to prominence in both medical and environmental applications. Moreover, if their architecture is controlled, more advanced applications can be foreseen. Meanwhile, organocatalytic ROP has been promoted as a more rapid and cleaner route than traditional organometallic catalysis towards the efficient synthesis of block copolymer architectures. Therefore, herein we report a novel organocatalytic pathway using a guanidine catalyst (TBD) for the synthesis of poly(trimethylene carbonate) blocks initiated by poly(caprolactone) as a pre-polymer. A pristine PTMC-b-PCL-b-PTMC block copolymer structure, without any residual products and with the desired block proportions, was achieved within 1.5 hours at room temperature and verified by NMR spectroscopy and size-exclusion chromatography. Besides, when elaborating block copolymer films, further stability and improved mechanical properties can be achieved via an additional crosslinking step of previously methacrylated block copolymers. Subsequently, motivated by the scarcity of studies on the phase-separation/crystallinity relationship in these semi-crystalline block copolymer systems, their intrinsic thermal and morphological properties were investigated by differential scanning calorimetry (DSC) and atomic force microscopy (AFM). First, by DSC measurements, the block copolymers with χABN values above 20 presented two distinct glass transition temperatures, close to those of the respective homopolymers, an initial indication of a phase-separated system. Meanwhile, the existence of a crystalline phase was confirmed by the presence of a melting temperature. As expected, crystallinity-driven phase-separated morphology predominated in the AFM analysis of the block copolymers. Not even crosslinking in the melt, and hence the creation of a dense polymer network, disturbed the crystallization. 
However, the latter proved sensitive to rapid liquid-nitrogen quenching directly from the melt. AFM analysis of liquid-nitrogen-quenched, crosslinked block copolymer films demonstrated a thermodynamically driven phase separation clearly predominating over the originally crystalline one. These films remained stable, with their morphology unchanged, even after four months at room temperature. However, as demonstrated by DSC analysis, once the temperature rises above the melting temperature of the PCL block, neither the crosslinking nor the liquid-nitrogen quenching shatters the semi-crystalline network, while access to thermodynamically phase-separated structures is possible at temperatures below the poly(caprolactone) melting point. Precisely this coexistence of dual crosslinked/crystalline networks in the same copolymer structure allowed us to establish, for the first time, shape-memory properties in such materials, as verified by thermomechanical analysis. Moreover, the temperature at which the material returns to its original shape depended on the block arrangement, i.e., whether PTMC or PCL forms the end-blocks. It has therefore been possible to obtain a block copolymer with a transition temperature around 40°C, opening potential real-life medical applications. In conclusion, the initial study of the phase-separation/crystallinity relationship in PTMC-b-PCL-b-PTMC block copolymers led to the discovery of novel shape-memory materials with superior properties, widely demanded in modern applications.

Keywords: biodegradable block copolymers, organocatalytic ROP, self-assembly, shape-memory

Procedia PDF Downloads 128
4231 Dietary Pattern and Risk of Breast Cancer Among Women: A Case-Control Study

Authors: Huma Naqeeb

Abstract:

Epidemiological studies have shown a robust link between breast cancer and dietary patterns. No previous study conducted in Pakistan has specifically focused on dietary patterns among women with breast cancer. This study aims to examine the association between breast cancer and dietary patterns among Pakistani women. This case-control research was carried out in multiple tertiary care facilities. Newly diagnosed primary breast cancer patients were recruited as cases (n = 408); age-matched controls (n = 408) were randomly selected from the general population. Data on the required parameters were systematically collected using subjective and objective tools. Factor analysis and Principal Component Analysis (PCA) were used to extract the women’s dietary patterns. Four dietary patterns were identified based on eigenvalues >1: (i) veg-ovo-fish, (ii) meat-fat-sweet, (iii) mixed (milk and its products, and gourd vegetables) and (iv) lentils-spices. Results of the multiple regressions are displayed as adjusted odds ratios (Adj. OR) with their respective 95% confidence intervals (95% CI). After adjusting for potential confounders, the veg-ovo-fish dietary pattern was found to be robustly associated with a lower risk of breast cancer among women (Adj. OR: 0.68, 95% CI: 0.46-0.99, p < 0.01). The study findings conclude that adherence to diets composed mainly of fresh vegetables and high-quality protein sources may contribute to lowering the risk of breast cancer among women.
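The pattern-extraction step can be sketched as follows. The data are simulated stand-ins (rows are women, columns are food-frequency items; the item structure is invented, not the study's): items 0-1 and 2-3 are made perfectly correlated, so the eigenvalue > 1 criterion used in the study should retain exactly two components, i.e. two "dietary patterns".

```python
import numpy as np

# Simulated food-frequency matrix: two latent factors ("veg", "meat"),
# each loading on two items, so PCA should recover two patterns.
rng = np.random.default_rng(1)
veg, meat = rng.normal(size=(2, 200))
X = np.column_stack([veg, veg, meat, meat])

# Standardize, eigendecompose the correlation matrix, and retain
# components with eigenvalue > 1 (the criterion stated in the abstract).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = (Z.T @ Z) / len(Z)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_patterns = int(np.sum(eigvals > 1))
print(n_patterns)
```

In the study, each retained component's factor scores would then enter a logistic regression as the exposure, yielding the adjusted odds ratios reported above.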

Keywords: breast cancer, dietary pattern, women, principal component analysis

Procedia PDF Downloads 123
4230 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans

Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee

Abstract:

This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e., each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting an operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach, which selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state, so that the burden of developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm, and in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the approach suggested in this study, a case study was done on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are reported in comparison with individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
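The scheduling module's role can be sketched with a tiny hand-built tree. The state features, thresholds, and rule names below are invented for illustration (the paper's tree is learned by a four-stage algorithm, not written by hand): given the current shop state, the tree returns a priority rule combination, one rule for operation/machine selection and one for job sequencing.

```python
# Toy decision tree mapping shop state -> priority rule combination.
def select_rule_combination(state):
    """state: dict of shop-state features (hypothetical names)."""
    if state["avg_machine_utilisation"] > 0.8:
        # congested shop: spread load, sequence by shortest processing time
        return ("least_loaded_machine", "SPT")
    if state["tardy_job_ratio"] > 0.3:
        # many late jobs: sequence by earliest due date
        return ("fastest_machine", "EDD")
    return ("fastest_machine", "FIFO")

combo = select_rule_combination(
    {"avg_machine_utilisation": 0.9, "tardy_job_ratio": 0.1})
print(combo)
```

The payoff is that the lookup replaces a round of simulation runs: the tree, built once in the construction module, answers rule-selection queries instantly at scheduling time.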

Keywords: flexible job shop scheduling, decision tree, priority rules, case study

Procedia PDF Downloads 358
4229 Environmental Performance Improvement of Additive Manufacturing Processes with Part Quality Point of View

Authors: Mazyar Yosofi, Olivier Kerbrat, Pascal Mognol

Abstract:

Life cycle assessment of additive manufacturing processes has evolved significantly in recent years. Many existing studies focused mainly on energy consumption. Nowadays, new methodologies for life cycle inventory acquisition have appeared in the literature and help manufacturers take into account all input and output flows during the manufacturing step of a product's life cycle. Indeed, the environmental analysis of the phenomena that occur during the manufacturing step of additive manufacturing processes is becoming well understood, and it is now possible to count and measure accurately all inventory data during this step. Optimization of the environmental performance of processes can therefore be considered. Environmental performance can be improved by varying process parameters. However, many of these parameters (such as manufacturing speed, the power of the energy source, or the quantity of support material) directly affect the mechanical properties, surface finish and dimensional accuracy of a functional part. This study aims to improve the environmental performance of an additive manufacturing process without deteriorating part quality. For that purpose, the authors have developed a generic method that has been applied to multiple parts made by additive manufacturing processes. First, a complete analysis of the process parameters is made in order to identify which parameters affect only the environmental performance of the process. Then, multiple parts are manufactured while varying the identified parameters. The aim of this second step is to find the parameter values that significantly decrease the environmental impact of the process while keeping the part quality as desired. Finally, a comparison is made between parts made with the initial parameters and with the changed parameters. 
The major finding claimed by the authors is the reduction of the environmental impact of an additive manufacturing process while respecting three part-quality criteria: mechanical properties, dimensional accuracy and surface roughness. Now that additive manufacturing processes can be considered mature from a technical point of view, their environmental improvement can be pursued while respecting part properties. The first part of this study presents the methodology applied to multiple academic parts. Then, the validity of the methodology is demonstrated on functional parts.

Keywords: additive manufacturing, environmental impact, environmental improvement, mechanical properties

Procedia PDF Downloads 288
4228 The Mediating Role of Store Personality in the Relationship Between Self-Congruity and Manifestations of Loyalty

Authors: María de los Ángeles Crespo López, Carmen García García

Abstract:

The highly competitive nature of today's globalised marketplace requires brands and stores to develop effective commercial strategies to ensure their economic survival. Maintaining the loyalty of existing customers is one key strategy that yields strong results. Although the relationship between consumers' self-congruity and their manifestations of loyalty towards a store has been investigated, the role of store personality in this relationship remains unclear. In this study, multiple parallel mediation analysis was used to examine the effect of Store Personality on the relationship between consumers' Self-Congruity and their Manifestations of Loyalty. For this purpose, 457 Spanish consumers of the Fnac store completed three self-report questionnaires assessing Store Personality, Self-Congruity, and Store Loyalty. The data were analyzed using the SPSS macro PROCESS. The results revealed that three dimensions of Store Personality, namely Exciting, Close and Competent Store, positively and significantly mediated the relationship between Self-Congruity and Manifestations of Loyalty, with the indirect effect of Competent Store being the greatest. This means that a consumer with higher Self-Congruity with the store will exhibit more Manifestations of Loyalty when the store is perceived as exciting, close or competent. These findings suggest that more attention should be paid to the perceived personality of stores when developing marketing strategies to maintain or increase consumers' loyalty towards stores.
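A single mediation path of the kind the study estimates can be sketched numerically. The data are simulated, not the study's (the path coefficients 0.5 and 0.4 are invented): in a parallel mediation model like PROCESS model 4, each indirect effect is the product of the a-path (predictor to mediator) and the b-path (mediator to outcome, controlling for the predictor).

```python
import numpy as np

# Simulated data: self-congruity X -> perceived store competence M -> loyalty Y.
rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(scale=0.5, size=n)            # a-path ~ 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(scale=0.5, size=n)  # b-path ~ 0.4

def slope(x, y):
    # simple OLS slope of y on x
    return float(np.cov(x, y)[0, 1] / np.var(x, ddof=1))

a = slope(X, M)
# b-path: slope of Y on M controlling for X (multiple regression)
Z = np.column_stack([M, X, np.ones(n)])
b = float(np.linalg.lstsq(Z, Y, rcond=None)[0][0])
indirect = a * b  # indirect (mediated) effect, expected near 0.5 * 0.4 = 0.2
print(round(indirect, 2))
```

PROCESS additionally bootstraps a confidence interval for each indirect effect and estimates several mediators in parallel, which is how the Competent dimension was identified as the strongest path.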

Keywords: multiple parallel mediation, PROCESS, self-congruence, store loyalty, store personality

Procedia PDF Downloads 158
4227 Investigating the Potential for Introduction of Warm Mix Asphalt in Kuwait Using the Volcanic Ash

Authors: H. Al-Baghli, F. Al-Asfour

Abstract:

The asphalt technology currently applied to Kuwait's road pavement infrastructure is hot mix asphalt (HMA), using both pen grade and polymer modified bitumens (PMBs), produced and compacted at high temperatures ranging from 150 to 180 °C. There are currently no specifications for warm or cold mix asphalts in the asphalt standards and specifications of Kuwait's Ministry of Public Works (MPW). Conventional HMA production is energy intensive and directly responsible for the emission of greenhouse gases and other environmental hazards into the atmosphere, leading to significant environmental impacts and raising health risks for labourers on site. Warm mix asphalt (WMA) technology, a sustainable alternative preferred in multiple countries, has many environmental advantages because it requires production temperatures 20 to 40 °C lower than HMA. The temperature reduction achieved by WMA originates from multiple technologies, including foaming and chemical or organic additives, that aim to reduce bitumen viscosity and improve mix workability. This paper presents a literature review of WMA technologies and techniques, followed by an experimental study comparing the volumetric properties of WMA samples produced with a water-containing additive (foaming process) at different compaction temperatures against an HMA control mix designed in accordance with the new MPW specifications and guidelines.

Keywords: warm-mix asphalt, water-bearing additives, foaming-based process, chemical additives, organic additives

Procedia PDF Downloads 124
4226 The Relationship between Corporate Governance and Intellectual Capital Disclosure: Malaysian Evidence

Authors: Rabiaal Adawiyah Shazali, Corina Joseph

Abstract:

The disclosure of Intellectual Capital (IC) information is becoming more vital in today's era of a knowledge-based economy. Companies are advised by accounting bodies to enhance IC disclosure, which complements conventional financial disclosures. There are no accounting standards for Intellectual Capital Disclosure (ICD); the disclosure is therefore entirely voluntary. Hence, this study aims to investigate the extent of ICD and to examine the relationship between corporate governance and ICD in Malaysia. This study employed content analysis of the 2012 annual reports of the top 100 public listed companies in Malaysia. The uniqueness of this study lies in its underpinning theory: it applies institutional isomorphism theory to explain the effect of corporate governance attributes on ICD. To achieve the stated objective, multiple regression analysis was employed. From the descriptive statistics, it was concluded that public listed companies in Malaysia have increased their awareness of the importance of ICD. Furthermore, results from the multiple regression analysis confirmed that corporate governance affects a company's ICD: the frequency of audit committee meetings and the board size have positively influenced the level of ICD in companies. Findings from this study would provide an incentive for companies in Malaysia to enhance the disclosure of IC. In addition, this study would assist Bursa Malaysia and other regulatory bodies in drawing up a proper guideline for the disclosure of IC.

Keywords: annual report, content analysis, corporate governance, intellectual capital disclosure

Procedia PDF Downloads 215
4225 Case of A Huge Retroperitoneal Abscess Spanning from the Diaphragm to the Pelvic Brim

Authors: Christopher Leung, Tony Kim, Rebecca Lendzion, Scott Mackenzie

Abstract:

Retroperitoneal abscesses are a rare but serious condition with often delayed diagnosis, non-specific symptoms, multiple causes and high morbidity and mortality. With the advent of more readily available cross-sectional imaging, retroperitoneal abscesses are treated earlier and better outcomes are achieved. Occasionally, a retroperitoneal abscess presents as a huge collection, as evident in this 53-year-old male. With a background of chronic renal disease and left partial nephrectomy, this gentleman presented with a one-month history of left flank pain without any other symptoms, including fevers or abdominal pain. CT of the abdomen and pelvis demonstrated a huge retroperitoneal abscess spanning from the diaphragm, abutting the spleen, down to the iliopsoas muscle and abutting the iliac vessels at the pelvic brim. This large retroperitoneal abscess required open drainage as well as drainage by interventional radiology. A long course of intravenous antibiotics and multiple drainages were required to drain the abscess. His blood culture and fluid culture grew Proteus species, suggesting a urinary source, likely from his non-functioning kidney, which had undergone partial nephrectomy. Such a huge retroperitoneal abscess has rarely been described in the literature. The learning point here is that the basic principle of source control and antibiotics is paramount in treating retroperitoneal abscesses regardless of the size of the abscess.

Keywords: retroperitoneal abscess, retroperitoneal mass, sepsis, genitourinary infection

Procedia PDF Downloads 221
4224 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships

Authors: Vijaya Dixit, Aasheesh Dixit

Abstract:

The shipbuilding industry operates in an Engineer Procure Construct (EPC) context. The product mix of a shipyard comprises various types of ships, like bulk carriers, tankers, barges, coast guard vessels, submarines, etc. Each order is unique based on the type of ship and customized requirements, which are engineered into the product right from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture both kinds of learning in a shipyard and incorporate the learning curve effect in project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in organizations. In shipbuilding, there are sequences of similar activities which are expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks which are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity, but also a decrease in the uncertainty of activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships. On the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each subsequent ship gets compressed.
Thus, the material requirement schedule of every next ship differs from that of its predecessor. As more ships get constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work aims at integrating materials management with project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressed material requirement schedules and addresses the above trade-off between transportation cost and inventory holding and shortage costs, while satisfying budget constraints at various stages of the project. The activity durations and lead times of items are not crisp and are available in the form of probabilistic distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates of items and the degree of order batching for all types of items. Sensitivity analysis determines the threshold number of sister ships required in a project to leverage the advantage of the learning curve effect in materials management decisions. This analysis will help materials managers gain insight into when and to what degree it is beneficial to treat a multiple-ship project as an integrated one by batching the order quantities, and when and to what degree to practice distinct procurement for individual ships.
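The activity-level learning described above is commonly modeled with Wright's learning curve, in which the time for the n-th repetition is T_n = T_1 · n^b with b = log2(learning rate). The sketch below uses an assumed 90% learning rate and illustrative durations, not values from the paper:

```python
import math

def activity_duration(t_first, unit, learning_rate=0.9):
    """Wright learning curve: duration of the n-th repetition of an
    activity, given the duration of the first repetition.
    learning_rate=0.9 means each doubling of output cuts time by 10%."""
    b = math.log(learning_rate, 2)  # negative exponent
    return t_first * unit ** b

# Hypothetical fabrication times (days) for four sister ships
t_first = 100.0
durations = [activity_duration(t_first, n) for n in range(1, 5)]
print([round(d, 1) for d in durations])  # schedule compresses ship by ship
```

Under this model the second ship's activity takes 90 days, the fourth 81 days; feeding such compressed schedules into the SMIP shifts each ship's material requirement dates earlier, which is what creates the batching opportunity.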

Keywords: learning curve, materials management, shipbuilding, sister ships

Procedia PDF Downloads 502
4223 An Overbooking Model for Car Rental Service with Different Types of Cars

Authors: Naragain Phumchusri, Kittitach Pongpairoj

Abstract:

Overbooking is a very useful revenue management technique that can help reduce costs caused by either undersales or oversales. In this paper, we propose an overbooking model for two types of cars that minimizes the total cost for a car rental service. With two types of cars, there is a possibility of upgrading from the lower type to the upper type, which makes the model more complex than the single-car-type scenario. We have found that convexity can be proved in this case. Sensitivity analysis is conducted to observe the effects of relevant parameters on the optimal solution. A model simplification is proposed using multiple linear regression analysis, which can help estimate the optimal overbooking level from appropriate independent variables. The results show that the overbooking level from the multiple linear regression model is relatively close to the optimal solution (with an adjusted R-squared value of at least 72.8%). To evaluate the performance of the proposed model, the total cost was compared with the case where the decision maker uses a naïve method to set the overbooking level. It was found that the total cost from the optimal solution is, on average, only 0.5 to 1 percent lower than the cost from the regression model, while it is approximately 67% lower than the cost obtained by the naïve method. This indicates that our proposed simplification using regression analysis performs effectively in estimating the overbooking level.
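The cost structure of a single-car-type overbooking problem can be sketched as an expectation over a binomial show-up distribution; all parameter values below are illustrative assumptions, not the paper's. Because the expected cost is convex in the booking limit, a simple scan finds the optimum:

```python
from math import comb

def expected_cost(booking_limit, capacity=20, show_prob=0.85,
                  cost_empty=40.0, cost_denied=120.0):
    """Expected cost of an overbooking level for a single car type
    (hypothetical parameters). Shows ~ Binomial(booking_limit, show_prob);
    empty cars cost cost_empty each, denied customers cost_denied each."""
    total = 0.0
    for shows in range(booking_limit + 1):
        p = (comb(booking_limit, shows) * show_prob**shows
             * (1 - show_prob)**(booking_limit - shows))
        total += p * (cost_empty * max(capacity - shows, 0)
                      + cost_denied * max(shows - capacity, 0))
    return total

# Scan booking limits at or above capacity; convexity makes the
# minimum well-defined
best = min(range(20, 31), key=expected_cost)
print(best, round(expected_cost(best), 2))
```

The paper's two-car-type model adds the upgrade path between types, coupling the two booking-limit decisions, which is what motivates the regression-based simplification.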

Keywords: overbooking, car rental industry, revenue management, stochastic model

Procedia PDF Downloads 172
4222 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and climate change monitoring. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing the data using standard techniques, feeding the processed data into the proposed algorithm, and analyzing the obtained result. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
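The core idea of atrous convolution, spacing the kernel taps `rate` samples apart to enlarge the receptive field without adding parameters, can be illustrated in one dimension (a didactic sketch, not DeepLabv3 itself):

```python
import numpy as np

def dilated_conv1d(x, kernel, rate):
    """1-D atrous (dilated) convolution with 'valid' padding: kernel taps
    are spaced `rate` samples apart, so a k-tap kernel covers
    (k - 1) * rate + 1 input samples."""
    k = len(kernel)
    span = (k - 1) * rate + 1          # effective receptive field
    out = np.empty(len(x) - span + 1)
    for i in range(len(out)):
        out[i] = sum(kernel[j] * x[i + j * rate] for j in range(k))
    return out

x = np.arange(10, dtype=float)
kernel = np.array([1.0, 0.0, -1.0])    # simple difference filter
print(dilated_conv1d(x, kernel, rate=1))  # receptive field of 3 samples
print(dilated_conv1d(x, kernel, rate=2))  # receptive field of 5 samples
```

DeepLabv3's ASPP module applies several such rates in parallel over 2-D feature maps, letting one layer respond to segments at multiple scales.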

Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50

Procedia PDF Downloads 139
4221 The Impact of Missense Mutation in Phosphatidylinositol Glycan Class A Associated to Paroxysmal Nocturnal Hemoglobinuria and Multiple Congenital Anomalies-Hypotonia-Seizures Syndrome 2: A Computational Study

Authors: Ashish Kumar Agrahari, Amit Kumar

Abstract:

Paroxysmal nocturnal hemoglobinuria (PNH) is an acquired clonal blood disorder that manifests with hemolytic anemia, thrombosis, and peripheral blood cytopenias. The disease is caused by the deficiency of two glycosylphosphatidylinositol (GPI)-anchored proteins (CD55 and CD59) in the hemopoietic stem cells. The deficiency of GPI-anchored proteins has been associated with somatic mutations in phosphatidylinositol glycan class A (PIGA). However, the mutations that do not cause PNH are associated with multiple congenital anomalies-hypotonia-seizures syndrome 2 (MCAHS2). To the best of our knowledge, no computational study has been performed to explore the atomistic-level impact of PIGA mutations on the structure and dynamics of the protein. In the current work, we are mainly interested in gaining insight into the molecular mechanism of PIGA mutations. In the initial step, we screened the most pathogenic mutations from the pool of publicly available mutations. Further, to get a better understanding, pathogenic mutations were mapped onto the modeled structure and subjected to 50 ns molecular dynamics simulation. Our computational study suggests that four mutations are highly likely to alter the structural conformation and stability of the PIGA protein, which illustrates their association with the PNH and MCAHS2 phenotypes.

Keywords: homology modeling, molecular dynamics simulation, missense mutations PNH, MCAHS2, PIGA

Procedia PDF Downloads 145
4220 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign it some predefined label. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors like human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. This way, it is not known mathematically whether a label in one repository is similar or different to the label in another repository. Hence, the primary goal of this research is to find the semantic differences between bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using the text features from the reported issues. The optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of labels pertaining to the different repositories.

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 200
4219 Hybrid Weighted Multiple Attribute Decision Making Handover Method for Heterogeneous Networks

Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz

Abstract:

Small cell deployment in 5G networks is a promising technology to enhance capacity and coverage. However, unplanned deployment may cause high interference levels and a high number of unnecessary handovers, which in turn result in increased signalling overhead. To guarantee service continuity, minimize unnecessary handovers, and reduce signalling overhead in heterogeneous networks, it is essential to properly model the handover decision problem. In this paper, we model the handover decision as a Multiple Attribute Decision Making (MADM) problem, specifically using the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS), and propose a hybrid TOPSIS method to control handover in heterogeneous networks. The proposed method adopts a hybrid weighting, a combination of entropy and standard deviation. A hybrid weighting control parameter is introduced to balance the impact of the standard deviation and entropy weightings on the network selection process and the overall performance. Our proposed method shows better performance, in terms of the number of frequent handovers and the mean user throughput, compared to existing methods.
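A hybrid-weighted TOPSIS of the kind described can be sketched as follows. The attribute matrix, the benefit/cost labels, and the equal split between entropy and standard-deviation weights (alpha = 0.5) are illustrative assumptions, not values from the paper:

```python
import numpy as np

def hybrid_topsis(D, benefit, alpha=0.5):
    """TOPSIS ranking with a hybrid entropy/standard-deviation weighting
    (alpha plays the role of the paper's control parameter).
    D: candidates x attributes matrix; benefit: True where larger is better."""
    # Vector normalization of each attribute column
    R = D / np.linalg.norm(D, axis=0)
    # Entropy weights: attributes with more dispersion get more weight
    P = D / D.sum(axis=0)
    ent = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(D))
    w_ent = (1 - ent) / (1 - ent).sum()
    # Standard-deviation weights
    sd = D.std(axis=0)
    w_sd = sd / sd.sum()
    # Hybrid weighting controlled by alpha
    w = alpha * w_ent + (1 - alpha) * w_sd
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness: higher = better candidate

# Hypothetical candidate cells: columns = [signal quality, load, delay]
D = np.array([[80.0, 0.3, 10.0],
              [70.0, 0.1, 15.0],
              [90.0, 0.8, 25.0]])
scores = hybrid_topsis(D, benefit=np.array([True, False, False]))
print(scores.round(3))  # the cell with the highest score is selected
```

Sweeping alpha from 0 to 1 shows how the control parameter shifts the ranking between the two weighting schemes.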

Keywords: handover, HetNets, interference, MADM, small cells, TOPSIS, weight

Procedia PDF Downloads 149
4218 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are only indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 144
4217 Effect of Climate Variability on Honeybee's Production in Ondo State, Nigeria

Authors: Justin Orimisan Ijigbade

Abstract:

The study was conducted to assess the effect of climate variability on honeybee production in Ondo State, Nigeria. A multistage sampling technique was employed to collect data from 60 beekeepers across six Local Government Areas in Ondo State. The data collected were subjected to descriptive statistics and multiple regression analysis. The results showed that 93.33% of the respondents were male, with 80% above 40 years of age. The majority of the respondents (96.67%) had formal education and 90% produced honey for commercial purposes. The results revealed that 90% of the respondents admitted that low temperature resulting from long periods of rainfall affected the foraging efficiency of the worker bees, 73.33% claimed that long periods of low humidity resulted in a low level of nectar flow, while 70% submitted that high temperature resulted in an improper composition of workers, drones and queen in the hive colony. The multiple regression results showed that beekeepers' experience, educational level, access to climate information, temperature and rainfall were the main factors affecting honeybee production in the study area. Therefore, beekeepers should be given more education on climate variability and its adaptive strategies to ensure better honeybee production in the study area.

Keywords: climate variability, honeybees production, humidity, rainfall and temperature

Procedia PDF Downloads 272
4216 A Framework for Designing Complex Product-Service Systems with a Multi-Domain Matrix

Authors: Yoonjung An, Yongtae Park

Abstract:

Offering a Product-Service System (PSS) is a well-accepted strategy that companies may adopt to provide a set of systemic solutions to customers. PSSs were initially provided in a simple form but now take diversified and complex forms involving multiple services, products and technologies. With the growing interest in the PSS, frameworks for PSS development have been introduced by many researchers. However, most of the existing frameworks fail to examine the various relations existing in a complex PSS. Since designing a complex PSS involves full integration of multiple products and services, it is essential to identify not only product-service relations but also product-product and service-service relations. It is equally important to specify how they are related, for a better understanding of the system. Moreover, as customers tend to view their purchase from a more holistic perspective, a PSS should be developed based on the whole system's requirements, rather than focusing only on the product requirements or service requirements. Thus, we propose a framework to develop a complex PSS that is fully coordinated with the requirements of both worlds. Specifically, our approach adopts a multi-domain matrix (MDM). An MDM identifies not only inter-domain relations but also intra-domain relations, so that it helps to design a PSS that includes highly desired and closely related core functions and features. Also, the various dependency types and rating schemes proposed in our approach help the integration process.
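The inter- and intra-domain blocks of an MDM can be illustrated with a toy matrix; the two products and two services below are hypothetical, not from the paper. The diagonal blocks are the familiar intra-domain design structure matrices (DSMs), and the off-diagonal block is the product-service domain mapping:

```python
import numpy as np

# A toy multi-domain matrix (MDM) for a PSS with two domains:
# products P1-P2 and services S1-S2 (hypothetical elements).
# An entry of 1 means a dependency exists between the row and column elements.
labels = ["P1", "P2", "S1", "S2"]
mdm = np.array([[0, 1, 1, 0],   # P1 depends on P2 and enables S1
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]])

n_prod = 2
# Intra-domain blocks (product-product and service-service DSMs)
# and the inter-domain block (product-service domain mapping matrix)
pp = mdm[:n_prod, :n_prod]
ss = mdm[n_prod:, n_prod:]
ps = mdm[:n_prod, n_prod:]
print("product DSM:\n", pp)
print("service DSM:\n", ss)
print("product-service map:\n", ps)
```

In practice each block would carry dependency types and ratings rather than binary flags, and clustering the MDM would surface the closely related core functions the framework aims to identify.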

Keywords: inter-domain relations, intra-domain relations, multi-domain matrix, product-service system design

Procedia PDF Downloads 641
4215 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT

Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez

Abstract:

Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved by using state-of-the-art deep learning methods. Another major challenge that ecologists face is counting a single animal multiple times due to the same animal reappearing in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps, so they can share the captured images along with timestamps, cumulative counts, and dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. This method has been validated in the field and can easily be extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.

Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management

Procedia PDF Downloads 138
4214 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) over low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. For scene classification, scenes contain scattered objects of different size, category, layout, number and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme, while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of the different CNNs at multiple scales, it is found that each CNN works better in a different scale range. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the scales are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted per scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 331
4213 Targeting Calcium Dysregulation for Treatment of Dementia in Alzheimer's Disease

Authors: Huafeng Wei

Abstract:

Dementia in Alzheimer’s Disease (AD) is the number one cause of dementia internationally, without effective treatments. Increasing evidence suggest that disruption of intracellular calcium homeostasis, primarily pathological elevation of cytosol and mitochondria but reduction of endoplasmic reticulum (ER) calcium concentrations, play critical upstream roles on multiple pathologies and associated neurodegeneration, impaired neurogenesis, synapse, and cognitive dysfunction in various AD preclinical studies. The last federal drug agency (FDA) approved drug for AD dementia treatment, memantine, exert its therapeutic effects by ameliorating N-methyl-D-aspartate (NMDA) glutamate receptor overactivation and subsequent calcium dysregulation. More research works are needed to develop other drugs targeting calcium dysregulation at multiple pharmacological acting sites for future effective AD dementia treatment. Particularly, calcium channel blockers for the treatment of hypertension and dantrolene for the treatment of muscle spasm and malignant hyperthermia can be repurposed for this purpose. In our own research work, intranasal administration of dantrolene significantly increased its brain concentrations and durations, rendering it a more effective therapeutic drug with less side effects for chronic AD dementia treatment. This review summarizesthe progress of various studies repurposing drugs targeting calcium dysregulation for future effective AD dementia treatment as potentially disease-modifying drugs.

Keywords: alzheimer, calcium, cognitive dysfunction, dementia, neurodegeneration, neurogenesis

Procedia PDF Downloads 182