
Java Spring Recreate specific Bean

Published 2020-02-27 11:53

Question:

I want to re-create (new up) a specific bean at runtime (without restarting the server) when certain DB changes occur. This is how it looks:

@Component
public class TestClass {

    @Autowired 
    private MyShop myShop; // bean to be refreshed at runtime

    @PostConstruct // DB listeners
    public void initializeListener() throws Exception {
        //...
        // code to get listeners config
        //...

        myShop.setListenersConfig(listenersConfig);
        myShop.initialize();
    }

    public void restartListeners() throws Exception {
        myShop.shutdownListeners();
        initializeListener();
    }
}

This does not work, because myShop is created by Spring as a singleton and its context is not refreshed unless the server is restarted. How can I refresh (create a new instance of) myShop?

One bad way I can think of is to create a new myShop object inside restartListeners(), but that does not seem right to me.

Answer 1:

DefaultListableBeanFactory exposes the public method destroySingleton("beanName"), so you can work with that. Be aware, however, that if you autowired the bean, the injecting object will keep the same instance it was wired with in the first place. You can try something like this:

@RestController
public class MyRestController {

    @Autowired
    SampleBean sampleBean;

    @Autowired
    ApplicationContext context;

    @Autowired
    DefaultListableBeanFactory beanFactory;

    @RequestMapping(value = "/")
    @ResponseBody
    public String showBean() throws Exception {

        SampleBean contextBean = (SampleBean) context.getBean("sampleBean");

        beanFactory.destroySingleton("sampleBean");

        // While sampleBean stays the same, contextBean gets recreated in the context.
        return "Compare beans " + sampleBean + " == " + contextBean;
    }
}

It is not pretty, but it shows how you can approach the problem. If you were dealing with a controller rather than a component class, you could inject the bean as a method argument and that would also work, because the bean is not resolved until it is needed inside the method, at least that's what it looks like. An interesting question is who else holds a reference to the old bean besides the object it was originally autowired into. Since it has been removed from the context, I wonder whether it still exists or gets garbage collected once the controller above releases it; if other objects in the context held references to it, the approach above would cause problems.
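The stale-reference behaviour described above can be reproduced without Spring. A minimal sketch (plain Java; the `Registry` class below is hypothetical and only stands in for the singleton bean factory):

```java
import java.util.HashMap;
import java.util.Map;

public class StaleReferenceDemo {

    // Hypothetical stand-in for a singleton-scoped bean factory.
    static class Registry {
        private final Map<String, Object> singletons = new HashMap<>();

        Object getBean(String name) {
            // Lazily create the singleton on first access.
            return singletons.computeIfAbsent(name, k -> new Object());
        }

        void destroySingleton(String name) {
            singletons.remove(name);
        }
    }

    public static void main(String[] args) {
        Registry registry = new Registry();

        // Reference captured once, like a field injection.
        Object injected = registry.getBean("sampleBean");

        registry.destroySingleton("sampleBean");

        // A fresh lookup after destruction yields a new instance...
        Object fresh = registry.getBean("sampleBean");

        // ...but the originally injected reference still points at the old object.
        System.out.println("injected == fresh: " + (injected == fresh)); // prints "injected == fresh: false"
    }
}
```

This is exactly why re-fetching the bean from the context inside a method works while a field-injected reference goes stale.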



Answer 2:

We have the same use case. As already mentioned, one of the main issues with re-creating a bean at runtime is updating the references that have already been injected; that is the main challenge.

To work around this issue I've used Java's AtomicReference<> class. Instead of injecting the bean directly, I wrap it in an AtomicReference and inject that. Because the object wrapped by the AtomicReference can be reset in a thread-safe manner, I can swap the underlying object when a database change is detected. Below is an example configuration and usage of this pattern:

@Configuration
public class KafkaConfiguration {

    private static final String KAFKA_SERVER_LIST = "kafka.server.list";
    private static AtomicReference<String> serverList;

    @Resource
    MyService myService;

    @PostConstruct
    public void init() {
        serverList = new AtomicReference<>(myService.getPropertyValue(KAFKA_SERVER_LIST));
    }

    // Just a helper method to check if the value for the server list has changed
    // Not a big fan of the static usage but needed a way to compare the old / new values
    public static boolean isRefreshNeeded() {

        MyService service = Registry.getApplicationContext().getBean("myService", MyService.class);    
        String newServerList = service.getPropertyValue(KAFKA_SERVER_LIST);

        // Arguably serverList does not need to be Atomic for this usage as this is executed
        // on a single thread
        if (!StringUtils.equals(serverList.get(), newServerList)) {
            serverList.set(newServerList);
            return true;
        }

        return false;
    }

    public ProducerFactory<String, String> kafkaProducerFactory() {

        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.CLIENT_ID_CONFIG, "...");

        // Here we are pulling the value for the serverList that has been set
        // see the init() and isRefreshNeeded() methods above
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, serverList.get());

        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    @Lazy
    public AtomicReference<KafkaTemplate<String, String>> kafkaTemplate() {

        KafkaTemplate<String, String> template = new KafkaTemplate<>(kafkaProducerFactory());
        AtomicReference<KafkaTemplate<String, String>> ref = new AtomicReference<>(template);
        return ref;
    }
}

I then inject the bean where needed, e.g.

public class MyClass1 {

    @Resource 
    AtomicReference<KafkaTemplate<String, String>> kafkaTemplate;
    ...
}

public class MyClass2 {

    @Resource 
    AtomicReference<KafkaTemplate<String, String>> kafkaTemplate;
    ...
}
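The core of the pattern is that every holder shares the same AtomicReference, so a single set() is visible through every injected copy. A minimal, Spring-free sketch of that mechanism (the `Holder` class is hypothetical, standing in for MyClass1/MyClass2; a String stands in for the KafkaTemplate):

```java
import java.util.concurrent.atomic.AtomicReference;

public class SharedRefDemo {

    // Hypothetical stand-in for a class that had the wrapper injected.
    static class Holder {
        final AtomicReference<String> template;

        Holder(AtomicReference<String> template) {
            this.template = template;
        }

        String current() {
            return template.get();
        }
    }

    public static void main(String[] args) {
        // One shared wrapper, "injected" into two holders.
        AtomicReference<String> ref = new AtomicReference<>("old-template");
        Holder a = new Holder(ref);
        Holder b = new Holder(ref);

        // Swapping the wrapped object in one place...
        ref.set("new-template");

        // ...is immediately visible through every injected reference.
        System.out.println(a.current()); // prints "new-template"
        System.out.println(b.current()); // prints "new-template"
    }
}
```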

In a separate class I run a scheduler thread that is started when the application context is started. The class looks something like this:

class Manager implements Runnable {

    private ScheduledExecutorService scheduler;

    public void start() {
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(this, 0, 120, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    @Override
    public void run() {
        try {
            if (KafkaConfiguration.isRefreshNeeded()) {

                AtomicReference<KafkaTemplate<String, String>> kafkaTemplate =
                    (AtomicReference<KafkaTemplate<String, String>>) Registry.getApplicationContext().getBean("kafkaTemplate");

                // Get a new instance here. This will have the new value for the
                // server list that was "refreshed".
                KafkaConfiguration config = new KafkaConfiguration();

                // The set() here replaces the wrapped object in a thread-safe manner,
                // so all injected instances now use the newly created object.
                kafkaTemplate.set(config.kafkaTemplate().get());
            }
        } catch (Exception e) {
            // Swallowing the exception keeps the scheduler alive; in practice, log it.
        }
    }
}

I am still on the fence about whether this is something I would advocate, as it does have a slight smell to it. But with limited and careful usage it does provide an alternative approach to the stated use case. Please be aware that, from a Kafka standpoint, this example leaves the old producer open; in reality you would need to flush() and close the old producer properly. But that's not what the example is meant to demonstrate.
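The leak mentioned above can be handled by swapping with getAndSet(), which atomically installs the replacement and hands back the old object for cleanup. A hedged, Spring/Kafka-free sketch of that idea (`FakeProducer` is hypothetical and only mimics a resource with flush/close; real code would drain and close the old producer here):

```java
import java.util.concurrent.atomic.AtomicReference;

public class SwapWithCleanupDemo {

    // Hypothetical producer-like resource with observable flush/close state.
    static class FakeProducer {
        boolean flushed = false;
        boolean closed = false;

        void flush() { flushed = true; }
        void close() { closed = true; }
    }

    // Atomically swap in the replacement, then clean up the previous instance.
    static FakeProducer swap(AtomicReference<FakeProducer> ref, FakeProducer replacement) {
        FakeProducer old = ref.getAndSet(replacement); // returns the object being replaced
        old.flush();
        old.close();
        return old;
    }

    public static void main(String[] args) {
        AtomicReference<FakeProducer> ref = new AtomicReference<>(new FakeProducer());

        FakeProducer old = swap(ref, new FakeProducer());

        System.out.println(old.flushed && old.closed); // prints "true"
        System.out.println(ref.get().closed);          // prints "false" (the new instance stays open)
    }
}
```

Note that in-flight callers may still be using the old object for a moment after the swap, so a real implementation would typically delay the close briefly or drain outstanding work first.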