Designing products using artificial intelligence

Document No.: 1602643    Publication date: 2020-01-07

Note: This technique, "Designing products using artificial intelligence," was created by E. S. Boyle and D. Sibley on 2018-04-17.

Summary: In one embodiment, a method for optimizing computer machine learning includes receiving an optimization objective. The optimization objective is used to search a database of basic option candidates (BOCs) to identify matching BOCs that at least partially match the objective. A selection of a selected basic option from among the matching BOCs is received. One or more machine learning predictive models are selected based at least in part on the objective to determine predicted values associated with alternative features of the selected basic option, wherein the model(s) are trained using training data to identify, at least for those models, weight values associated with the alternative features. At least a portion of the alternative features are ranked based on the predicted values to generate an ordered list. The ordered list is provided for use in manufacturing an alternative version of the selected basic option having the alternative feature(s) in the ordered list.

1. A method for optimizing computer machine learning, comprising:

receiving an optimization objective;

searching a database of basic option candidates using the optimization objective to identify one or more matching basic option candidates that at least partially match the optimization objective;

receiving a selection of a selected basic option from among the one or more matching basic option candidates;

determining, with one or more machine learning predictive models selected based at least in part on the optimization objective, predicted values associated with alternative features of the selected basic option, wherein the one or more machine learning predictive models are trained using training data to identify, at least for the one or more machine learning predictive models, machine-learned weight values associated with the alternative features;

ranking at least a portion of the alternative features based on the predicted values to generate an ordered list of the at least a portion of the alternative features for the selected basic option; and

providing the ordered list for use in manufacturing an alternative version of the selected basic option having one or more alternative features in the ordered list.

2. The method of claim 1, further comprising: identifying one or more components of the optimization objective, wherein the one or more components include at least one of an optimization type and a target segment.

3. The method of claim 1, wherein identifying one or more matching basic option candidates using the optimization objective is based at least in part on past performance data associated with the one or more matching basic option candidates.

4. The method of claim 1, wherein identifying one or more matching basic option candidates using the optimization objective is based at least in part on a category metric of the one or more matching basic option candidates.

5. The method of claim 1, wherein identifying one or more matching basic option candidates using the optimization objective is based at least in part on a deviation between an actual performance of the one or more matching basic option candidates and a performance of the one or more matching basic option candidates predicted by the one or more machine learning predictive models.

6. The method of claim 1, wherein the database of basic option candidates includes a product catalog and the selection of the selected basic option is made by a user.

7. The method of claim 1, wherein the utilization of the one or more machine learning predictive models comprises selecting training data based on the optimization objective.

8. The method of claim 1, wherein ranking at least a portion of the alternative features comprises: selecting the one or more machine learning predictive models based on the optimization objective to make predictions for a feature set that includes the at least a portion of the alternative features.

9. The method of claim 1, wherein the utilization of the one or more machine learning predictive models comprises: determining a combination of at least two features and identifying an associated machine-learned weight value for the combination of the at least two features.

10. The method of claim 1, wherein utilization of the one or more machine learning predictive models comprises supervised learning of the training data.

11. The method of claim 1, wherein the utilization of the one or more machine learning predictive models includes determining a role of an alternative feature in the predicted performance of a basic option.

12. The method of claim 1, further comprising: selecting at least a portion of the alternative features based on at least one of natural language processing and computer vision, wherein the alternative features are filtered based on eligibility for the selected basic option.

13. The method of claim 1, further comprising:

receiving a selection of at least one of the alternative features in the ordered list;

identifying one or more example basic options having the selected at least one of the alternative features; and

providing the one or more example basic options.

14. The method of claim 1, wherein the optimization objective comprises a predicted performance with respect to a segment.

15. The method of claim 1, wherein the alternative feature is selected for inclusion in the ordered list of alternative features based at least in part on sales metrics for a set of features that includes the alternative feature.

16. The method of claim 1, wherein the alternative feature is selected for inclusion in the ordered list of alternative features based at least in part on a ranking metric of a feature set that includes the alternative feature.

17. The method of claim 1, wherein the alternative feature is selected for inclusion in the ordered list of alternative features based at least in part on a category metric of an inventory having a set of features that includes the alternative feature.

18. The method of claim 1, further comprising: automatically generating a design for a product, wherein the product includes an alternative version of the selected basic option having one or more alternative features in the ordered list.

19. A system for optimizing computer machine learning, comprising:

a communication interface configured to:

receiving an optimization objective; and

receiving a selection of a selected basic option from among matching basic option candidates; and

a processor configured to:

determining, with one or more machine learning predictive models selected based at least in part on the optimization objective, predicted values associated with alternative features of the selected basic option, wherein the one or more machine learning predictive models are trained using training data to identify, at least for the one or more machine learning predictive models, machine-learned weight values associated with the alternative features;

ranking at least a portion of the alternative features based on the predicted values to generate an ordered list of the at least a portion of the alternative features for the selected basic option; and

providing the ordered list for use in manufacturing an alternative version of the selected basic option having one or more alternative features in the ordered list.

20. A computer program product for optimizing computer machine learning, the computer program product being embodied in a non-transitory computer readable storage medium and comprising computer instructions for:

receiving an optimization objective;

searching a database of basic option candidates using the optimization objective to identify one or more matching basic option candidates that at least partially match the optimization objective;

receiving a selection of a selected basic option among the matched basic option candidates;

determining, with one or more machine learning predictive models selected based at least in part on the optimization objective, predicted values associated with alternative features of the selected basic option, wherein the one or more machine learning predictive models are trained using training data to identify, at least for the one or more machine learning predictive models, machine-learned weight values associated with the alternative features;

ranking at least a portion of the alternative features based on the predicted values to generate an ordered list of the at least a portion of the alternative features for the selected basic option; and

providing the ordered list for use in manufacturing an alternative version of the selected basic option having one or more alternative features in the ordered list.

Background

Designing a product that meets one or more objectives can be difficult. For example, it can be challenging to determine which aspects of a product contribute to its success. The design for a particular success metric may also vary; that is, success can be defined by different measures or dimensions. Conventional techniques for product design often rely on the intuition of a human designer. However, human designers are often unable to fully evaluate all of the data that may be collected about features that could be added to a product or with which a product could be modified. Conventional computer-aided product design tools are unable to utilize such data to guide designers in making optimal design decisions. For example, conventional product design tools often lack flexibility with respect to design or performance goals.

Drawings

Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.

FIG. 1 is a flow diagram illustrating an embodiment of a process for selecting and providing products.

FIG. 2 is a block diagram illustrating an embodiment of a system for selecting and providing products.

FIG. 3 is a flow diagram illustrating an embodiment of a process for supervised machine learning to train one or more predictive models.

FIG. 4 is a flow diagram illustrating an embodiment of a process for computer-generated design of a product, including basic option determination and feature determination.

FIG. 5 is a flow diagram illustrating an embodiment of a process for identifying basic options.

FIG. 6 is a flow diagram illustrating an embodiment of a process for presenting an example product containing alternative features.

FIG. 7 is an example of a GUI for generating a product design.

FIG. 8 illustrates an example GUI of a design tool for generating a product design.

FIG. 9 illustrates an example GUI of a design tool for generating a product design.

FIG. 10 is an example of a GUI for receiving optimization objectives for generating a product.

FIG. 11 is an example of a GUI for displaying one or more base options and receiving a selection of a base option for generating a product.

FIG. 12 is an example of a GUI for providing alternative feature selection options and receiving a selection of one or more alternative features for generating a product.

FIG. 13 is an example of a GUI for providing alternative feature selection options and receiving a selection of one or more alternative features for generating a product.

FIG. 14 is an example of a GUI for providing sub-feature selection options and receiving a selection of one or more sub-features for generating a product.

FIG. 15 is an example of a design table associated with a computer-generated product.

FIG. 16 is an example of a GUI for generating product previews.

FIG. 17 is a functional diagram illustrating a programmed computer system for generating a product design, in accordance with some embodiments.

Detailed Description

The invention can be implemented in numerous ways, including as a process; a device; a system; composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless otherwise specified, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general purpose component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term "processor" refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.

A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.

Product designs using artificial intelligence are disclosed. New products are typically designed to include product features that are selected based on the intuition of the designer. However, given the technological advances in machine learning and artificial intelligence, as well as the big data analysis functionality, new products can be designed based on a huge amount of empirical data, at least partially using machine learning and artificial intelligence, which would not be possible to analyze without the assistance of a computer.

In various embodiments, optimizing computer machine learning to generate a product design includes receiving an identification of an optimization objective. For example, the objective may include design and/or performance goals. In some embodiments, an example of a performance goal is "design a new product that will be commercially successful for customers in Customer Segment 1 during winter." A customer segment may be defined by a shared characteristic among the customers that fall within the segment. Customer segments may also be defined by shared observed preferences. An example of a design goal is one or more features that must be included in a product design, such as "design a new product containing polka dots." The products generated by the processes described herein may be based on a base option, which may be defined by a set of one or more features. In some embodiments, starting from a blank template, one or more features may be added to define the base option. Using the example of "design a new product containing polka dots," the base option designates "polka dots" as an included feature. In some embodiments, one or more features may be swapped and/or added starting from the base option. For example, the base option may be a base body and/or may be defined by appropriate specifications. The base options may be styles in inventory (e.g., existing styles), and features may be modified from the base options.

The optimization objective may be used to search a database of base option candidates to identify one or more matching base option candidates. For example, an inventory of base options is searched to automatically identify candidates that best match the optimization objective. In response to providing the candidate base options, a selection of a base option may be received. The selection may be made by a user (e.g., a human designer) from among the matching base options. This represents human cooperation with machine learning, also referred to as "human-in-the-loop machine learning." The selected base option may be used as the basis for the resulting artificial-intelligence-aided product design. For example, one or more attributes/features of the base option may be replaced by corresponding alternative feature(s), and/or one or more new alternative attributes/features may be added to the base option with the assistance of machine learning/artificial intelligence, to create an even better product design with respect to the optimization objective. Thus, a product that is known to have been successful may be iteratively improved by selecting it as a base option rather than starting with a blank slate of product characteristics.

To determine attributes/features to be added or replaced in the base option, one or more machine learning predictive models selected based at least in part on the optimization objective are utilized to determine predicted values associated with alternative features of the selected base option. The one or more machine learning predictive models are trained using training data to identify at least machine-learned weight values associated with the alternative features for the one or more models. For example, an alternative feature or combination of features may be provided to a trained machine learning model to determine a target-optimization prediction score for that feature or combination of features relative to the optimization objective and/or the selected base option. Based on the predicted values, at least a portion of the alternative features may be ranked to generate an ordered list of at least a portion of the alternative features for the selected base option. For example, the target-optimization prediction scores for the features may be used to rank the features according to a prediction of how much including each feature in the base option will positively affect the optimization objective. The ordered list can be provided for use in manufacturing an alternative version of the selected base option having one or more alternative features in the ordered list. For example, one of the alternative features is selected for inclusion in the base option due to its high ranking in the ordered list, and a design specification identifying the selected alternative feature may be generated for use in manufacturing a resulting product that is an alternative version of the selected base option.
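
By way of illustration only, the following sketch shows one way the ranking step described above could be implemented. The model objects and their predict_score method are hypothetical placeholders for the trained machine learning predictive models; they are not an API defined by this disclosure.

```python
from typing import List, Tuple


def rank_alternative_features(
    base_option_features: List[str],
    candidate_features: List[str],
    models: list,                 # trained predictive models selected for the objective
    model_weights: List[float],
    top_k: int = 10,
) -> List[Tuple[str, float]]:
    """Score each candidate alternative feature and return the top_k as an ordered list."""
    scored = []
    for feature in candidate_features:
        # Evaluate the base option with the candidate feature added/swapped in.
        trial_features = base_option_features + [feature]
        # Combine the predictions of the selected models using their weights.
        # `predict_score` is a hypothetical trained-model method, not a real API.
        prediction = sum(
            weight * model.predict_score(trial_features)
            for model, weight in zip(models, model_weights)
        )
        scored.append((feature, prediction))
    # A higher predicted value means the feature better optimizes the objective.
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_k]
```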

FIG. 1 is a flow diagram illustrating an embodiment of a process for selecting and providing products. In some embodiments, at least some of the products selected and offered to the customer may be products generated according to the processes described herein. The process of fig. 1 may be at least partially implemented on one or more components of the system 200 shown in fig. 2. For example, the process may be performed by the data platform 204 and recommendation engine 202 with respect to the database 206 and design tools 208. In some embodiments, the process of FIG. 1 is performed by processor 1702 of FIG. 17.

At 102, a request for registration is received. The request for registration may be received from a potential customer requesting a recommendation and/or a product. The customer may register with a product selection and distribution system, such as the system of FIG. 2. Upon registration, the potential customer becomes a customer and information about the customer may be stored. Products may be provided to the customer one time, or may be provided on a periodic/subscription basis. Products may be selected for a customer based on the customer's preferences, which may be learned over time. In some embodiments, a product selected for the customer may be a product designed with the assistance of artificial intelligence/machine learning by modifying an existing base option/product to optimize an optimization objective. As part of registration, the customer may provide information about his or her preferences. For example, the customer may provide information directly or indirectly. Information may be provided through a personalized app or a third-party style or messaging platform. This information may be stored in a database, such as database 206 of FIG. 2.

At 104, customer attributes are determined based on the request for registration. Customer attributes may include objective attributes such as biographical information, measurements, and customer segment. Customer attributes may include subjective attributes such as preferences for style, fit, color, designer/brand, and budget. For example, a customer may rate specific styles, prints, and/or attributes, including those in inventory and products from other providers. Information may be collected through third-party apps or platforms, such as apps that allow users to indicate interest and/or share interest in products with other users. Customer attributes may be collected when a customer registers with the system. For example, the customer may complete a survey regarding his or her measurements (height, weight, etc.), lifestyle, and preferences. This information may be stored to a customer profile. Customer attributes may also be determined from social media and accounts linked by the customer, such as Pinterest®, Instagram®, Facebook®, LinkedIn®, etc.

The customer attributes may be updated as the customer makes purchases and provides feedback regarding the products. For example, a customer profile may be updated. The customer may provide feedback in a variety of formats, including completing surveys about one or more products, composing product reviews, posting social media posts, and so forth. The products recommended to the customer may be adapted to the customer's changing attributes and tastes. In one aspect, a computer system and/or stylist may learn a customer's tastes over time.

In various embodiments, customer attributes may be determined based on summaries about other users sharing characteristics with a particular customer. A summary about a customer group may be made from individual customer attributes. Customers may be grouped by any characteristic including gender, body type, shared preferences (e.g., a measure of similarity between customers, such as similarity in objective or subjective attributes of the customers or learned product preferences), and the like.

At 106, product options are determined based on the customer attributes. Product options may be determined by processing the customer attributes to select a subset of products from all products in inventory. Product options may be provided to the stylist. In various embodiments, rather than providing all of the product options directly to the customer, the stylist first selects a product from among the product options to provide to the customer.

At 108, a product selection is received from the stylist based on the product options. A stylist (e.g., a human user) selects a product selection from the product options. The product selection may then be provided to the customer. Suppose a customer is looking for a jacket. One or more jackets may be automatically selected from the inventory based on attributes of the customer. Instead of providing the jacket directly to the customer, the stylist selects a subset of the jacket to provide to the customer. Statistical information about the product selection may be stored, such as whether an item was selected as part of the product selection, when the item was selected as part of the product selection, for whom/what type of customer the item was selected, and so forth.

At 110, the product selection is provided to the customer. A shipment of items may be provided to the customer. The customer may then decide to retain or return one or more of the items in the shipment. If the customer decides to retain an item, the customer purchases the item. Statistics may be stored about the items, such as whether they were retained or returned, when they were retained or returned, and who/what type of customer retained or returned the item.

At 112, feedback regarding the product selection is received. The customer may provide feedback regarding the product selection, such as why the customer retained or did not retain one or more items in the product selection. The customer may provide feedback in a variety of formats, including completing a survey, composing a product review, posting a social media post, and the like. The feedback may be stored and associated with the customer and/or the item. In various embodiments, the feedback may be used to design a product that is likely to appeal to a particular customer base or meet optimization goals.

At 114, information regarding product selection and/or feedback is stored. For example, the item's tracked sales metrics are updated. In some embodiments, sales and feedback information associated with the project may be stored in a database, such as database 206 of FIG. 2.

FIG. 2 is a block diagram illustrating an embodiment of a system for selecting and providing products. In some embodiments, the product selected and offered to the customer may be a product generated according to the processes described herein. The example system 200 shown in FIG. 2 includes a recommendation engine 202, a data platform 204 and database 206, and a design tool 208. Each of these components may be communicatively coupled via a network 250.

The recommendation engine 202 may be configured to employ adaptive machine learning to provide recommendations to stylists who select items for customers from an inventory of items. For example, the system may use machine learning trained models to score products. The stylists may be provided with the highest-scoring products. A stylist (e.g., a human) then selects one or more of the highest-scoring products to be offered to the customer. The customer may purchase/retain the product and/or provide feedback regarding the product. This feedback can be used to improve the machine learning trained models.

The data platform 204 may be configured to coordinate the operations of the recommendation engine 202, the database 206, and the design tools 208. For example, when data is generated by customer, stylist, designer, and/or vendor interactions with system 200, data platform 204 may determine what information is to be stored. For example, the data platform may store data as part of a training data set for machine learning, as further described herein. The data platform may be configured to perform processes described herein, such as the process shown in FIG. 3. In various embodiments, design tool 208 may be communicatively coupled to data platform 204, and data platform 204 may be configured to perform the processes shown in FIGS. 4-6 based on input received at design tool 208.

The database 206 may be configured to store information about customers, products, sales data, performance metrics, and machine learning models. The product information may include data associated with a product or group of products. The product information may include objective attributes of the product, such as Stock Keeping Unit (SKU), item type, item properties (e.g., color, pattern, material), and the like. The product information may include subjective attributes of the product, such as suitability for a body type, season, and the like. Product attributes may be identified by human or machine. The product information may include a representation of the product, such as text, images, video, or other forms of data. In some embodiments, information about an item may be stored with associated information, such as customer feedback about the item. In some embodiments, information about an item may be stored with statistical information such as: sales metrics (e.g., statistical information related to sales of an item or group of items), inventory metrics (e.g., statistical information related to inventory, such as the number of units in inventory), and category metrics (e.g., measures of inventory diversity and related information such as potential markets). In various embodiments, information about an item may be stored with one or more associated ratings, such as: a style rating (e.g., a measure of customer satisfaction with the style of the item), a size rating (e.g., a measure of the accuracy of the identified size of the item), a fit rating (e.g., a measure of how satisfied the customer is with the fit of the item), a quality rating (e.g., a measure of customer satisfaction with the quality of the item), a retention measure (e.g., a measure of the likelihood that the product will result in a future purchase by the customer), a personalization measure (e.g., a measure of how well the personality and uniqueness of the item match the customer), a style grouping measure (e.g., the likelihood that the item is classified in a particular group), a price value rating (e.g., a rating of the value of the item relative to its price), and so forth. In various embodiments, an item may be stored with an aggregate score for the style that combines appropriately weighted values of any or all of the aforementioned metrics.

In various embodiments, database 206 stores information about how many units each item has in inventory. Supply chain information may be stored, such as how many units of items have been ordered, when they are expected to be received to replenish the inventory of the items, and so forth.

As described herein, alternative features of a base option may be evaluated based on their performance relative to the metrics. If an alternative feature is predicted to perform well for the metric selected for the overall optimization objective, the alternative feature may be ranked highly. The evaluation metrics may correspond to one or more machine learning models that quantify evaluation metric values for different feature sets, in order to evaluate whether replacing/adding an alternative feature in the feature set better achieves the optimization objective.

In various embodiments, the alternative features may be selected based at least in part on collaborative filtering and/or customer segmentation. For example, alternative features may be selected based on the likelihood that a style will fall within a cluster (e.g., a potential market). To determine whether a style will fall into a cluster, the set of features that make up the style may be analyzed to determine whether the set will cause the style to be classified in a particular way (e.g., whether it will be classified in a particular style category). The cluster may be based on feedback, such as feedback from third-party apps or platforms.

The machine learning models may include trained models generated from a machine learning process, such as the process of FIG. 3. The trained models may be classified by type, such as sales model, inventory model, category model, and the like. For each category of model, a model may be generated for each of one or more segments, such as segments based on one or more of: target body type, target seasonality, target accounting quarter, target customer type or line of business (e.g., female, male, child), target lifestyle, target product type (e.g., jacket, dress, pants), target style (e.g., forward, city, northwest pacific), and the like. A model may correspond to a particular segment, such as a customer segment, a time period, and so forth. For example, a first sales model may be directed to the sales performance of products for a first body type, such as a petite customer segment, and a second sales model may be directed to the performance of products for a second body type, such as a tall customer segment.

In some embodiments, a machine learning trained sales model may be utilized to predict sales of products having features indicated to the model. Past sales data may be used to train sales models.

In some embodiments, a machine learning trained inventory model may be utilized to predict inventory metrics associated with products having features indicated to the model. The inventory model may be trained using sales data, current inventory information, past inventory information, and the like.

In some embodiments, a machine learning trained category model may be utilized to predict the likelihood that a product having the features indicated to the model would desirably be added to the inventory of the product offering in order to achieve a desired distribution of inventory categories.

In one aspect, a category model may be used to identify the value of categories in inventory, even where the specific customers to whom a style is directed cannot usually be identified. The category model may weight products according to how well they meet demand globally. The category model may help expand the overall potential market. That is, even if a product performs poorly in the context of current customers, the product may perform well in the context of potential/future customers. On the other hand, a customer may prefer to have more color choices even though he or she tends to purchase only one color. The customer may value the increased likelihood that the jacket he or she purchases in his or her color is unique, because the jacket is offered in a number of color choices. The category model may be trained using a higher-level indication of the desired allocation of inventory categories. For example, machine learning training may be utilized to determine a higher-level model for ideal inventory allocation based on higher-level product categories, and the higher-level model may be used to train a category model that determines ideal inventory allocation based on product features. For example, a particular jacket style may be provided in three colors. While one of these colors may not sell as well as the other two, providing the third color as an option may represent value in the inventory.
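
As a non-limiting illustration, a category metric of the kind described above might be sketched as follows. The desired-allocation table and the gap measure (sum of absolute share differences) are assumptions made for the example, not the disclosed implementation.

```python
from collections import Counter
from typing import Dict, List


def allocation_gap(inventory_categories: List[str],
                   desired_allocation: Dict[str, float]) -> float:
    """Sum of absolute differences between actual and desired category shares."""
    total = len(inventory_categories) or 1
    counts = Counter(inventory_categories)
    return sum(
        abs(counts.get(category, 0) / total - share)
        for category, share in desired_allocation.items()
    )


def category_score(candidate_category: str,
                   inventory_categories: List[str],
                   desired_allocation: Dict[str, float]) -> float:
    """Positive score: adding the candidate moves inventory toward the desired allocation."""
    before = allocation_gap(inventory_categories, desired_allocation)
    after = allocation_gap(inventory_categories + [candidate_category],
                           desired_allocation)
    return before - after
```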

In various embodiments, other models may be utilized. Example models include: models for style rating, size rating, fitness rating, quality rating, retention, personalization, style grouping, and price value rating, as further described herein with respect to fig. 2.

Design tool 208 may be configured to employ adaptive machine learning to assist designers in designing items for customers according to their tastes. The designed items, which may be a combination of a base option and one or more features, may be among the items that a stylist may choose to offer to a customer. The design tool may be configured to perform the processes described herein to design a product that incorporates predicted-success features selected to meet an optimization objective, as further described herein. For example, a designer may use design tool 208 to create a product. The selection of one or more features that make up the product may be based on an optimization objective, such as increasing sales rate (e.g., as measured by units sold), selection rate (e.g., as measured by how frequently the item is selected to be provided for sale), attractiveness to particular market segments, performance for particular seasons, and so forth, as further described herein. Thus, the product may be the result of combining a base option with alternative features selected by machine learning/artificial intelligence, where the alternative features are those features that are automatically determined to be among the best options for meeting the optimization objective. For example, the alternative features may be ranked according to how well each alternative feature satisfies the objective, and the ten best features (or another threshold number) may be selected and presented for use as alternative or additional features for the base option to create the product. In other words, the product is the base option combined with one or more alternative features or alternative feature combinations.

In various embodiments, the design tool may be configured to perform the processes described herein (e.g., the processes shown in fig. 4-6) to design a product. Design tool 208 can receive input, automatically make design recommendations using machine learning/artificial intelligence, and generate output design specifications. The output of the design tool may be provided to a vendor to manufacture a product according to the specifications of the output.

For example, to at least partially automate design of a product, the system aggregates data collected from customers, stylists, and/or designers. The data platform may use a machine learning process described further herein to build one or more trained models. The training data used to train the models may be based on the behavior of customers, stylists, and/or designers, as stored over time in a customer database, recommendation database, and/or inventory database. When a designer provides an optimization objective via the design tool, the system selects and provides one or more base options to the designer. The designer may select one of the base options. In response, the system can utilize the models to automatically identify one or more alternative features (or combinations of alternative features) for the base option. The base option may be modified or augmented based on the alternative features. For example, if the base option is a jacket, the alternative features may identify alternatives for or types of neckline, sleeve length, skirt length, and the like. The base option may also be automatically selected by the system based on the processes described herein (e.g., the process of FIG. 5).

FIG. 3 is a flow diagram illustrating an embodiment of a process for supervised machine learning to train one or more predictive models. The process of FIG. 3 may be at least partially implemented on one or more components of the system 200 shown in FIG. 2. In some embodiments, the process of FIG. 3 is performed by processor 1702 of FIG. 17. In some embodiments, the designer interacts with the process of FIG. 3 using GUI 700 of FIG. 7.

At 302, training data is collected and prepared. In supervised machine learning, a predictive model may be trained with training data to perform predictions based on information "learned" from the training data. The collected data may also include validation data to validate the accuracy of the trained predictive model.

The training data may be derived from data about items stored in database 206 of FIG. 2. Different predictive models may be trained for different predictive model classes or segments. To train each of the different models, different training data sets may be collected specifically for the different models to be trained. For example, past performance data associated with the metric predicted by a particular type of model is collected for various different segments, and a different model of that particular type may be trained for each of various segment combinations.

The trained models to be generated may be classified by type, such as sales model, inventory model, category model, rating model, and the like. For each category of model, a model may be generated for each of one or more customer segments, such as segments based on one or more of: target body type, target seasonality, target accounting quarter, target customer type or line of business (e.g., female, male, child), target lifestyle, target style (e.g., forward, city, northwest pacific), target product type (e.g., jacket, dress, pants), and the like.

In some embodiments, a sales model type of predictive model may be utilized to predict the success or sales of a product having characteristics indicated to the model. The training data collected to train the sales type model includes data associated with past sales performance of the product, as well as associated information about the particular product (e.g., characteristics of the product) and the particular sales.

In some embodiments, a predictive model of an inventory model type may be utilized to predict the likelihood of whether a product having characteristics indicated to the model will ultimately be provided to a customer. The training data collected to train the inventory type model includes data associated with past selection performance of the product (e.g., the stylist's selection rate), as well as associated information about the particular product (e.g., characteristics of the product) and the particular selection.

In some embodiments, a class model type of prediction model may be utilized to predict the following likelihood: whether a product having the characteristics indicated to the model would be desirably added to the inventory of the product offering to achieve the desired distribution of inventory categories. The training data collected for training the category type model includes data associated with a higher-level indication of a desired allocation of inventory categories. For example, machine learning training may be utilized to determine a higher level model for an ideal inventory allocation based on a higher level product category and train a category model using the higher level model to determine a model for the ideal inventory allocation based on product characteristics.

At 304, supervised machine learning features and parameters are selected. For example, a user may set control parameters for the various machine learning algorithms that will be used to train the models. Feature selection refers to selecting individually identifiable properties of an item to serve as machine learning features. Features and parameters may be selected based on the purpose of the trained model. Examples of features for an item of apparel include type (e.g., jacket, dress, pants), silhouette (e.g., shape of the garment), print (e.g., pattern on the fabric), material, hemline, sleeve, and the like. An example of features is described with respect to FIG. 8. An identification of a feature may be received. The selection of features to be utilized in the predictive model may be defined at least in part by a human user, or may be at least in part automatically determined. For example, human or artificial intelligence may define the features of the predictive model to be trained.

In various embodiments, the features may be based at least in part on natural language processing (NLP). For example, the computer system may extract information from text according to NLP techniques. The NLP system may analyze text generated by and about the customer, such as product reviews, review forms, social media, emails, and the like, to determine customer preferences. For example, when a customer receives an item, the customer may provide feedback (e.g., text) (e.g., at 112 of FIG. 1). The feedback provided by the customer can be processed using NLP techniques to extract features. NLP techniques include rule-based engines, clustering, and classification in order to make determinations about product characteristics that may be treated as features. Features may be identified through machine learning, computer vision, or NLP and recommended for inclusion in a product design. In various embodiments, term frequency-inverse document frequency (TF-IDF), Latent Dirichlet Allocation (LDA), collocation analysis, and the like may be used to create low-dimensional representations of styles or to generate words or phrases that represent styles. Various machine learning methods may then use these features to predict the metrics/optimization objectives. The features that predict the optimization objective may then be related back to representative styles to convey the concept to the designer and/or manufacturer.
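
For illustration, a minimal sketch of deriving such text features with off-the-shelf scikit-learn components (TF-IDF and LDA) follows; the feedback strings and parameter choices are made up for the example and are not from this disclosure.

```python
# Illustrative only: low-dimensional text features from customer feedback.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

feedback = [
    "love the polka dot print but the sleeves run long",
    "great fit, would prefer a shorter hem for summer",
    "the jacket color is perfect for winter",
]

# TF-IDF terms can serve directly as sparse style features.
tfidf = TfidfVectorizer(max_features=500, stop_words="english")
tfidf_features = tfidf.fit_transform(feedback)

# LDA topics give a dense, low-dimensional representation of each style's text.
counts = CountVectorizer(stop_words="english").fit_transform(feedback)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_features = lda.fit_transform(counts)   # shape: (n_documents, n_topics)
```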

In various embodiments, the features may be selected based at least in part on computer vision. For example, information about an item (such as an item stored in database 206) may have an image representation. The computer system may extract information from images or videos of the item according to computer vision techniques. The computer vision system may identify objects, object properties, commonalities, or generalizations about groups of objects to identify patterns and features of the objects. Here, the computer vision system may identify common attributes between items and identify them as features. The computer vision system may identify features of a product that are not describable by a human. Using the example of a fabric, a computer vision system may identify a particular print that a human may not recognize as being common between two pieces of apparel. In some embodiments, the computer vision system may allow quantification of the distance between various printed patterns. In various embodiments, features may be found by creating a unique clustering space using color values. Color values can be created by edge detection and by defining print dimensions and contrast. Edge detection may be used to provide a distance measure between patterns by quantifying the "busyness" of a pattern. In some embodiments, color values may be defined using a neural network, convolution, or the like. In some embodiments, color values may be extracted from image data without a neural network. For example, the optimization objective may be predicted using latent dimensions, principal components of the latent dimensions, and/or clusters (k-means) within the dimensions. In various embodiments, the features may be based at least in part on a neural network. The optimization objective can be predicted using numerical color descriptions of color labels resulting from clustering within the RGB or LAB color space.
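
As a rough, non-authoritative sketch, dominant-color features and an edge-based "busyness" measure of the kind described above could be computed as follows. K-means over raw RGB pixels and a mean-gradient proxy are assumptions chosen for brevity; LAB conversion or learned embeddings could be substituted.

```python
import numpy as np
from sklearn.cluster import KMeans


def dominant_colors(image_rgb: np.ndarray, k: int = 3) -> np.ndarray:
    """Cluster pixels (RGB here; LAB if converted beforehand) and return the
    k cluster centers as a color feature vector."""
    pixels = image_rgb.reshape(-1, 3).astype(float)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    return km.cluster_centers_          # shape (k, 3)


def busyness(image_gray: np.ndarray) -> float:
    """Mean gradient magnitude as a rough proxy for how 'busy' a print is;
    it also yields a simple distance measure between patterns."""
    gy, gx = np.gradient(image_gray.astype(float))
    return float(np.mean(np.hypot(gx, gy)))
```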

At 306, one or more models are trained. The trained model may predict/determine the performance of a product having a set of features provided as input to the trained model. The trained models may be classified by type, such as sales models, inventory models, category models, rating models, and the like. For each category of model, a model may be generated for one or more sections. For example, the trained model may receive features or feature combinations as inputs and predict/score performance metrics such as sales metrics, inventory metrics, category metrics, style ratings, size ratings, fitness ratings, quality ratings, retention, personalization, style grouping, and price value ratings.

In various embodiments, multiple models are trained, and each model corresponds to a respective performance metric. For example, a sales model is trained to determine sales metrics, an inventory model is trained to determine inventory metrics, and a rating model is used to determine style ratings, size ratings, fitness ratings, quality ratings, retention, personalization, style grouping, and price value ratings. The model may be trained using a training data set, where the training data set corresponds to particular classes and segments.

The sales model may score the input against the sales metrics of the input. Sales models can be used to predict what features will have a high rate of sales metrics. For example, a trained sales model may include information about past sales of a product in which features of the product are known. The inventory model may score the input against the input inventory metrics. The inventory model may predict the likelihood that a feature will be provided to a customer. The category model may score the input with respect to its value in providing categories or diversity to the inventory. The rating model may score the input relative to its predicted rating. By training with data for only a particular section, the model can be specific to that particular section. For example, a model for a particular segment may predict whether a particular combination of features will sell well for customers of a particular client segment.

The models may be trained according to supervised learning or other machine learning techniques. In supervised learning, the objective is to determine the weights of features in a function that optimizes a desired result, where the function is a representation of the relationships between the features. In the training process, the weights associated with the model features are determined via training. That is, the contribution of each feature to the prediction for a combination of features is determined. In various embodiments, the model may be trained using a mixed-effects model that takes into account several features, some of which may be dependent. The model may be trained by a ridge regression that attributes the effect to particular features.
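
For illustration, a minimal ridge-regression sketch of this training step follows, using scikit-learn. The one-hot encoding via DictVectorizer and the toy sales figures are assumptions for the example, not the disclosed training pipeline.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import Ridge

# Made-up training items and a made-up sales metric per item.
training_items = [
    {"type": "jacket", "print": "polka_dot", "sleeve": "long"},
    {"type": "jacket", "print": "solid", "sleeve": "short"},
    {"type": "dress", "print": "polka_dot", "sleeve": "short"},
]
sales_metric = [120.0, 80.0, 150.0]   # e.g., units sold per item

vectorizer = DictVectorizer(sparse=False)
X = vectorizer.fit_transform(training_items)   # one-hot feature matrix

model = Ridge(alpha=1.0).fit(X, sales_metric)

# Per-feature weights: the learned contribution of each feature to the prediction.
feature_weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_))
```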

In some embodiments, when training the model, the attribution of each feature to the output of the function is determined. In some embodiments, a feature represents a combination of features. For example, when an individual feature is combined with another feature, the individual feature may have different weights. A feature or set of features may define a base option. As more inputs are provided to the model, the output of the function becomes closer to the target or validation result.

In various embodiments, the model may be evaluated after the model has been trained. The error of the model is the difference between the actual and modeled performance. In some cases, a well-trained model may nevertheless differ from the actual results. In such a case, the product may have aspects that make it perform better than expected. For example, the product may perform better than predicted by the trained model. The unexplained factor in the product's success is one such aspect. As described further herein, the aspect can be utilized by incorporating the aspect into a product.

As a result of the training, the trained sales/success model can predict the performance/success of a combined feature set in a product. For example, given two sets of features that differ only in color, the model can attribute the difference in predicted success to the difference in the color feature. Because the trained model is able to attribute success to a particular alternative feature or combination of alternative features, the predictions made by the trained model may be used to identify the alternative features that best match the optimization objective for inclusion in the product.

At 308, the trained model(s) are applied. The trained model(s) can be applied to make recommendations regarding features to be included in the product design. Thus, the trained model(s) may recommend one or more alternative features to augment the base option (e.g., an existing product utilized as a design starting point). That is, the product includes one or more attributes that are identified, using the trained model, as features desirably included in the product. As described further herein, the trained models can also be used to make recommendations regarding one or more base options. For example, a search space of base options may be explored using a trained model to select a well-performing base option (e.g., according to the optimization objective). The results of a trained model may be weighted and combined with the weighted results of other trained models to rank the alternative features.

FIG. 4 is a flow diagram illustrating an embodiment of a process for computer-generated design of a product, including basic option determination and feature determination. The process of FIG. 4 may be at least partially implemented on one or more components of the system 200 shown in FIG. 2. In some embodiments, the process of FIG. 4 is performed by processor 1702 of FIG. 17. In some embodiments, the designer interacts with the process of FIG. 4 using GUI 700 of FIG. 7.

At 402, an optimization objective is received. The optimization objective may describe design goals and/or performance goals for the product. The optimization objective describes the target result (e.g., design elements or performance) that the product is desired to achieve, and may be used as a basis for evaluating base options and/or alternative features. An example of an optimization objective is 860 of FIG. 8. In some embodiments, an example of a performance goal is "design a new product that will be commercially successful for customers in a segment during winter." An example of a design goal is one or more features that must be included in a product design, such as "design a new product containing polka dots."

The optimization objective may include one or more objective components. In some embodiments, an objective component may identify one or more features, such as color, print, sleeve length, skirt length, and the like. In some embodiments, an objective component may identify one or more optimization types, such as a sales metric (e.g., a goal of designing a product predicted to achieve the highest sales), an inventory metric (e.g., a goal of designing a product predicted to achieve the highest rate of selection for presentation to a customer), a category metric (e.g., a goal of designing a product predicted to optimize product allocation in inventory), a style rating, a size rating, a fit rating, a quality rating, retention, personalization, style grouping, a price value rating, and so forth. An objective component may have a target segment. In various embodiments, a target segment refers to a current segment and/or a future segment or area into which to expand. Example target segments include: target business line (e.g., female, male, child), target product type (e.g., jacket, dress, pants), customer segment, seasonality (e.g., spring/summer, fall/winter), and the like. An optimization objective may be evaluated for a combination of optimization type and segments, e.g., optimizing sales for customers of a certain customer segment during a target product season. The optimization objective may be received via a GUI, such as GUI 700 of FIG. 7.

In various embodiments, a user may provide one or more objective components. For example, the user may select from among several objective component options. In various embodiments, one or more objective components may be determined by parsing a string. For example, the optimization objective may be entered by the user in an input text box.
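
A hypothetical sketch of such string parsing is shown below; the keyword vocabularies and the component names are invented for illustration and are not part of this disclosure.

```python
# Illustrative only: map free-text objective keywords to objective components.
OPTIMIZATION_TYPES = {"sales": "sales_metric", "selection": "inventory_metric",
                      "category": "category_metric"}
SEGMENT_TERMS = {"winter": ("seasonality", "fall/winter"),
                 "summer": ("seasonality", "spring/summer"),
                 "jacket": ("product_type", "jacket"),
                 "dress": ("product_type", "dress")}


def parse_objective(text: str) -> dict:
    tokens = text.lower().split()
    components = {"optimization_type": None, "segments": {}}
    for token in tokens:
        if token in OPTIMIZATION_TYPES:
            components["optimization_type"] = OPTIMIZATION_TYPES[token]
        if token in SEGMENT_TERMS:
            key, value = SEGMENT_TERMS[token]
            components["segments"][key] = value
    return components


# e.g. parse_objective("optimize sales of a jacket for winter")
# -> {'optimization_type': 'sales_metric',
#     'segments': {'product_type': 'jacket', 'seasonality': 'fall/winter'}}
```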

At 404, the optimization objective is used to identify one or more base option candidates from among the possible base options. In some embodiments, the base option candidates are presented to the user/designer in a ranked order associated with their desirability with respect to the optimization objective. The base option candidates may be selected from among base options in: a catalog of products (e.g., products available from a vendor or previously provided to customers), a set of manually curated base options, and/or a set of machine-selected base options. In various embodiments, the base option candidates are items available for sale. For example, the base option candidates may be selected based on sales data for previously sold items. A base option candidate may be selected because the item sells well for the particular segment identified in the optimization objective. In various embodiments, the base option candidates are determined based on past performance data associated with each base option candidate. For example, a ranked list of the best-selling items for a given product segment may be used to select one or more of those best-selling items for inclusion in a list of base option candidates (e.g., the three best-selling jackets sold below $25 may be identified as base option candidates).

In various embodiments, the base option candidates are determined by evaluating a deviation between actual performance (e.g., as measured by sales data) and performance predicted for the product as a base option. The predicted performance of a base option may be determined from one or more trained models, such as models trained by the process of FIG. 3. Assume a product performs better than expected (e.g., better than predicted by a model trained by machine learning, such as via the process of FIG. 3). Such differences between actual and predicted performance may be scored, and the scores associated with different base option candidates may be used to rank and order the candidates to help select the best base option. For example, a product may score higher if the positive deviation between its actual performance and its predicted performance is larger. For example, a product may be ranked higher because some aspect of the product causes it to actually perform better than predicted.

By scoring base options on how much they outperform the trained model's predictions, the aspects that make a product successful may be carried into the new product through the attributes/features that produce the deviation between actual and expected performance. The discrepancy between actual and predicted performance reflects unspecified features that cause a product to perform better than predicted, features that the model does not account for. A large deviation value indicates that the product has properties that the predictive model cannot model effectively. By selecting the base option with the greatest positive deviation between actual and predicted performance as the basis for the product design, the design incorporates positive attributes of the base option that cannot be effectively modeled by the predictive model.
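
A minimal sketch of this deviation-based ranking, assuming a stand-in trained model with a predict method and illustrative field names, might be:

    # Illustrative: rank base options by how far actual performance exceeds the
    # performance predicted by a trained model (largest positive deviation first).
    class StubModel:
        """Stand-in for a trained predictive model (not the actual model)."""
        def predict(self, features):
            return 0.1 * len(features)  # placeholder prediction

    def rank_by_positive_deviation(base_options, model):
        scored = []
        for option in base_options:
            predicted = model.predict(option["features"])
            deviation = option["actual_performance"] - predicted
            scored.append((deviation, option["id"]))
        scored.sort(reverse=True)  # greatest positive deviation first
        return scored

    ranked = rank_by_positive_deviation(
        [{"id": "A", "features": ["jacket", "navy"], "actual_performance": 0.9},
         {"id": "B", "features": ["jacket", "floral"], "actual_performance": 0.1}],
        StubModel())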

Additional example details regarding selection of underlying options are provided in connection with FIG. 5.

At 406, a selection of the selected basic option is received. The base option may be defined by a set of one or more features. This selection may be made by the designer/user. For example, a user may select a base option via the GUI to serve as a basis for product design. Example GUIs for providing basic option candidates and receiving selections are shown in fig. 7 and 11.

At 408, one or more trained models are selected to evaluate the surrogate features of the selected underlying option. The trained model(s) may be selected based at least in part on the optimization objectives and/or the selected base options. That is, one or more machine learning predictive models (also referred to as "trained models") may be selected based at least in part on an optimization objective to determine predicted values associated with alternative features of the selected base option. For example, according to 306 of fig. 3, one or more machine learned predictive models may have been trained using training data to identify at least machine learned weight values associated with the surrogate features for the one or more machine learned predictive models.

The trained models may be selected so as to limit the number of trained models used, since using more models increases processing time. The trained models most relevant to the optimization objective and/or the base option may be selected to maximize prediction accuracy while reducing processing time. In various embodiments, a particular category of trained model (e.g., sales, inventory, or category) may be selected based at least in part on the optimization type. In various embodiments, a particular segment may be selected based at least in part on an optimization objective component. Depending on the optimization objective, weights may be associated with the different models, and the predictions/results of the trained models weighted accordingly. For example, the predictions of the different trained models may be weighted and combined to determine an overall prediction value that may be used as a basis for ranking the substitute features.
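
A sketch of this weighted combination, with the weights and the model interface assumed for illustration, could be:

    # Illustrative: combine the predictions of several trained models using
    # objective-dependent weights into one overall prediction value.
    def combined_prediction(feature_set, weighted_models):
        return sum(weight * model.predict(feature_set)
                   for model, weight in weighted_models)

    # For a sales-focused objective the sales model might dominate, e.g.:
    # combined_prediction(features, [(sales_model, 0.7),
    #                                (inventory_model, 0.2),
    #                                (category_model, 0.1)])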

At 410, qualified substitute features are identified for the selected base option. The qualified substitute features may be selected by a user and/or determined automatically. For example, among all possible features of the predictive model (e.g., the features determined at 304 of FIG. 3), a subset of features that are applicable and eligible for inclusion in the base option as alternative or additional features is identified. In some embodiments, features that are eligible for the selected base option may be selected according to manually specified rules or machine-learned models. Qualifying features include features that can be added or substituted given the already existing features of the selected base option. For example, if the base option is a skirt, the eligible feature types may include length, silhouette, and fabric. Features such as sleeves and necklines would not be chosen because they are not suitable for skirts.
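
A simple rule-based eligibility filter along these lines, with the rule table and field names assumed for illustration, might be:

    # Illustrative manually specified rules: which feature types may be added to
    # or substituted into a base option of a given category.
    ELIGIBLE_FEATURE_TYPES = {
        "skirt":  {"length", "silhouette", "fabric", "print", "hem"},
        "jacket": {"sleeve_type", "neckline", "silhouette", "fabric", "print"},
    }

    def eligible_substitute_features(base_option_category, all_features):
        allowed = ELIGIBLE_FEATURE_TYPES.get(base_option_category, set())
        # A sleeve or neckline feature is filtered out for a skirt, for example.
        return [feature for feature in all_features if feature["type"] in allowed]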

At 412, one or more feature sets are scored using the selected trained model(s). For example, a feature set may consist of at least one feature of the base option and a substitute feature, where the substitute feature replaces one of the base option's features or is added to the base option's feature set. As another example, a feature set may consist of the base option's features and a plurality of substitute features that replace some of the base option's features or are added to its feature set. In various embodiments, a model trained according to the process of FIG. 3 takes the feature set as input and outputs a prediction score for the feature set. The model may be applied to all possible combinations of substitute features and the base option to generate a prediction for each combination. Because the machine-learned weights for the features are known, a function of the weighted features generates an output that represents a prediction for the combination of features making up the product. Using the outputs for the feature combinations, the feature combinations may be ranked. Accordingly, individual substitute features or combinations of substitute features may be ranked. In various embodiments, more than one trained model is utilized. For example, the result of each selected trained model may be weighted, and the combination of the weighted results of the several trained models determined as the overall score of the substitute feature or feature combination.
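
The following sketch, with illustrative learned weights and a simple linear scoring function standing in for a trained model, shows one way single-feature substitutions could be enumerated and scored:

    # Illustrative: score each feature set formed by swapping one substitute
    # feature into the base option, using machine-learned feature weights.
    def score_feature_set(feature_set, learned_weights):
        # A linear function of the weighted features (weights are illustrative).
        return sum(learned_weights.get(feature, 0.0) for feature in feature_set)

    def score_substitutions(base_features, substitutions, learned_weights):
        scored = []
        for old_feature, new_feature in substitutions:
            candidate = [new_feature if f == old_feature else f for f in base_features]
            scored.append((score_feature_set(candidate, learned_weights),
                           old_feature, new_feature))
        return scored

    weights = {"half_sleeve": 0.2, "sleeveless": 0.6, "graphic_print": 0.1,
               "abstract_print": 0.4}
    scores = score_substitutions(["half_sleeve", "graphic_print"],
                                 [("half_sleeve", "sleeveless"),
                                  ("graphic_print", "abstract_print")],
                                 weights)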

At 414, an ordered list of substitute features is generated. In various embodiments, the ordered list may be generated by sorting at least a portion of the substitute features based on the predicted values, producing an ordered list of at least a portion of the substitute features for the selected base option. The values (e.g., scores) determined at 412, which reflect the desirability of the different substitute features of the selected base option with respect to the optimization objective, are sorted and ranked. The corresponding substitute features are then provided in an ordered list of recommendations (e.g., from best to worst) of substitute features to be used to modify the design of the selected base option. For example, the ten highest-scoring substitute features or feature combinations may be output in the ranked list. This information can be used to design a product, for example, by incorporating at least some of the features into the base option.

At 416, the ordered list of alternative features is provided. The ordered list of alternative features allows a designer or computer processor to modify the base option and generate a product. The alternative features may be ranked according to their impact on the prediction. For example, features with high impact may be ranked higher than features with lower impact. If the optimization objective is a sales metric, then alternative features can be ranked according to how much they would increase product sales if incorporated into the product.

In various embodiments, when an alternative feature is selected for inclusion in a product design, an already existing instance of a product having that alternative feature that best meets the indicated optimization objective may be identified and displayed. For example, because there may be slight variations in how a particular feature (e.g., a type of stripe pattern) is implemented, a recommendation for the exact feature implementation is provided by locating the example that best meets the optimization objective. An example process is shown in FIG. 6.

In various embodiments, a first substitute feature may be selected for incorporation into the base option. Upon incorporation, the ordered list of substitute features can be regenerated to reflect updates to the ranking of the substitute features. For example, 406 through 414 can be repeated, where the base option is now the base option with the selected substitute feature(s) incorporated. This may result in a subset or a different set of substitute features to be incorporated into the base option along with the first incorporated substitute feature. In some embodiments, the ranking of the substitute features may change in response to the features incorporated into the base option, because features can interact with one another to affect the success of the product. That is, a successful feature may not work well with another successful feature, such that having both features in a product makes the product less successful. For example, long sleeves may not work well with a polka dot print. As another example, two features (e.g., long sleeves and short sleeves) may simply be incompatible.
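
One possible sketch of this iterative re-ranking, with a hypothetical incompatibility table and a ranking callback standing in for steps 406 through 414, is:

    # Illustrative: after incorporating the top substitute feature, regenerate the
    # ordered list so feature interactions and incompatibilities are respected.
    CONFLICTING_PAIRS = {frozenset({"long_sleeve", "short_sleeve"}),
                         frozenset({"long_sleeve", "polka_dot_print"})}

    def compatible(feature, current_features):
        return all(frozenset({feature, existing}) not in CONFLICTING_PAIRS
                   for existing in current_features)

    def iterative_design(base_features, rank_substitutes, rounds=2):
        features = list(base_features)
        for _ in range(rounds):
            ranked = [f for f in rank_substitutes(features)  # regenerated ordered list
                      if compatible(f, features)]
            if not ranked:
                break
            features.append(ranked[0])  # incorporate the highest-ranked feature
        return features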

In some embodiments, the designer may select a number of alternative features from the provided ordered list for incorporation into the base options to design the product. The resulting product may be input into a trained model to score the product. For example, a designer may select several combinations of features and underlying options to generate multiple products. Each product may be scored using the trained model(s). The products may then be compared to one another to determine a relative predicted performance.

In various embodiments, a representation of the product may be presented in response to a substitute feature being selected for incorporation into the base option. This may help the designer visualize the product. For example, the image of the base option may be altered to show the incorporated substitute feature. The image of the base option may be combined with an image of the incorporated substitute feature. For example, if half sleeves are replaced with long sleeves, the picture of the base option with half sleeves may be updated to a picture of the base option with long sleeves. In various embodiments, the representation of the product may be displayed alongside the base option to allow a comparison to be made. Examples of visual representations of products are shown in FIGS. 9 and 16.

FIG. 5 is a flow diagram illustrating an embodiment of a process for identifying basic options. The process of fig. 5 may be at least partially implemented on one or more components of the system 200 shown in fig. 2. In some embodiments, the process of FIG. 5 is performed by processor 1702 of FIG. 17. In some embodiments, at least a portion of the process of fig. 5 is included in 404 of fig. 4.

At 502, one or more components of the optimization objective are determined. The optimization objective may describe design and/or performance goals for the product. The optimization objective describes the target results (e.g., design elements and/or performance) that the product is desired to achieve and may be used as a basis for evaluating base options and/or substitute features. An example of an optimization objective is 860 of FIG. 8. The optimization objective may include one or more objective components. An objective component may identify one or more optimization types, such as sales metrics (e.g., a goal of designing a product predicted to achieve the highest sales), inventory metrics (e.g., a goal of designing a product predicted to achieve the highest selection rate when presented to a customer), category metrics (e.g., a goal of designing a product predicted to optimize the distribution of products in inventory), style ratings, size ratings, fit ratings, quality ratings, retention, personalization, style grouping, price value ratings, and so on. An objective component may have a target segment, such as a target line of business (e.g., female, male, child), a target product type (e.g., jacket, dress, pants), a customer segment, seasonality (e.g., spring/summer, fall/winter), and so on. The optimization objective may be evaluated per optimization type and segment, e.g., optimizing sales for a target customer segment in a target product season. The optimization objective may be received via a GUI, such as GUI 700 of FIG. 7.

The optimization objective allows the base options to be compared to one another. For example, "I want to increase profit" may correspond to the goals of increasing sales rates and increasing selection rates. In various embodiments, a user may provide one or more objective components. For example, the user may select from among several objective component options. In various embodiments, one or more objective components may be determined by parsing a text string. For example, the optimization goal may be entered by the user in an input text box.

At 504, eligible base options are selected. The base option candidates may be selected from among base options including: a catalog of products (e.g., products that may be available from a vendor or that may have been previously provided to customers), a set of manually curated base options, one or more sets each consisting of one or more features, and/or a set of machine-selected base options. In various embodiments, the base option candidates are items available for sale. For example, the base option candidates may be selected based on sales data for previously sold items. A base option candidate may be selected because the corresponding item sells well for the particular segment identified in the optimization objective. In various embodiments, the base option candidates are determined based on past performance data associated with each base option candidate. For example, a ranked list of the best-selling items for a given product segment may be used to select one or more of these best-selling items for inclusion in a list of base option candidates (e.g., the three best-selling jackets sold below $25 may be identified as base option candidates). In some embodiments, a base option is defined by a set of one or more features rather than by a specific item available for sale. For example, the base option may be defined with a list of one or more features as the starting feature set.

Of all possible base options, only the base options that fall within the segment(s) identified in the optimization objective may be included in the eligible base options. For example, the indicated segment may narrow the possible choices for eligible base options. The objective may indicate certain segments, and only base options in those segments are selected. Using the example of an optimization objective whose segments call for designing a jacket that women in a first customer segment will wear in summer, only base options identified as jackets are included in the eligible base options.

At 506, each of the eligible base options is evaluated with respect to each of the one or more components of the optimization objective. In various embodiments, the base options are scored based on performance data. For example, eligible base options may be ranked based on sales data, selection data, and category data. Performance may be based on historical data and/or on a measure of how closely a base option's actual performance matches its predicted performance. In some embodiments, a predicted outcome may be produced for a base option based on data, formulas, models, and/or calculations. For example, a formula may be selected from a database for a base option to evaluate aspects of that base option.

In various embodiments, the model cannot fully anticipate the outcome because it is a prediction and the actual performance may differ. A product may include unspecified features that cause it to perform better than forecast, and the model does not account for these unspecified features. If both the prediction made by the model for a particular base option and the actual past results are known, they can be compared. If the actual result is more desirable, this indicates that the model is incorrect for that product: the product includes some aspect that the model cannot predict for some reason (a reason that cannot readily be described). This is taken into account to identify products/base options that exhibit positive aspects that cannot be modeled.

In various embodiments, the base option candidates are determined by evaluating a deviation between actual performance (e.g., as measured by sales data) and the performance predicted for the product as a base option. The predicted performance of a base option may be determined from one or more trained models, such as a model trained by the process of FIG. 3. A product may perform better than expected (e.g., better than predicted by a model trained by machine learning, such as via the process of FIG. 3). Such differences between actual and predicted performance may be scored, and the base option candidates may be ranked and ordered using the scores associated with the different base option candidates to help select the best base option. For example, a product may score higher if the positive deviation between its actual performance and its predicted performance is larger. For example, a product may be ranked higher because some aspect of the product causes it to actually perform better than predicted.

By scoring base options on how much they outperform the trained model's predictions, the aspects that make a product successful may be carried into the new product through the attributes/features that produce the deviation between actual and expected performance. The discrepancy between actual and predicted performance reflects unspecified features that cause a product to perform better than predicted, features that the model does not account for. A large deviation value indicates that the product has properties that the predictive model cannot model effectively. By selecting the base option with the greatest positive deviation between actual and predicted performance as the basis for the product design, the design incorporates positive attributes of the base option that cannot be effectively modeled by the predictive model.

One or more of these various scores may be used alone and/or weighted and combined.

At 508, an overall evaluation for each of the possible base options is determined. The overall evaluation may be an aggregation (e.g., a static combination) of the scores for each objective component and/or the deviation scores for base options that perform better than predicted by the trained model.
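
As an illustration only, and with a weighting scheme that is purely an assumption, the overall evaluation might aggregate per-component scores with the deviation score as follows:

    # Illustrative aggregation of per-objective-component scores and the
    # actual-vs-predicted deviation score into one overall evaluation.
    def overall_evaluation(component_scores, deviation_score,
                           component_weight=0.7, deviation_weight=0.3):
        average_component = (sum(component_scores.values())
                             / max(len(component_scores), 1))
        return (component_weight * average_component
                + deviation_weight * deviation_score)

    # overall_evaluation({"sales": 0.8, "inventory": 0.6}, deviation_score=0.4)
    # -> 0.7 * 0.7 + 0.3 * 0.4 = 0.61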

At 510, qualified base options are ranked based on the respective overall evaluations. The base options may be ranked according to how well the base options perform as measured by historical data and/or modeling the base options using one or more trained models. For example, the underlying option candidates are presented to the user/designer in a ranked order associated with their desirability with respect to the optimization objective.

At 512, one or more candidate base options are identified. The candidate base options may be those possible base options that score above a threshold score. The candidate base options may also be a predetermined number of the highest-ranked possible base options. For example, the candidate base options may be the top ten possible base options. Candidate base options may be provided on a GUI for selection by the user. Example GUIs displaying candidate base options are shown in FIGS. 7 and 11.
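
A small sketch of this candidate selection, where the threshold and list size are illustrative, might be:

    # Illustrative: keep base options scoring above a threshold, capped at the
    # top ten, for presentation as candidates in the GUI.
    def candidate_base_options(evaluated_options, threshold=0.5, top_n=10):
        passing = [(score, option) for score, option in evaluated_options
                   if score >= threshold]
        passing.sort(key=lambda pair: pair[0], reverse=True)
        return [option for _, option in passing[:top_n]]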

FIG. 6 is a flow diagram illustrating an embodiment of a process to present an example product containing alternative features. The process of fig. 6 may be at least partially implemented on one or more components of the system 200 shown in fig. 2. In some embodiments, the process of FIG. 6 is performed by processor 1702 of FIG. 17. In some embodiments, at least a portion of the process of fig. 6 is performed after 416 of fig. 4.

At 602, a selection of an alternate feature is received. For example, the designer may select an alternate feature presented in the alternate feature list. Alternative features may be provided at 416 of FIG. 4, and may be provided on a GUI, such as GUI 800 of FIG. 8.

At 604, one or more examples of items having the selected alternative feature are identified. In various embodiments, when an alternative feature is selected for inclusion in a product design, an example of an existing product having that alternative feature that best meets the indicated optimization objective may be identified and displayed. For example, because there may be slight variations in how a particular feature (e.g., a type of stripe pattern) is implemented, a recommendation for the exact feature implementation is provided by locating the example that best meets the optimization objective. In some embodiments, among all possible example items, only items that satisfy the segment(s) identified in the optimization objective and that also exhibit the selected alternative feature are identified. In some embodiments, each of the identified example items is evaluated against each of the one or more components of the optimization objective. In various embodiments, the items are scored based on known past performance data relative to the optimization objective. For example, example items may be ranked based on sales data, selection data, and category data. In various embodiments, example items are ranked based at least in part on a deviation between the actual performance associated with the item (e.g., as measured by sales data) and the predicted performance of the item (as predicted by one or more predictive models trained by the process of FIG. 3).
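
A sketch of this example-item lookup, again with a stand-in model object and illustrative field names, might look like:

    # Illustrative: among existing items in the target segment, keep those that
    # exhibit the selected substitute feature and rank them by how much their
    # actual performance exceeds the model's prediction.
    def best_examples(items, segments, selected_feature, model, top_n=4):
        hits = [item for item in items
                if selected_feature in item["features"]
                and all(item.get(field) == value for field, value in segments.items())]
        hits.sort(key=lambda item: (item["actual_performance"]
                                    - model.predict(item["features"])),
                  reverse=True)
        return hits[:top_n]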

At 606, examples of items having alternative characteristics are provided to the user. For example, an example is presented on a GUI, such as GUI 900 of FIG. 9.

FIG. 7 is an example of a GUI for generating a product design. GUI 700 may be provided as part of a design tool for designing a product. In some embodiments, GUI 700 is used by a designer to design a product.

The GUI 700 includes: input fields that accept user input regarding the optimization goal (collectively referred to as input section 702); output fields that provide one or more responses based on the received input (collectively referred to as output portion 704); and a navigation menu 730. Designers can use the design tool to design products that meet optimization goals. For example, a designer may wish to design a garment that is predicted to perform well (e.g., sell well) in a particular customer segment.

In this example, the input section 702 receives the optimization objective via drop-down menus 750. Here, the optimization objective includes the following components: line of business (e.g., female, male, child), product type (e.g., jacket, dress, pants), customer segment, quarter or season (e.g., spring/summer, fall/winter), and silhouette (e.g., the shape of the clothing). The objective components may be selected via drop-down menus as shown or by other input methods such as text entry, button selection, etc. The objective components may be pre-populated with default values. Here, the default selections shown are: "female" for line of business, "jacket" for product type, "segment 1" for customer segment, "Q4" for quarter, and "all" for silhouette. These objective components are merely examples, and other components/options are possible.

In alternative embodiments, the optimization goal may be indicated by other selection or input methods. FIG. 10 is an example of a GUI for receiving optimization goals for generating a product. In the example GUI shown in fig. 10, the user may select the target customer segment by clicking/touching the target customer segment ("under 30", "30-50", or "50 and over"). The user may select a season by clicking/touching the season ("Q1", "Q2", "Q3", "Q4").

Returning to FIG. 7, in this example, input section 702 includes a presentation option 740. The presentation options may determine how to display an output, such as a base option, in output portion 704. Here, the presentation options include: whether to display only the vendors we are using (e.g., vendors that will be available in the future), the brand type (e.g., internal brands versus other brands), the style or variation in style, and the number of pages on which to split the results. For example, the results may be displayed on a single page, or each page may be defined to display a predetermined number of results.

In this example, the output portion 704 includes a base option 710 generated from the optimization target component 750. The base options may be displayed in a ranked order of best matching base options selected based on the optimization objective component 750. In this example, eight basic options are displayed. For each base option, associated information, such as ranking metrics, may be displayed. In various embodiments, the base option may be selected, for example, when the user clicks or touches the base option. One or more alternative features may be displayed in response to user selection of the base option. An example process for selecting an alternative feature for a given underlying option is the process of fig. 6. In various embodiments, information corresponding to the navigation menu 730 item "replacement recommendations based on style variations" is automatically updated in response to selection of the underlying option. An example of an output corresponding to "replacement recommendation based on style variation" is shown in fig. 8.

In an alternative embodiment, output portion 704 includes instructions for selecting a recommended base option. FIG. 11 is an example of a GUI for displaying one or more base options and receiving a selection of a base option for generating a product. In this example, the user selects the base option in response to a prompt to select a recommended subject (base option) for a previously selected customer segment and time period. The example of FIG. 11 corresponds to FIG. 10, where the target customer segment "segment 3" and season "Q4" are selected.

In various embodiments, the navigation menu 730 allows the user to view a variety of different outputs generated from the optimization objective. Here, the menu includes: "subject recommendations" (e.g., base options), "style variant-based replacement recommendations" (e.g., substitute features), "best style variants with features" (e.g., products), "individual feature success," "feature contribution," "feature success by profile," "feature-to-feature interaction," and "inventory holes." In various embodiments, the "subject recommendations" include a list of one or more base options based on the optimization objective. For example, the subject recommendations may be the result of 404 of FIG. 4. In various embodiments, the "style variant-based replacement recommendations" include a list of one or more substitute features for a base option selected from the subject recommendations. For example, the style variant-based replacement recommendations may be the result of 414 of FIG. 4. In various embodiments, the "best style variants with features" include a preview of products based on the base option selected from the "subject recommendations" and one or more substitute features selected from the "style variant-based replacement recommendations." For example, the best style variants with features may be presented after 416 of FIG. 4.

In response to input received via input portion 702, output can be presented in output portion 704. Different types of outputs may be generated based on the inputs, and the information may be grouped and displayed. In various embodiments, the user may navigate between different portions using the navigation menu 730. FIG. 7 shows the state of the GUI after input is received at input section 702, and one or more "subject recommendations" are presented in output section 704. An example of a "style variant based alternative recommendation" presented in the output portion 704 is shown in fig. 8. An example of "best style variant with feature" presented in the output section 704 is shown in fig. 9.

FIG. 8 illustrates an example GUI of a design tool for generating a product design. In some embodiments, GUI 800 is used by a designer to design a product. FIG. 8 shows the state of the GUI after input is received at the input section, and one or more "style variant based alternate recommendations" are presented in the output section.

One or more optimization objective components may be selected via drop-down menus 850. Here, the optimization objective includes the following components: line of business, category, customer segment, and quarter. Examples of optimization goal selection/provision are discussed with respect to the optimization objective components 750 of FIG. 7.

To determine the one or more trained models (e.g., 408 of FIG. 4) to be used to determine the substitute features, input may be received via model selection input 854. Model selection input 854 may be provided by a user. In this example, the options for the model are: metric 1; metrics 1 and 2; metric 3; and all three metrics (e.g., metrics 1, 2, and 3). For example, metric 1 may be a sales model, metric 2 may be an inventory model, and metric 3 may be a category model. Examples of sales models, inventory models, and category models are discussed with respect to FIG. 4. Although not shown, other models or combinations of models may be used, including but not limited to: style rating, size rating, fit rating, quality rating, retention, personalization, style grouping, price value rating, or an aggregate measure. Substitute features may be determined from the selected model(s) by providing one or more optimization objective components to the selected model(s). The trained model(s) may then output one or more substitute features, where the output substitute features best meet the optimization objective. In various embodiments, the number of substitute features may be predefined (e.g., outputting up to a threshold number of the highest-ranked features). Example processes for selecting substitute features are shown in FIGS. 4 and 6. In other embodiments, other options/combinations of models are possible.

In some embodiments, the output for presentation is selected according to ranking options 840. Ranking options 840 may allow the user to indicate how the results are displayed in the user interface. For example, the processing results of the trained models (the results selected via model selection input 854) may be processed according to the ranking selection made via ranking options 840. In this example, the user may select whether to display features with "color and print" and "vendor and brand" and whether to remove the effect of price. The "color and print" option controls whether color and print recommendations are displayed. The "vendor and brand" option controls whether vendor and brand recommendations are displayed. The "price" option allows the user to choose between a model that predicts what will satisfy the metric overall and a model that predicts what will satisfy the metric relative to the price point at which the garment will be sold.

Output may be presented in response to input received via the optimization goal components 850, the model selection input 854, and/or the ranking options 840. Different types of outputs may be generated based on the inputs, and the information may be grouped and displayed. In various embodiments, the user may navigate between different portions using the navigation menu 830. The output displayed in this example corresponds to a "replacement recommendation based on style variations".

In this example, the output includes substitute features 820 for the base option 814. Here, the base option is identified at 852 by its name "style variant 12345." The substitute features selected for the base option may be based at least in part on the optimization objective. Here, optimization objective 860 ("designing for customer segment 1 and Q4, the replacement features that most lift metric 1") is displayed with the base option 814. The optimization objective 860 is a summary of the components selected via the optimization objective components 850. In this example, the substitute features are selected based in part on the selected trained model ("metric 1").

Representations of the underlying options, such as images/photos, videos, or pictures, may be displayed. Here, an image 810 of the basic option "style variation 12345" is displayed.

In this example, the substitute features 820 include silhouette, print, hem, and sleeve type. Each feature may have sub-features. Here, the sub-features for the silhouette are: half sleeve, sleeveless, and long sleeve. The sub-features for the sleeve type are: hemmed sleeves, cap sleeves, thin shoulder straps, and others. The sub-features for the print are: a paisley pattern, a floral pattern, and an abstract pattern. The sub-features for the hem are: standard, high-low, curved, and side-slit. The substitute features 820 can be listed in an ordered/ranked list. Here, "silhouette: half sleeve → sleeveless" is ranked first because altering the silhouette, and more specifically replacing the half-sleeve silhouette of the base option with a sleeveless silhouette, is predicted to boost metric 1 the most.

FIG. 9 illustrates an example GUI of a design tool for generating a product design. In some embodiments, GUI 900 is used by a designer to design a product. FIG. 9 shows the state of the GUI after input is received at the input portion, and one or more "best style variants with features," i.e., products with the selected substitute feature or feature combination, are presented in the output portion. A "style variant with features" refers to a product that is the base option with one or more substitute features incorporated.

GUI 900 includes a substitute feature selection portion 920 where the user can indicate the selected substitute feature to add to or modify the base option in order to produce the product. In this example, the selected substitute feature is substitute feature 7 ("print: graphic pattern → abstract pattern") of the substitute features 820 of FIG. 8.

In alternative embodiments, the user may select one or more alternative features using the drop-down menus inside the feature selection portion 920 or using buttons other than the drop-down menus in the feature selection portion 920. FIG. 12 is an example of a GUI for providing alternative feature selection options and receiving a selection of one or more alternative features for generating a product. In the example of FIG. 12, the user selects "neckline" as the alternate feature in response to a prompt to select from recommended feature variations (alternate features) tailored to the selected subject (base option) and time period. Here, the neckline of the base option is a split neckline (which performs below average for customer segment 1), the color of the base option is navy blue (which is overstocked given its performance in the summer months), and the hem is a high-low hem (which performs well with a sleeveless silhouette). A description of a feature of the base option may be automatically generated based on an evaluation of the feature's performance. Here, both the neckline and the color perform poorly, and a cause of the poor performance can be provided. For example, navy blue is overstocked.

In some embodiments, each alternative feature may be displayed with a corresponding chart comparing that alternative feature to other features of the same type. For example, for "neckline," the chart shows the impact of the various neckline types (e.g., feature 1 through feature 9) on the metric ("metric impact"). Each bar in the bar chart represents the metric impact of a neckline type relative to the selected alternative feature. In some embodiments, the chart may be the basis for the description but is not displayed on the GUI as shown.

In various embodiments, an alternative feature may include one or more sub-features. FIG. 13 is an example of a GUI for providing alternative feature selection options and receiving a selection of one or more alternative features for generating a product. In the example of FIG. 13, additional sub-features are displayed in response to the selection of the neckline. Here, a ranked list of neckline replacements (e.g., neckline A, neckline B, neckline C) is displayed. The user selects "neckline B" to replace the neckline of the base option with a neckline of type B.

In various embodiments, an alternative feature may include one or more sub-features. Sub-features are feature classifications made at a finer granularity. FIG. 14 is an example of a GUI for providing sub-feature selection options and receiving a selection of one or more sub-features for generating a product. In the example of FIG. 14, four jackets with V-neck necklines are shown. Visual representations of the sub-features may assist the designer in visualizing them. Here, the user selects option 2, i.e., the neckline corresponding to example 2 (e.g., neckline B.2).

Returning to FIG. 9, GUI 900 includes a manufacturing options section 940. Here, the manufacturing options include: brand type selection (e.g., an example internal brand "EB" and "market," which refers to typical market brands) and minimum shipping quantity. Manufacturing options may be used to filter out products. For example, assume that a particular fabric supplier requires a minimum order volume that exceeds the order volume that would correspond to 25 lots. Abstract design fabric from that supplier will be filtered out, and no product using the abstract design is displayed in section 960.
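
One way to sketch this manufacturing-constraint filter, with the field names and the lot-size comparison as assumptions, is:

    # Illustrative: drop example products whose fabric supplier requires a minimum
    # order quantity larger than the planned lot size, or whose brand type is not allowed.
    def filter_by_manufacturing(products, planned_lot_size, allowed_brand_types):
        return [product for product in products
                if product["brand_type"] in allowed_brand_types
                and product["supplier_min_order"] <= planned_lot_size]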

In response to input received via the alternate feature selection portion 920 and the manufacturing options portion 940, output may be presented. Different types of outputs may be generated based on the inputs, and the information may be grouped and displayed. In various embodiments, the user may navigate between different portions using the navigation menu 930. The output displayed in this example corresponds to "base style variant with features".

In this example, the style variant is generated from a combination of the base option 814 (corresponding to image 810) and substitute feature 2 ("print: graphic pattern → abstract pattern") of the substitute features 820 of FIG. 8. That is, the product is the base option 814 with the substitute feature of an abstract print. Here, eight example products 960 are displayed, each of which is the base option 814 with an abstract print. Additional example products may be displayed on other pages or as the user scrolls down. These example products may be selected from a database of products. In various embodiments, the example products are ordered by the optimization objective and displayed in ranked order.

In various embodiments, the preview of the product may be updated in real-time as an alternative feature or combination of features is selected for incorporation into the underlying option or product. FIG. 16 is an example of a GUI for generating product previews. In this example, the product preview 1610 is presented based on selection of various features. Here, the alternative features are collar 1620, sleeves 1630, and fabric 1640. Each of these alternative features also has sub-features as shown. In this example, sub-feature 1622 is selected, sub-feature 1632 is selected, and a combination of sub-features 1642 and 1644 is selected.

In various embodiments, the generated product may be described by a design sheet having specifications for manufacturing the product. For example, a design sheet for a product may be generated by the process shown in FIG. 4. The design sheet may be provided directly to a manufacturer to produce the product. FIG. 15 is an example of a design sheet associated with a computer-generated product. In the example of FIG. 15, the product is a jacket utilizing the body of SVID 00000, the neckline of SVID 00001, and the hem of SVID 00002. Additionally, designer notes may be provided. In this example, the notes reflect the optimization objective (customer segment 1 and Q4). In various embodiments, the design sheet may include a visual representation of the product (not shown). FIG. 16 is an example of a visual representation of a product.
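
As a purely illustrative sketch, assuming a record structure that simply mirrors the SVID components named above, a design sheet could be emitted as a structured record:

    # Illustrative design-sheet record combining the selected components and the
    # designer notes; a real design sheet would carry full manufacturing specifications.
    def design_sheet(body_svid, neckline_svid, hem_svid, notes):
        return {
            "components": {"body": body_svid,
                           "neckline": neckline_svid,
                           "hem": hem_svid},
            "designer_notes": notes,
        }

    sheet = design_sheet("SVID 00000", "SVID 00001", "SVID 00002",
                         "Designed for customer segment 1, Q4")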

FIG. 17 is a functional diagram illustrating a programmed computer system for generating a product design, in accordance with some embodiments. As will be apparent, other computer system architectures and configurations may be used to implement the described product generation techniques. Computer system 1700, which includes various subsystems described below, includes at least one microprocessor subsystem (also referred to as a processor or Central Processing Unit (CPU) 1702). For example, the processor 1702 may be implemented by a single chip processor or by multiple processors. In some embodiments, processor 1702 is a general-purpose digital processor that controls the operation of computer system 1700. In some embodiments, the processor 1702 also includes one or more coprocessors or special-purpose processors (e.g., graphics processors, network processors, etc.). Using instructions retrieved from memory 1710, the processor 1702 controls the reception and manipulation of input data received at input devices (e.g., image processing device 1706, I/O device interface 1704) and controls the output and display of data at output devices (e.g., display 1718).

The processor 1702 is bidirectionally coupled to memory 1710, which memory 1710 may include, for example, one or more Random Access Memories (RAMs) and/or one or more Read Only Memories (ROMs). The memory 1710 may be used as a general purpose memory area, a temporary (e.g., scratch pad) memory, and/or a cache memory, as is known in the art. Memory 1710 may also be used to store input data and processed data in the form of data objects and text objects, as well as to store programming instructions and data, in addition to other data and instructions for operating on the processor 1702. As is also known in the art, the memory 1710 typically includes basic operating instructions, program code, data, and objects (e.g., programmed instructions) that are used by the processor 1702 to perform its functions. For example, memory 1710 may include any suitable computer-readable storage medium as described below depending on, for example, whether data access needs to be bi-directional or unidirectional. For example, the processor 1702 may also retrieve and store frequently needed data directly and very quickly in cache memory included in the memory 1710.

Removable mass storage device 1712 provides additional data storage capacity for computer system 1700 and is optionally coupled bi-directionally (read/write) or uni-directionally (read only) to processor 1702. The fixed mass storage 1720 may also provide additional data storage capacity, for example. For example, storage devices 1712 and/or 1720 may include computer-readable media such as magnetic tape, flash memory, PC-CARDS, portable mass storage devices such as hard disk drives (e.g., magnetic, optical, or solid state drives), holographic storage devices, and other storage devices. The mass storage devices 1712 and/or 1720 typically store additional programming instructions, data, and the like that are not normally actively used by the processor 1702. It will be appreciated that the information retained within the mass storage devices 1712 and 1720 may, in standard fashion, be incorporated as part of memory 1710 (e.g., RAM) serving as virtual memory, if desired.

In addition to providing processor 1702 access to the memory subsystem, the bus 1714 may also be used to provide access to other subsystems and devices. As shown, these may include a display 1718, a network interface 1716, an input/output (I/O) device interface 1704, an image processing device 1706, and other subsystems and devices. For example, the image processing device 1706 may include a camera, a scanner, or the like; the I/O device interface 1704 may include a device interface for interacting with a touch screen (e.g., a capacitive touch-sensitive screen that supports gesture interpretation), a microphone, a sound card, a speaker, a keyboard, a pointing device (e.g., a mouse, a stylus, a human finger), a Global Positioning System (GPS) receiver, an accelerometer, and/or any other suitable device interface for interacting with the system 1700. Multiple I/O device interfaces may be used in conjunction with computer system 1700. The I/O device interfaces may include a general purpose and customized interface that allows the processor 1702 to send data and, more typically, receive data from other devices, such as keyboards, pointing devices, microphones, touch screens, transducer card readers, tape readers, voice or handwriting recognizers, biometric readers, cameras, portable mass storage devices, and other computers.

The network interface 1716 allows the processor 1702 to couple to another computer, computer network, or telecommunications network using a network connection as shown. For example, through the network interface 1716, the processor 1702 may receive information (e.g., data objects or program instructions) from another network, or output information to another network in the course of performing the method/process steps. Information, which is often represented as a sequence of instructions to be executed on a processor, may be received from and output to another network. An interface card or similar device, and appropriate software implemented by the processor 1702 (e.g., executed/executed on the processor 1702) may be used to connect the computer system 1700 to an external network and communicate data according to standard protocols. For example, various process embodiments disclosed herein may execute on the processor 1702 or may be executed across a network, such as the internet, an intranet network, or a local area network, in conjunction with a remote processor that shares a portion of the processing. Additional mass storage devices (not shown) may also be connected to the processor 1702 through the network interface 1716.

Additionally, various embodiments disclosed herein are additionally directed to computer storage products with a computer-readable medium that include program code for performing various computer-implemented operations. The computer readable medium includes any data storage device that can store data which can thereafter be read by a computer system. Examples of computer readable media include, but are not limited to: magnetic media such as magnetic disks and tapes; optical media such as CD-ROM disks; magneto-optical media such as optical disks; and specially-configured hardware devices such as application-specific integrated circuits (ASICs), Programmable Logic Devices (PLDs), and ROM and RAM devices. Examples of program code include both: machine code such as produced by a compiler, for example; or files containing higher level code (e.g., scripts) that may be executed using an interpreter.

The computer system shown in FIG. 17 is merely an example of a computer system suitable for use with the various embodiments disclosed herein. Other computer systems suitable for such use may include additional or fewer subsystems. In some computer systems, the subsystems may share components (e.g., for touch screen-based devices, such as smartphones, tablets, etc., the I/O device interface 1704 and display 1718 share a touch-sensitive screen component that both detects user input and displays output to the user). Additionally, bus 1714 illustrates any interconnection scheme for linking subsystems. Other computer architectures having different configurations of subsystems may also be utilized.

Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
